Interactive Exhibition: User Journey

I created a user journey to annotate the user's actions throughout the exhibition.

[User journey diagram]

George Enters The Museum

George walks into the museum and is handed a tablet at the door.

He Signs In on The Tablet

He is asked to sign in via the ‘trawler’ application. If he has visited before, he will already have the ‘trawler’ application and can sign straight in to the app.

If not, there is a function that allows George to sign in via Facebook or Dropbox.
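To pin this step down a little, here is a rough sketch of how the sign-in choice could work. All of the names here (TrawlerClient, Visitor, Provider) are placeholders for illustration; a real build would rely on Facebook's and Dropbox's own sign-in SDKs.

```typescript
// Rough sketch of the sign-in step. All names are placeholders for illustration;
// a real build would use the providers' own OAuth sign-in SDKs.
type Provider = "trawler" | "facebook" | "dropbox";

interface Visitor {
  id: string;
  email: string;
  firstVisit: boolean;
}

class TrawlerClient {
  async signIn(provider: Provider, token: string): Promise<Visitor> {
    // Returning visitors sign straight into their existing trawler account;
    // new visitors authenticate through Facebook or Dropbox instead.
    // Placeholder: a real implementation would exchange `token` with the chosen
    // provider and look the visitor up in the trawler database.
    return { id: "george-001", email: "george@example.com", firstVisit: provider !== "trawler" };
  }
}
```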

He Receives an Email

Once signed in, if George has not visited before, he will receive a message ‘congratulating’ him and informing him that an email has been sent to his address, including a download link for the application and files of any resources he shows interest in during his time at the exhibition.
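As a quick sketch of what that first-visit email might contain, something along these lines could be assembled once George signs in; the field names and the download link are placeholders, not a final design.

```typescript
// Sketch of the first-visit welcome email: a download link for the app plus
// files for any resources George showed interest in. Field names and the URL
// are placeholders.
interface WelcomeEmail {
  to: string;
  subject: string;
  downloadLink: string;
  resourceFiles: string[];
}

function buildWelcomeEmail(visitorEmail: string, resourcesOfInterest: string[]): WelcomeEmail {
  return {
    to: visitorEmail,
    subject: "Congratulations - welcome to trawler",
    downloadLink: "https://example.com/trawler/download", // placeholder link
    resourceFiles: resourcesOfInterest,
  };
}
```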

The Map is Activated

Once George closes down the message, the camera function on the device opens up and the augmented map is activated. George looks through the screen of the tablet and aims it upwards.

He can then see marks on the screen that indicate the location of each object of interest in the museum.
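A rough sketch of how those marks could be placed over the camera feed is below. Each object of interest has a position in the museum, and whatever AR framework the app ends up using would supply a projection from that position onto the camera image; everything here is illustrative.

```typescript
// Illustrative sketch of the augmented map: project each object of interest's
// position in the museum onto the camera image to get a mark on screen.
interface ObjectOfInterest {
  id: string;
  name: string;
  position: { x: number; y: number; z: number }; // position in the museum, metres
}

interface ScreenMark {
  objectId: string;
  screenX: number;
  screenY: number;
}

function marksForScreen(
  objects: ObjectOfInterest[],
  project: (p: { x: number; y: number; z: number }) => { x: number; y: number } | null
): ScreenMark[] {
  const marks: ScreenMark[] = [];
  for (const object of objects) {
    const point = project(object.position); // null when the object is behind the camera
    if (point !== null) {
      marks.push({ objectId: object.id, screenX: point.x, screenY: point.y });
    }
  }
  return marks;
}
```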

George Uses The Map 

George can now use the map. When he clicks on a mark, arrows come up to guide him through the building to the object of interest.

George Reaches The Object

When George reaches the object of interest, he notices a symbol with a sign beside it saying ‘scan me’. He points the camera toward the symbol, and the button at the bottom of the application lights up and asks him to scan the symbol.
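In rough terms, the scan button only needs to react to whether the camera can currently see a symbol. The sketch below assumes the image-recognition library (whichever one ends up being used) provides detected/lost callbacks.

```typescript
// Sketch of the "scan me" step: the button at the bottom of the app lights up
// when the camera recognises the symbol next to the object, and dims again when
// the symbol leaves the view. The callbacks are assumed, not a real API.
interface ScanButton {
  enabled: boolean;
  label: string;
}

function onSymbolDetected(symbolId: string): ScanButton {
  return { enabled: true, label: `Scan symbol ${symbolId}` };
}

function onSymbolLost(): ScanButton {
  return { enabled: false, label: "Point the camera at a symbol" };
}
```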

The Augmented Memories

When the symbol is scanned, the augmented memories appear over the object of interest. These memories are tagged to the object and relate to it. He can then select a memory and view all of the photographs and notes linked inside it, and he can like or share the memory via Facebook.
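One possible data model for these memories is sketched below; the field names are my own assumptions, but the key idea from the journey is that every memory is tagged to the object it relates to, so only those memories appear over that object.

```typescript
// A possible data model for an augmented memory: each memory is tagged to the
// object it relates to and holds the photographs and notes linked inside it.
// Field names are assumptions for illustration.
interface Memory {
  id: string;
  objectId: string;       // the object of interest the memory is tagged to
  authorId: string;
  photographs: string[];  // photo URLs inside the memory
  notes: string[];
  likes: number;
}

function memoriesForObject(all: Memory[], objectId: string): Memory[] {
  // Only memories tagged to this object appear over it when the symbol is scanned.
  return all.filter((memory) => memory.objectId === objectId);
}

function likeMemory(memory: Memory): Memory {
  return { ...memory, likes: memory.likes + 1 };
}
```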

Adding Memories

George can now add memories using the button located at the top of the application, which allows him to enter his own memory using a photograph from either Facebook or one he has uploaded to the trawler application.

When he has entered the memory, it is tagged to the object and falls under that category.
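A quick sketch of that add-memory step, under the same assumptions as before: the photo can come from Facebook or from an upload to the trawler app, and the new memory is tagged to the object George is standing at so it falls under that object's category.

```typescript
// Sketch of adding a memory; names are illustrative only.
type PhotoSource = "facebook" | "trawler-upload";

interface NewMemory {
  objectId: string;   // the object the memory is tagged to
  authorId: string;
  photoUrl: string;
  photoSource: PhotoSource;
  note: string;
}

function createMemory(
  objectId: string,
  authorId: string,
  photoUrl: string,
  photoSource: PhotoSource,
  note: string
): NewMemory {
  return { objectId, authorId, photoUrl, photoSource, note };
}
```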

George Gets Another Email

George now receives another congratulatory message, telling him his memory has been uploaded to the trawler database and can now be viewed at the main exhibition: a large touch screen device that displays all of the memories that have been uploaded.

He Gets To The Main Exhibition

He can now view all of the memories that have been uploaded in one place.
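To round the journey off, here is a rough sketch of what the main exhibition screen needs to do: pull every memory out of the trawler database and show them all together. The `fetchAllMemories` function is a stand-in for whatever endpoint the database ends up exposing.

```typescript
// Sketch of the main exhibition screen: load every uploaded memory and show
// them in one place, newest first. `fetchAllMemories` is a placeholder.
interface ExhibitionMemory {
  id: string;
  objectId: string;
  photoUrl: string;
  uploadedAt: Date;
}

async function loadExhibitionWall(
  fetchAllMemories: () => Promise<ExhibitionMemory[]>
): Promise<ExhibitionMemory[]> {
  const memories = await fetchAllMemories();
  return memories.sort((a, b) => b.uploadedAt.getTime() - a.uploadedAt.getTime());
}
```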

Interactive Exhibition: Wire Frames (Rough Draft)

In this post I will showcase some of my draft wire frames and a rough outline of the user journey, just so I could start to decide on the positioning of the content and how each page will be laid out. I created two rough drafts of wire frames: one for the tablet and one for the main device used at the museum.

Tablet Wire Frame (Rough)

[Scanned sketches of the rough tablet wire frames]

Large Touch Screen Device (Rough)

[Scanned sketches of the rough large touch screen device wire frames]

User Journey (Rough)

[Scanned sketch of the rough user journey]

Interactive Exhibition: Augmented Maps

During previous research I looked into using ‘Geolocation’ for the maps on the tablet side of my exhibition. Later research indicated that this approach would not be feasible, because the user will be inside a building, and I could not find any useful resources to indicate that ‘Geolocation’ would work indoors.

I did, however, come across a few examples of augmented reality maps, which fit in with my idea of using the camera function on the tablet to guide the user around the museum to areas of interest.

I came across a company called NavVis, who specialize in creating augmented map applications for indoor use. Below is my research.

Website: NavVis

NavVis uses a similar technique to Google Maps, using sensors and a camera to map out a building's structure and shape to create a visual representation of a map. Moving the device from room to room in a building allows it to pick up all of the information it needs to build the map of the building.

Here is a video of the device at work:

Here is an image that shows off the functionality of this mapping service:

As you can see, the individual in the photograph is using her mobile phone to direct herself to a certain area of interest, using an augmented trail that leads her straight to it.

Their website also states that the ‘NavVis Navigator’ has an open API, which indicates I could use this with the application on the tablet. It also states that a third-party owner of the NavVis Navigator can implement their own augmented reality overlays over the top of the map.
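I have not used the API myself, so the sketch below only illustrates the general idea of requesting an indoor route from a mapping service and drawing the exhibition's own overlay on top; the endpoint, response shape and overlay type are invented for this sketch and are not NavVis's actual API.

```typescript
// Purely illustrative: request an indoor route and draw the exhibition's own
// augmented-reality trail over it. Endpoint and types are assumptions, not
// NavVis's actual API.
interface RoutePoint {
  x: number;
  y: number;
  floor: number;
}

interface Overlay {
  objectId: string;
  trail: RoutePoint[];
  colour: string;
}

async function trailToObject(baseUrl: string, from: RoutePoint, objectId: string): Promise<Overlay> {
  const response = await fetch(
    `${baseUrl}/route?fromX=${from.x}&fromY=${from.y}&floor=${from.floor}&to=${objectId}`
  );
  const trail: RoutePoint[] = await response.json();
  return { objectId, trail, colour: "#ff9900" }; // the exhibition's own overlay colour
}
```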

Can This Be Implemented?
I can conclude from this piece of research that this approach can be implemented in my exhibition, and it is exactly what I was looking for.