A while ago I blogged about the company Azavea and their mobile AR prototype supporting cultural heritage browsing and knowledge access. They teamed up with PhillyHistory, which lets you find historical photos and maps by date, neighborhood, address, and/or keyword in a collection of over 90,000 records.
At the request of the Philadelphia Department of Records (DOR), Azavea began researching the use of mobile augmented reality technology for displaying a large database of historic photographs. The app enables users to view historic images as overlays on the current landscape. After the research period Azavea published a white paper called “Implementing Mobile Augmented Reality Technology for Viewing Historic Images”, which answers questions such as: Is augmented reality (AR) a useful method for showing the history of Philadelphia and helping users see the connections between the past and the present? Is AR technology advanced enough to make this type of application possible? Are smartphone networks fast enough? Can the phone pinpoint a user’s location accurately enough to load images of that location, even in a crowded urban setting?
So I spoke with Deborah Boyer from Azavea to learn a bit more about the results first hand, and Deb kindly allowed me to publish her answers for you all. Thanks again! Only a few of the answers appear here; the white paper has a lot more. So, now: enjoy!
augmented.org: Was the GPS positioning accurate enough for your purpose in the end?
Deborah: The GPS positioning was accurate enough although it could certainly be improved. Our biggest issue was with the jitteriness of the images. They often bounced or wobbled and simply didn’t stay in one place as well as we would have liked. Use of the gyroscope in newer smartphones should hopefully improve this in future apps. We talk more about the positioning issues on p. 22 of the white paper.
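The jitter Deb describes is typically tamed by smoothing or sensor fusion (which is exactly what the gyroscope enables). As a purely illustrative sketch, not Azavea's code, here is the simplest form of that idea: exponentially smoothing a noisy sensor reading so overlays stop wobbling.

```python
def smooth(readings, alpha=0.15):
    """Exponentially smooth a stream of noisy sensor readings.

    alpha controls responsiveness: lower values damp jitter more,
    but the estimate lags further behind real movement. The value
    here is illustrative, not tuned for any real device.
    """
    smoothed = []
    estimate = readings[0]
    for r in readings:
        estimate = alpha * r + (1 - alpha) * estimate
        smoothed.append(estimate)
    return smoothed
```

The trade-off is the classic one: enough smoothing to stop the image bouncing, but not so much that the overlay visibly trails the camera as the user turns.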
Are processing times fast enough for a stable and good experience?
When we chose to display nearly 90,000 points in the app, load time started to become an issue. We sped up the app by choosing to display only the four closest points as images. Other nearby points load as icons (which can be cached) and the user can then click to load the image as an overlay. We also only return a select number of POIs for any location. More information about processing times is available on pages 15-17 of the white paper.
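The selection strategy Deb describes (rank POIs by distance, cap how many are returned, and render only the nearest few as full images) can be sketched roughly as follows. This is a hypothetical illustration; the field names, the haversine distance, and the cap values are my assumptions, not Azavea's implementation.

```python
import math

def haversine(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def select_pois(pois, user_lat, user_lon, max_images=4, max_pois=50):
    """Return the closest POIs, rendering only the nearest few as images.

    Everything beyond max_images is flagged as a lightweight (cacheable)
    icon, and no more than max_pois are returned for any location.
    """
    ranked = sorted(
        pois,
        key=lambda p: haversine(user_lat, user_lon, p["lat"], p["lon"]),
    )[:max_pois]
    return [
        {**p, "render": "image" if i < max_images else "icon"}
        for i, p in enumerate(ranked)
    ]
```

The point of the split is bandwidth: icons are tiny and cacheable, so the app only pays the cost of downloading a full-resolution historic photograph for the handful of points the user is actually closest to.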
Regarding input/output devices, what are your learnings? What is missing today?
The relatively small screen size of a smartphone required us to pay careful attention to what we could display without overwhelming or confusing the user. We created an opening launch screen to help guide users through the app and used the Layar customization options as much as possible. The images were often very small in the AR view so we also built an option for users to load and view the image on another screen where it could be a larger size. This screen also contained additional descriptive information. While we tried to use the screen space as well as possible, it is still a fairly small space in which to show a lot of information. Tablets may be useful for their larger screen although the weight and size could make it difficult to easily use them with an AR app for long periods of time.
How do you prioritize POIs if there are too many? How do you guide the user?
We organized the POIs into three categories: twenty images with additional accompanying text, 500 images which were aligned in 3D space, and the remaining images (roughly 87,000) which made up the majority of the collection. The app prioritizes the images with additional text or the 3D aligned images over the other points. The prioritization also includes distance calculations to prevent POIs from appearing on top of each other. To guide the user, we created three different icons to indicate the different types of images. We also wrote a help page to walk people through the process of using the app. More details about prioritization are available on pages 16, 17, and 21 of the white paper.
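The tiered prioritization described above could be sketched like this. The category names, priority ordering, and the minimum-separation heuristic are my assumptions for illustration; the white paper (pages 16, 17, and 21) describes Azavea's actual approach.

```python
# Lower number = higher priority: curated text-annotated images first,
# then the 3D-aligned set, then the bulk of the collection.
PRIORITY = {"featured": 0, "aligned3d": 1, "standard": 2}

def prioritize(pois, min_separation=10.0):
    """Order POIs by category then distance, dropping overlapping points.

    Each POI is a dict with 'category' and 'dist' (meters from the user);
    'dist' stands in for the app's real distance calculations. Two POIs
    closer together than min_separation are assumed to overlap on screen,
    so only the higher-priority one is kept.
    """
    ordered = sorted(pois, key=lambda p: (PRIORITY[p["category"]], p["dist"]))
    shown, taken_distances = [], []
    for p in ordered:
        # skip a POI that would render on top of one already shown
        if all(abs(p["dist"] - d) >= min_separation for d in taken_distances):
            shown.append(p)
            taken_distances.append(p["dist"])
    return shown
```

A real AR view would compare screen positions rather than raw distances, but the principle is the same: the twenty curated images and the 500 3D-aligned ones win any conflict with the remaining ~87,000.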
Did you try prototypes with tablet devices?
We did not try the prototype with tablet devices. At the time of development, tablets with cameras were not readily available so we stuck to smartphones.
How was the user feedback, and what are your results on usability/effectiveness? E.g. does placing images in 3D space work better screen-aligned or “3D-space-aligned”?
In general, users were excited about the project and the research. When we released the prototype, we realized that users may have some difficulty viewing the aligned images due to the jitter that is often found in AR applications. We received some feedback that the images bounced around too much and made viewing difficult. Other users reported no problems. While the 3D alignment helped in some cases, other users found it confusing.
In the end: was the program useful and accepted? Will you continue in cultural heritage projects with AR?
The project was useful for investigating AR technology and its possible use in cultural institutions. We received a positive response to the white paper and hope that it will serve as a useful resource for other organizations investigating the use of augmented reality. We are interested in pursuing other AR projects and potentially expanding the PhillyHistory project but have no immediate plans.
Talking about a possible open source AR platform: what is most important for such a platform, and will you be pushing for this, too?
With the variety of collections and media types available in cultural institutions, an open source AR platform could be useful for providing additional features and flexibility to organizations hoping to experiment with AR. The publication of open source AR standards and the creation of a community around open source AR would hopefully allow for more innovation and sharing of resources. We’re interested in the topic of open source AR but are not currently working on any open source AR projects.
The app was only a prototype and was only available for a few months, but you can still try it using Layar via a dedicated channel, or read the full white paper for more research results! Thanks again, Azavea, and of course Deb, for taking the time!
Have a nice week you all!