Abstract
Augmented reality on mobile phones has recently made major progress. Lightweight, markerless object recognition and tracking make handheld Augmented Reality feasible for new application domains. As this field is technology-driven, interface design has mostly been neglected. In this paper we investigate visualization techniques for augmenting printed documents using handheld Augmented Reality. We selected the augmentation of printed photo books as our application domain because photo books are enduring artefacts that often have online galleries containing further information as a digital counterpart. Based on an initial study, we designed two augmentations and three techniques for selecting regions in photos. In an experiment, we compare an augmentation aligned to the phone’s display with an augmentation aligned to the physical object. We conclude that an object-aligned presentation is more usable. For selecting regions, we show that participants are more satisfied using simple touch input than Augmented Reality based input techniques.
© 2011 IFIP International Federation for Information Processing
Cite this paper
Henze, N., Boll, S. (2011). Who’s That Girl? Handheld Augmented Reality for Printed Photo Books. In: Campos, P., Graham, N., Jorge, J., Nunes, N., Palanque, P., Winckler, M. (eds) Human-Computer Interaction – INTERACT 2011. INTERACT 2011. Lecture Notes in Computer Science, vol 6948. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-23765-2_10
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-23764-5
Online ISBN: 978-3-642-23765-2