
Mobile Interfaces Using Body Worn Projector and Camera

  • Nobuchika Sakata
  • Teppei Konishi
  • Shogo Nishida
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5622)

Abstract

Unlike most desktop and laptop computers, mobile interfaces are designed to let users work with information easily in a variety of situations, such as while standing, walking, or otherwise moving. However, most mobile devices, such as cell phones, have a small keypad and a small display because they must stay compact and lightweight enough to carry and pocket. They therefore impose a considerable burden on users, who must watch a small display and type on a small keyboard, and they are not designed to provide implicit or awareness information. In this paper, we describe why a body-worn projector, which can project information into the user’s peripheral vision, combined with a body-worn camera, which can recognize the user’s posture and estimate the user’s behavior, is a suitable interface for providing awareness, implicit, and even explicit information. Finally, we propose two mobile interfaces: a “Palm top display for glance information” and “Floor projection from a lumbar-mounted projector”.
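
As a rough sketch of how the “Palm top display” could work, the Python/OpenCV example below detects an upraised palm in the body-worn camera image and marks the region where glance information would be projected. This is not the authors’ implementation: the skin-colour segmentation heuristic, the HSV thresholds, and the minimum palm area are all illustrative assumptions.

```python
import cv2
import numpy as np

# Illustrative thresholds (assumptions, not from the paper): a crude
# HSV skin-colour range and a minimum blob size for a palm held up
# in front of a chest-worn camera.
SKIN_LO = np.array([0, 48, 80], dtype=np.uint8)
SKIN_HI = np.array([20, 255, 255], dtype=np.uint8)
MIN_PALM_AREA = 15000  # pixels; depends on camera resolution and distance

def find_palm(frame):
    """Return the bounding box of the largest skin-coloured blob, or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, SKIN_LO, SKIN_HI)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if cv2.contourArea(largest) < MIN_PALM_AREA:
        return None
    return cv2.boundingRect(largest)

cap = cv2.VideoCapture(0)  # stands in for the body-worn camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    box = find_palm(frame)
    if box is not None:
        # A real system would warp the projector image onto this region;
        # here we only draw where the glance information would land.
        x, y, w, h = box
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("palm-top display (sketch)", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```

In the actual systems described here, detection would additionally be gated on posture (the arm raised into the projector’s field of view), and the projected image would be geometrically warped onto the detected palm region rather than simply outlined.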

Keywords

Mobile AR · Wearable Computer · Mobile Interface · Mobile Projector · ProCams


Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Nobuchika Sakata¹
  • Teppei Konishi¹
  • Shogo Nishida¹

  1. Division of Systems Science and Applied Informatics, Graduate School of Engineering Science, Osaka University, Toyonaka, Osaka, Japan
