Balancing Design Freedom and Constraints in Wall Posters Masquerading as AR Tracking Markers

  • Ryuhei Tenmoku
  • Akito Nishigami
  • Fumihisa Shibata
  • Asako Kimura
  • Hideyuki Tamura
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5622)

Abstract

This paper describes how to construct a mixed reality (MR) environment by adopting a geometric registration method that uses visually unobtrusive flat posters on walls. The proposed method is one of several approaches in the semi-fiducial invisibly coded symbols (SFINCS) research project, whose purpose is to achieve a good balance between the visual elegance of the environment and robust registration. In this method, the posters used for geometric registration are designed to blend with their surroundings, yet they can still be recognized as markers because they follow certain design rules. Using these rules, posters in a real scene can be detected in real time. This paper introduces procedures for developing poster design rules using toolkits we developed.
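The abstract states that posters can be found in real time from their design rules, but it does not say what those rules are. The following minimal sketch (Python with OpenCV, assuming OpenCV 4.x) therefore illustrates only a generic version of such a detection pipeline: extract convex quadrilateral regions from a camera frame as poster candidates and keep those that satisfy a hypothetical design rule, here an assumed fixed aspect ratio. The function find_poster_candidates, the aspect-ratio rule, and all thresholds are illustrative assumptions, not taken from the paper.

import cv2

# Assumption: posters follow an A4-like aspect ratio; this stands in for
# whatever design rule the SFINCS project actually defines.
ASSUMED_ASPECT = 297.0 / 210.0
ASPECT_TOLERANCE = 0.15   # allowed relative deviation (assumed)
MIN_AREA = 1000           # ignore tiny regions, in pixels (assumed)

def find_poster_candidates(frame_bgr):
    """Return 4-corner contours that look like rectangular posters."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    candidates = []
    for c in contours:
        # Approximate the contour; keep only convex quadrilaterals.
        approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        if len(approx) != 4 or not cv2.isContourConvex(approx):
            continue
        if cv2.contourArea(approx) < MIN_AREA:
            continue
        # Hypothetical design rule: rough aspect-ratio check on the
        # bounding box (ignores perspective distortion for simplicity).
        x, y, w, h = cv2.boundingRect(approx)
        aspect = max(w, h) / float(min(w, h))
        if abs(aspect - ASSUMED_ASPECT) / ASSUMED_ASPECT < ASPECT_TOLERANCE:
            candidates.append(approx.reshape(4, 2))
    return candidates

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)   # any camera or video file works here
    ok, frame = cap.read()
    if ok:
        quads = find_poster_candidates(frame)
        print("candidate posters:", len(quads))
    cap.release()

A full registration system would go on to identify each accepted candidate against known poster content and estimate the camera pose from its four corners; this sketch deliberately stops at candidate detection.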

Keywords

mixed reality · geometric registration · poster · semi-fiducial · authoring tool

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Ryuhei Tenmoku
  • Akito Nishigami
  • Fumihisa Shibata
  • Asako Kimura
  • Hideyuki Tamura
  1. Ritsumeikan University, Shiga, Japan