Measuring the Arrangement of Multiple Information Devices by Observing Their User’s Face

  • Saori Kikutani
  • Koh Kakusho (email author)
  • Takeshi Okadome
  • Masaaki Iiyama
  • Satoshi Nishiguchi
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9189)


Abstract

We propose to measure the 3D arrangement of multiple portable information devices operated by a single user from facial images captured by the cameras installed on those devices. Now that it is quite common to use multiple information devices at the same time, previous works have proposed various styles of cooperation among the devices, such as data transmission between them. Other works coordinate the screens so that they jointly display content larger than any single screen. These approaches obtain a 2D tiled arrangement of the screens by detecting contacts between devices using sensing hardware mounted on their edges. In contrast, our method estimates the arrangement of the devices in arbitrary 3D positions and orientations relative to the user's face, based on the face's appearance in the image captured by the camera on each device.
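The geometric idea behind the method can be illustrated with a short sketch (not the authors' implementation): if each device estimates the pose of its own camera relative to the user's face, e.g. by fitting a face model to the captured image, then the relative pose between any two devices follows by composing these transforms, since the face serves as a shared reference frame. The function names and the hand-picked example poses below are illustrative assumptions.

```python
import numpy as np

def pose_matrix(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def device_to_device(T_face_to_a, T_face_to_b):
    """Pose of device A's camera expressed in device B's camera frame.

    Each argument maps face-centered coordinates into that device's
    camera frame; composing one with the inverse of the other cancels
    the shared face frame, leaving only the arrangement between devices.
    """
    return T_face_to_b @ np.linalg.inv(T_face_to_a)

# Hypothetical example: the face sits 0.5 m in front of device A,
# while device B views it from 30 degrees to the side.
theta = np.deg2rad(30)
R_b = np.array([[np.cos(theta), 0.0, np.sin(theta)],
                [0.0, 1.0, 0.0],
                [-np.sin(theta), 0.0, np.cos(theta)]])
T_a = pose_matrix(np.eye(3), np.array([0.0, 0.0, 0.5]))
T_b = pose_matrix(R_b, np.array([0.1, 0.0, 0.6]))

# Arrangement of device A relative to device B, via the common face frame.
T_ab = device_to_device(T_a, T_b)
```

In the paper's setting, the per-device poses would be obtained by facial image processing and camera calibration; here they are fixed by hand purely to show how the composition removes the common reference frame.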


Keywords: Multiple portable devices · Device coordination · Screen arrangement · Facial image processing · Camera calibration



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Saori Kikutani (1)
  • Koh Kakusho (1, email author)
  • Takeshi Okadome (1)
  • Masaaki Iiyama (2)
  • Satoshi Nishiguchi (3)

  1. School of Science and Technology, Kwansei Gakuin University, Sanda, Japan
  2. Academic Center for Computing and Media Studies, Kyoto University, Kyoto, Japan
  3. Faculty of Information Science and Technology, Osaka Institute of Technology, Hirakata, Japan
