Making Blind People Autonomous in the Exploration of Tactile Models: A Feasibility Study

  • Francesco Buonamici
  • Rocco Furferi (email author)
  • Lapo Governi
  • Yary Volpe
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9176)

Abstract

Blind people are typically excluded from equal access to the world’s visual culture and are therefore often unable to obtain the concrete benefits of art education and enjoyment. This is particularly true for paintings, whose two-dimensional nature makes them impossible to explore through the sense of touch. The problem may be partially overcome by translating paintings into tactile bas-reliefs. However, evidence from recent studies suggests that tactile exploration alone is often not sufficient to fully understand and enjoy bas-reliefs, while the integration of different sensory stimuli dramatically enriches haptic exploration. Moreover, granting blind people the possibility of autonomously accessing and enjoying pictorial works of art is undoubtedly a good strategy for enriching their exploration. Accordingly, the main aim of the present work is to assess the feasibility of a new system consisting of a physical bas-relief, a vision system tracking the blind user’s hands during exploration, and an audio system providing verbal descriptions. The study, supported by preliminary tests, demonstrates the effectiveness of an approach capable of transforming a frustrating, bewildering and negative experience (i.e. mere tactile exploration) into one that is liberating, fulfilling, stimulating and fun.
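The core interaction loop described above couples a hand tracker with pre-recorded verbal descriptions: when the tracked fingertip dwells over an annotated area of the bas-relief, the corresponding description is played. The following is a minimal sketch of that region-lookup step, not the authors' implementation; the region names, coordinates and descriptions are illustrative placeholders, and the fingertip position is assumed to be already projected onto the relief plane by the vision system.

```python
# Hedged sketch (not the paper's actual implementation): mapping a tracked
# fingertip position on the bas-relief plane to a verbal description.
# All region names, coordinates and texts are hypothetical examples.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Region:
    """An annotated rectangular area of the bas-relief (coordinates in cm)."""
    name: str
    description: str
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        # Axis-aligned bounding-box test for the projected fingertip position.
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


# Hypothetical annotation of one bas-relief.
REGIONS = [
    Region("face", "The figure's face, turned slightly to the left.", 10, 20, 18, 30),
    Region("hand", "The right hand, resting on an open book.", 22, 5, 30, 12),
]


def describe(x: float, y: float) -> Optional[str]:
    """Return the verbal description under the fingertip, or None if the
    fingertip is over an unannotated part of the relief."""
    for region in REGIONS:
        if region.contains(x, y):
            return region.description
    return None
```

In a real system the lookup would be driven by the hand tracker's per-frame output and would need debouncing (e.g. triggering audio only after the fingertip dwells in a region for a few hundred milliseconds), so that descriptions are not retriggered on every frame.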

Keywords

Cultural heritage · Blind people · Hand tracking · Human-computer interaction

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Francesco Buonamici¹
  • Rocco Furferi¹ (email author)
  • Lapo Governi¹
  • Yary Volpe¹

  1. Department of Industrial Engineering of Florence, University of Florence, Florence, Italy