Abstract
Social interactions mediate our communication with others, enable the development and maintenance of personal and professional relationships, and contribute greatly to our health. While both verbal cues (i.e., speech) and non-verbal cues (e.g., facial expressions, hand gestures, and body language) are exchanged during social interactions, the latter convey the majority of the information (~65%). Given their inherently visual nature, non-verbal cues are largely inaccessible to individuals who are blind, putting this population at a social disadvantage compared to their sighted peers. For individuals who are blind, embarrassing social situations due to miscommunication are not uncommon and can lead to social avoidance and isolation. In this paper, we propose a mapping from visual facial expressions, represented as facial action units that may be extracted using computer vision algorithms, to haptic (vibrotactile) representations, toward discreet, real-time perception of facial expressions during social interactions by individuals who are blind.
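The core idea of the abstract, that is, detected facial action units (AUs) driving vibrotactile output, can be sketched as a lookup from AU codes to motor patterns. This is an illustrative sketch only, not the authors' actual mapping: the `VibrotactilePattern` fields, motor indices, and pattern values are all hypothetical; only the AU numbers and their names follow the standard Facial Action Coding System (e.g., AU6 = cheek raiser, AU12 = lip corner puller).

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VibrotactilePattern:
    motor: int         # index of a vibration motor on a hypothetical wearable
    intensity: float   # normalized drive amplitude, 0.0-1.0
    duration_ms: int   # pulse length in milliseconds

# Hypothetical AU-to-haptic mapping; AU numbers follow the Facial Action
# Coding System (AU4 = brow lowerer, AU6 = cheek raiser, AU12 = lip corner puller).
AU_TO_HAPTIC = {
    "AU4":  VibrotactilePattern(motor=2, intensity=0.7, duration_ms=250),
    "AU6":  VibrotactilePattern(motor=0, intensity=0.6, duration_ms=200),
    "AU12": VibrotactilePattern(motor=1, intensity=0.8, duration_ms=300),
}

def render_expression(active_aus):
    """Return the haptic patterns to play for the currently detected AUs."""
    return [AU_TO_HAPTIC[au] for au in active_aus if au in AU_TO_HAPTIC]

# A detected Duchenne smile (AU6 + AU12) would trigger two motor pulses.
patterns = render_expression(["AU6", "AU12"])
```

In a real system, the `active_aus` list would come from a facial-analysis pipeline running on a camera stream, and the patterns would be sent to motor drivers rather than returned as objects.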
Copyright information
© 2018 Springer Nature Switzerland AG
Cite this paper
McDaniel, T., Devkota, S., Tadayon, R., Duarte, B., Fakhri, B., Panchanathan, S. (2018). Tactile Facial Action Units Toward Enriching Social Interactions for Individuals Who Are Blind. In: Basu, A., Berretti, S. (eds) Smart Multimedia. ICSM 2018. Lecture Notes in Computer Science(), vol 11010. Springer, Cham. https://doi.org/10.1007/978-3-030-04375-9_1
Print ISBN: 978-3-030-04374-2
Online ISBN: 978-3-030-04375-9