Abstract
Previous research has shown that people are generally poor at distinguishing genuine from acted anger in facial expressions, with verbal responses reaching a mere 65% accuracy. We investigate whether a group of feedforward neural networks can do better using raw pupillary dilation signals from individual observers. Our results show that a single neural network cannot accurately discern the veracity of an emotion from raw physiological signals, achieving only 50.5% accuracy. Nonetheless, distinct neural networks trained on pupillary dilation signals from different individuals vary widely in their accuracy at discerning genuine anger, from 27.8% to 83.3%. By leveraging these differences, our novel Misaka neural networks combine predictions derived from different individuals’ pupillary dilation signals into an overall prediction more accurate than that of the highest-performing single individual, reaching 88.9%. Further research will investigate the correlation between two groups of high-performing predictors: those based on verbal answers and those based on pupillary dilation signals.
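The abstract does not spell out how the per-individual predictions are combined, but the acknowledgments credit a suggestion to apply Bayes’ theorem to the probability calculation. The following is a minimal sketch of one such combination rule, under the assumptions that each per-individual network casts an independent binary vote and has a known, symmetric accuracy; the function name and the example accuracies are hypothetical, chosen to span the 27.8%–83.3% range reported.

```python
import math

def combine_predictions(votes, accuracies, prior_genuine=0.5):
    """Combine independent binary votes (True = 'genuine') from several
    per-individual networks via Bayes' theorem, assuming conditional
    independence and a symmetric per-network accuracy."""
    # Start from the prior log-odds of the clip being genuine.
    log_odds = math.log(prior_genuine / (1.0 - prior_genuine))
    for vote, acc in zip(votes, accuracies):
        # A network with accuracy `acc` says 'genuine' with probability
        # acc when the clip is genuine and 1 - acc when it is acted.
        if vote:
            log_odds += math.log(acc / (1.0 - acc))
        else:
            log_odds += math.log((1.0 - acc) / acc)
    # Posterior probability that the expression is genuine.
    return 1.0 / (1.0 + math.exp(-log_odds))

# Three hypothetical networks; note that a below-chance network (27.8%)
# contributes evidence in the opposite direction of its vote.
p = combine_predictions([True, True, False], [0.833, 0.611, 0.278])
```

One consequence of this rule is that even unreliable individuals carry usable signal: a network that is wrong most of the time simply has its vote inverted by the log-odds term, which is one way a pooled prediction can exceed the best single individual.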
Acknowledgments
The authors thank Dongyang Li, Liang Zhang and Zihan Wang for suggesting the application of Bayes’ theorem in the probability calculation.
Copyright information
© 2018 Springer Nature Switzerland AG
Cite this paper
Qin, Z., Gedeon, T., Chen, L., Zhu, X., Hossain, M.Z. (2018). Artificial Neural Networks Can Distinguish Genuine and Acted Anger by Synthesizing Pupillary Dilation Signals from Different Participants. In: Cheng, L., Leung, A., Ozawa, S. (eds) Neural Information Processing. ICONIP 2018. Lecture Notes in Computer Science(), vol 11305. Springer, Cham. https://doi.org/10.1007/978-3-030-04221-9_27
Print ISBN: 978-3-030-04220-2
Online ISBN: 978-3-030-04221-9
eBook Packages: Computer Science, Computer Science (R0)