Abstract
With the increasing demand for automated agents capable of communicating with humans, the field of artificial intelligence has made considerable progress toward conversational agents able to sustain open-ended or topic-restricted conversations. Still, these agents remain far from matching the interactional capacities of humans. This article highlights the challenges that artificial social interaction still faces in contextualizing utterances within a conversation, whether in chatbots or in more complex social robots, through the processing of pragmatic cues, drawing on current knowledge in psychology and linguistics. It also suggests several points of interest for the development of artificial agents, aimed at improving their communication with humans, the relevance of their utterances, and their relationship with the people interacting with them. We argue that, in order to be recognized as a social agent, an artificial agent must follow rules similar to those humans follow when conversing with one another.
Notes
- 1.
This conversation is taken from a public message on Twitter, available at https://twitter.com/zochats/status/1009141014827761664. It is possible to chat with the bot directly on the same page, through private messages.
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this paper
Jacquet, B., Masson, O., Jamet, F., Baratgin, J. (2019). On the Lack of Pragmatic Processing in Artificial Conversational Agents. In: Ahram, T., Karwowski, W., Taiar, R. (eds) Human Systems Engineering and Design. IHSED 2018. Advances in Intelligent Systems and Computing, vol 876. Springer, Cham. https://doi.org/10.1007/978-3-030-02053-8_60
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-02052-1
Online ISBN: 978-3-030-02053-8
eBook Packages: Intelligent Technologies and Robotics (R0)