Using an Artificial Agent as a Behavior Model to Promote Assistive Technology Acceptance
Despite advances in assistive technology, studies report high rates of non-use. Given the rising number of people with disabilities, it is important to develop strategies that increase assistive technology acceptance. The current research investigated the use of an artificial agent (embedded into a system) as a persuasive behavior model to influence individuals' technology acceptance beliefs. Specifically, we examined the effect of agent-delivered behavior modeling versus two non-modeling instructional methods (agent-delivered instructional narration and no-agent, text-only instruction) on individuals' computer self-efficacy and perceived ease of use of an assistive technology. Overall, the results confirmed our hypotheses: using an artificial agent as a behavioral model led to increased computer self-efficacy and perceived ease of use of the system. The implications of including an artificial agent as a model to promote technology acceptance are discussed.
Keywords: Persuasive technology · Artificial agents · Behavior modeling · Assistive technology acceptance
This work was supported by the project MAMEM, which has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement No. 644780.