Using an Artificial Agent as a Behavior Model to Promote Assistive Technology Acceptance

  • Sofia Fountoukidou
  • Jaap Ham
  • Uwe Matzat
  • Cees Midden
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10809)


Despite technological advancements in assistive technologies, studies show high rates of non-use. Given the rising number of people with disabilities, it is important to develop strategies to increase assistive technology acceptance. The current research investigated the use of an artificial agent (embedded in a system) as a persuasive behavior model to influence individuals' technology acceptance beliefs. Specifically, we examined the effect of agent-delivered behavior modeling versus two non-modeling instructional methods (agent-delivered instructional narration, and text-only instruction without an agent) on individuals' computer self-efficacy and perceived ease of use of an assistive technology. Overall, the results of the study confirmed our hypotheses, showing that the use of an artificial agent as a behavioral model leads to increased computer self-efficacy and perceived ease of use of a system. The implications of including an artificial agent as a model to promote technology acceptance are discussed.


Keywords: Persuasive technology · Artificial agents · Behavior modeling · Assistive technology acceptance



This work was supported by the project MAMEM, which has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No. 644780.



Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Sofia Fountoukidou, Jaap Ham, Uwe Matzat, Cees Midden
    Human-Technology Interaction, Eindhoven University of Technology, Eindhoven, The Netherlands
