
Technical Support: Towards Mitigating Effects of Computer Anxiety on Acceptance of E-Assessment Amongst University Students in Sub Saharan African Countries

  • Kayode I. Adenuga
  • Victor W. Mbarika
  • Zacchaeus O. Omogbadegun
Conference paper
Part of the IFIP Advances in Information and Communication Technology book series (IFIPAICT, volume 558)

Abstract

The application of information technology in educational contexts has dramatically changed how people teach and learn. Institutions of higher learning worldwide are increasingly adopting e-assessment as a replacement for traditional pen-and-paper examinations because of its cost effectiveness, the improved reliability of machine marking, and accurate, timely results. Despite these benefits, it is unclear whether university students in Sub-Saharan African countries are willing to accept e-assessment. The purpose of this study is to examine the role of technical support in mitigating the effects of computer anxiety on electronic assessment among university students in Nigeria and Cameroon. To this end, the study extended the Technology Acceptance Model, validated using 102 responses collected randomly across universities in Nigeria and Cameroon. The study contributes to the body of knowledge by establishing that computer anxiety is an important factor that can affect university students regardless of their level of computer proficiency. The outcome of the proposed model indicates that when technical assistance is provided during e-assessment, computer anxiety is reduced for the majority of university students in Nigeria and Cameroon. The practical implication is that students’ actual academic potential may go unrecognised unless education policy makers and university administrators ensure that all measures that can reduce the fear associated with using computers for assessment, including technical support, are put in place.
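The analysis described above can be illustrated with a minimal moderation sketch. This is not the authors' code or data: the variable names (CA for computer anxiety, TS for technical support, BI for behavioural intention) and all effect sizes are hypothetical assumptions, chosen only to show the kind of relationship the study reports, where technical support weakens the negative effect of computer anxiety on intention to use e-assessment.

```python
# Hypothetical sketch of a moderation analysis in the spirit of the study's
# extended TAM. All coefficients and the data-generating process are assumed
# for illustration; only the sample size (n = 102) comes from the abstract.
import numpy as np

rng = np.random.default_rng(42)
n = 102  # matches the study's 102 survey responses

ca = rng.normal(0.0, 1.0, n)  # computer anxiety (standardised, assumed)
ts = rng.normal(0.0, 1.0, n)  # technical support (standardised, assumed)

# Assumed process: anxiety lowers intention, support raises it, and the
# positive interaction term means support dampens the anxiety effect.
bi = -0.5 * ca + 0.4 * ts + 0.3 * ca * ts + rng.normal(0.0, 0.5, n)

# Ordinary least squares with an interaction term: BI ~ CA + TS + CA*TS
X = np.column_stack([np.ones(n), ca, ts, ca * ts])
beta, *_ = np.linalg.lstsq(X, bi, rcond=None)
for name, b in zip(["const", "CA", "TS", "CAxTS"], beta):
    print(f"{name}: {b:+.3f}")
```

A negative CA coefficient together with a positive CAxTS coefficient is the signature of the reported finding: the higher the technical support, the weaker anxiety's drag on intention to use e-assessment. The actual study used a structural-equation (PLS-SEM style) model rather than plain OLS; this sketch only conveys the moderation logic.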

Keywords

E-Learning · E-Assessment · Anxiety · Computer anxiety


Copyright information

© IFIP International Federation for Information Processing 2019

Authors and Affiliations

  1. School of ICT, ICT University (USA), Yaoundé, Cameroon
  2. Southern University and A&M College, Baton Rouge, USA
