
Dynamic Generation of Assessment Items Using Wikidata

  • Michael Striewe
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 1014)

Abstract

Automated generation of assessment items can provide large item pools for formative assessments with little effort. However, if the generation process produces self-contained items, these need to be updated or re-generated each time the data source used for generation changes. This paper describes and discusses an alternative approach that dynamically retrieves item content from Wikidata using SPARQL queries. The paper compares four different examples and discusses both benefits and limitations of this approach. Results show that the approach is usable for a broad range of different items for formative assessment scenarios and that limitations are manageable with acceptable effort.
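The paper's own item templates and their integration into the JACK system are not reproduced on this page. The following sketch merely illustrates the general idea of retrieving item content dynamically from Wikidata via SPARQL at the time an item is delivered: the query, the use of the public Wikidata Query Service endpoint, and the helper functions are illustrative assumptions, not the author's implementation.

```python
# Illustrative sketch only: fetch facts from Wikidata via SPARQL and turn them
# into a simple multiple-choice item. The query and helpers are assumptions
# for demonstration, not the implementation described in the paper.
import random
import requests

WDQS_ENDPOINT = "https://query.wikidata.org/sparql"

# Countries (P31 = instance of, Q6256 = country) and their capitals (P36).
QUERY = """
SELECT ?countryLabel ?capitalLabel WHERE {
  ?country wdt:P31 wd:Q6256 ;
           wdt:P36 ?capital .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 50
"""

def fetch_country_capitals():
    """Run the SPARQL query and return a list of (country, capital) pairs."""
    response = requests.get(
        WDQS_ENDPOINT,
        params={"query": QUERY, "format": "json"},
        headers={"User-Agent": "item-generation-sketch/0.1"},
        timeout=30,
    )
    response.raise_for_status()
    rows = response.json()["results"]["bindings"]
    return [(r["countryLabel"]["value"], r["capitalLabel"]["value"]) for r in rows]

def build_mc_item(pairs, num_distractors=3):
    """Build one multiple-choice item: ask for a country's capital and use
    capitals of other countries as distractors."""
    country, capital = random.choice(pairs)
    distractors = random.sample(
        [c for _, c in pairs if c != capital], num_distractors
    )
    options = distractors + [capital]
    random.shuffle(options)
    return {
        "stem": f"What is the capital of {country}?",
        "options": options,
        "answer": capital,
    }

if __name__ == "__main__":
    item = build_mc_item(fetch_country_capitals())
    print(item["stem"])
    for option in item["options"]:
        print(" -", option)
```

Because the query runs when the item is requested, the item content stays in sync with Wikidata without re-generating a static item pool; this is the trade-off between self-contained generated items and dynamically retrieved content that the paper discusses.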

Keywords

Assessment item generation, Wikidata, SPARQL

Notes

Acknowledgements

The author would like to thank Gerwin Rajkowski for his work on implementing a first prototype for SPARQL query integration within JACK as part of his bachelor’s thesis. The German Federal Ministry of Education and Research funded parts of the research for this paper under grant number 01PL16075.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. University of Duisburg-Essen, Essen, Germany
