Dynamic Generation of Assessment Items Using Wikidata

  • Conference paper
  • Published in: Technology Enhanced Assessment (TEA 2018)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1014)

Abstract

Automated generation of assessment items can provide large item pools for formative assessments with little effort. However, if the generation process produces self-contained items, these need to be updated or re-generated each time the data source used for generation changes. This paper describes and discusses an alternative approach that dynamically retrieves item content from Wikidata using SPARQL queries. The paper compares four different examples and discusses both benefits and limitations of this approach. Results show that the approach is usable for a broad range of different items for formative assessment scenarios and that limitations are manageable with acceptable effort.
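
To make the idea concrete, the following is a minimal sketch (not a query reproduced from the paper) of the kind of SPARQL query an item template could embed and send to the public Wikidata endpoint at https://query.wikidata.org/sparql when the item is delivered. It uses the European Union (Q458), "member of" (P463) and "capital" (P36), which are also referenced in the notes below; the queries actually used in the paper's examples may differ.

    # Sketch: retrieve all EU member states together with their capitals, so that
    # an item engine can pick one row for the question stem (e.g. "What is the
    # capital of <country>?") and use the remaining capitals as distractors.
    SELECT ?countryLabel ?capitalLabel WHERE {
      ?country wdt:P463 wd:Q458 .   # P463 = member of, Q458 = European Union
      ?country wdt:P36 ?capital .   # P36  = capital
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
    }
    ORDER BY ?countryLabel

Because such a query is evaluated at delivery time, changes in Wikidata (for instance a new member state or a moved capital) are reflected automatically without re-generating the item.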


Notes

  1. https://www.wikidata.org/wiki/Q727.
  2. Item live-demo: https://jack-demo.s3.uni-due.de/jack2/demo?id=63144.
  3. https://www.wikidata.org/wiki/Q515.
  4. https://www.wikidata.org/wiki/Q839954.
  5. https://www.wikidata.org/wiki/Q182547.
  6. https://www.wikidata.org/wiki/Property:P131.
  7. Item live-demo: https://jack-demo.s3.uni-due.de/jack2/demo?id=63140.
  8. https://www.wikidata.org/wiki/Q11173.
  9. https://www.wikidata.org/wiki/Property:P117.
  10. https://www.wikidata.org/wiki/Property:P233.
  11. Item live-demo: https://jack-demo.s3.uni-due.de/jack2/demo?id=63131.
  12. https://www.wikidata.org/wiki/Q458.
  13. https://www.wikidata.org/wiki/Property:P2046.
  14. https://www.wikidata.org/wiki/Property:P1082.
  15. https://www.wikidata.org/wiki/Property:P463.
  16. https://www.wikidata.org/wiki/Property:P36.
  17. Item live-demo: https://jack-demo.s3.uni-due.de/jack2/demo?id=62162.
  18. https://www.wikidata.org/wiki/Property:P31.
  19. https://www.wikidata.org/wiki/Q1221156.
  20. https://www.wikidata.org/wiki/Property:P6.
  21. https://www.wikidata.org/wiki/Property:P576.
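
Notes 8-10 list the identifiers behind the chemistry example. Purely as an assumption-laden sketch (the paper's actual query is not reproduced here), they could be combined into a single query that returns chemical compounds together with their canonical SMILES notation and, where available, a structure image:

    # Sketch: chemical compounds (Q11173) with their canonical SMILES (P233)
    # and an optional structure image (P117).
    SELECT ?compound ?compoundLabel ?smiles ?image WHERE {
      ?compound wdt:P31 wd:Q11173 .              # instance of: chemical compound
      ?compound wdt:P233 ?smiles .               # canonical SMILES
      OPTIONAL { ?compound wdt:P117 ?image . }   # chemical structure image
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
    }
    LIMIT 25

A delivered item could then show either the SMILES string or the structure image and ask for the compound's name, drawing distractors from the other result rows.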


Acknowledgements

The author would like to thank Gerwin Rajkowski for his work on implementing a first prototype for SPARQL query integration within JACK as part of his bachelor's thesis. The German Federal Ministry of Education and Research funded parts of the research for this paper under grant number 01PL16075.

Author information

Correspondence to Michael Striewe.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Striewe, M. (2019). Dynamic Generation of Assessment Items Using Wikidata. In: Draaijer, S., Joosten-ten Brinke, D., Ras, E. (eds) Technology Enhanced Assessment. TEA 2018. Communications in Computer and Information Science, vol 1014. Springer, Cham. https://doi.org/10.1007/978-3-030-25264-9_1

  • DOI: https://doi.org/10.1007/978-3-030-25264-9_1

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-25263-2

  • Online ISBN: 978-3-030-25264-9

  • eBook Packages: Computer Science, Computer Science (R0)
