A Dynamic and Informative Intelligent Survey System Based on Knowledge Graph

  • Patrik Bansky
  • Elspeth Edelstein
  • Jeff Z. Pan (email author)
  • Adam Wyner
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 12032)


In this paper we propose a dynamic and informative intelligent survey system based on a knowledge graph. To illustrate our proposal, we focus on ordering the questions of the questionnaire component by their acceptance, along with conditional triggers that further customise participants' experience, making the system dynamic. Evaluation of the system shows that the dynamic component can reduce the number of questions asked and improve data quality, allowing more informative data to be collected in a survey of equivalent length. Fine-grained analysis allows assessment of the interaction of specific variables, as well as of individual respondents rather than only global results. The paper explores and evaluates two algorithms for the presentation of survey questions, leading to additional insights into how to improve the system.
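The abstract's core mechanism, ordering questions by an acceptance score and using conditional triggers to gate follow-up questions, can be sketched as follows. This is a minimal illustration of the general idea, not the paper's actual algorithms; the `Question` class, the `acceptance` score, and the `trigger` field are all hypothetical names introduced here for exposition.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Question:
    qid: str
    text: str
    acceptance: float              # assumed score; higher means asked earlier
    trigger: Optional[str] = None  # qid whose "yes" answer enables this question

def next_questions(questions, answers):
    """Return the remaining questions, highest acceptance first, with
    conditional questions hidden until their trigger has been answered."""
    eligible = [
        q for q in questions
        if q.qid not in answers
        and (q.trigger is None or answers.get(q.trigger) == "yes")
    ]
    return sorted(eligible, key=lambda q: q.acceptance, reverse=True)

qs = [
    Question("q1", "Is this sentence acceptable?", 0.9),
    Question("q2", "Rate this variant.", 0.6),
    Question("q2a", "Follow-up on the variant.", 0.8, trigger="q2"),
]
print([q.qid for q in next_questions(qs, {})])             # q2a hidden until q2 is answered
print([q.qid for q in next_questions(qs, {"q2": "yes"})])  # q2a now unlocked
```

Because answered questions are filtered out and triggers gate follow-ups, a participant only ever sees the questions relevant to their earlier answers, which is how a dynamic survey can shorten questionnaires without losing informative data.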


Keywords: Intelligent survey system · Dynamic and informative system · Linguistic grammaticality judgements



Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Patrik Bansky (1)
  • Elspeth Edelstein (2)
  • Jeff Z. Pan (1, email author)
  • Adam Wyner (3)

  1. Department of Computing Science, University of Aberdeen, Aberdeen, UK
  2. School of Language, Literature, Music and Visual Culture, University of Aberdeen, Aberdeen, UK
  3. School of Law and Department of Computer Science, Swansea University, Swansea, UK
