Subjective Preferences Towards Various Conditions of Self-Administered Questionnaires: AHP and Conjoint Analyses

  • Rafał Michalski
  • Marta Staniów
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10918)

Abstract

Using questionnaires to elicit data from respondents has a long history. The present paper focuses on subjects’ preferences towards specific self-administered questionnaire designs and the circumstances under which such surveys are carried out. Three factors are examined: assistant presence (yes or no), survey form (paper or electronic), and scale type (visual analogue or Likert). A pairwise comparison technique was employed to obtain participants’ opinions, and the relative preferences were calculated according to the Analytic Hierarchy Process (AHP) methodology. The conjoint analysis employed in this study provided partial utilities of the examined factor levels and relative importances of the effects. Apart from verifying the statistical significance of the investigated factors, the analysis of variance also revealed possible interactions between them.
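The AHP step mentioned above derives priority weights from a pairwise comparison matrix, conventionally via its principal eigenvector, together with a consistency check. A minimal sketch of that computation in Python follows; the 3×3 matrix values are purely illustrative (they are not the study’s data), chosen only to mirror the three factors under investigation.

```python
import numpy as np

# Hypothetical pairwise comparison matrix on Saaty's 1-9 scale for the
# three factors (assistant presence, survey form, scale type).
# Illustrative values only -- not taken from the paper.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# The principal right eigenvector of A yields the AHP priority weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)          # index of the largest eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                         # normalise so the weights sum to 1

# Consistency check: a consistency ratio (CR) below 0.10 is
# conventionally considered acceptable.
n = A.shape[0]
lam_max = eigvals.real[k]
ci = (lam_max - n) / (n - 1)         # consistency index
ri = 0.58                            # Saaty's random index for n = 3
cr = ci / ri                         # consistency ratio

print("weights:", np.round(w, 3))
print("CR:", round(cr, 3))
```

With the illustrative matrix above, the first factor receives the largest weight and the matrix is nearly consistent (CR well below 0.10), which is the pattern one checks for before interpreting AHP priorities.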

Keywords

Questionnaire design · Subjects’ preferences · Survey form · Scale type · Surveyor presence

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Faculty of Computer Science and Management, Wrocław University of Science and Technology, Wrocław, Poland