Human Collaboration on Crowdsourcing Platforms – a Content Analysis

  • Navid Tavanapour
  • Eva A. C. Bittner
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11589)


The crowdsourcing phenomenon offers the opportunity to address an open call to the crowd. Crowd workers may work together to find a solution that satisfies the open call. One of the major benefits for a crowdsourcer is the pool of crowd workers that can be accessed via crowdsourcing platforms. However, the outcomes produced by crowd workers are often weakly elaborated and of low quality. The key to high-quality work is the collaboration of crowd workers, which has already been addressed in the collaboration process design framework for crowdsourcing (CPDF). We position this work at that point and widen our view by conducting a content analysis of crowdsourcing platforms, in order to better understand the collaboration of crowd workers on real-world crowdsourcing platforms and to investigate the weaknesses of the CPDF so as to improve work practices. Based on the results of the content analysis, we redesign the CPDF and present an improved collaboration process of crowd workers within it.


Keywords: Human collaboration · Crowdsourcing · Content analysis · Design science research · Crowd work · CPDF



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. University of Hamburg, Hamburg, Germany