The Power of the Crowd: Performing Usability Testing Using an On-Demand Workforce

Conference paper

Abstract

For many business organizations, Web sites are an important tool for attracting new customers and for advertising or selling products and services. Any problem with a Web site's usability can therefore directly affect a company's bottom line. Despite this importance, examples of poorly designed Web pages abound, and such poor design is often blamed on a lack of thorough usability testing. While even a few tests are better than none at all, smaller organizations in particular lack the resources to perform multiple rounds of in-depth user studies. Recently, crowdsourcing has become a popular way of recruiting an on-demand workforce for a variety of tasks. Our study demonstrates how crowdsourcing can be used as a relatively easy and inexpensive way to recruit participants for usability tests, and it suggests ways to compensate for the inability to observe participants during the testing process. Further, we show that the crowds detected at least as many usability problems as experts asked to evaluate the same sites.
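To make the recruiting step concrete, the sketch below shows one way participants for a remote usability test could be recruited through Amazon Mechanical Turk. This is a minimal illustration, not the authors' actual setup: it uses today's boto3 SDK (which postdates the paper), and the study URL, reward amount, and HIT parameters are illustrative assumptions.

```python
"""Minimal sketch: recruiting usability-test participants via
Amazon Mechanical Turk using boto3. URL and values are placeholders."""
import boto3

# Use the requester sandbox endpoint while piloting the task;
# switch to the production endpoint for the real study.
mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# An ExternalQuestion embeds our own test page (tasks, questionnaires,
# and any client-side logging) inside the worker's view of the HIT.
external_question = """
<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
  <ExternalURL>https://usability-study.example.com/session</ExternalURL>
  <FrameHeight>600</FrameHeight>
</ExternalQuestion>
"""

hit = mturk.create_hit(
    Title="Try three tasks on a website and report any problems",
    Description="Complete short navigation tasks, then describe "
                "anything that was confusing or did not work.",
    Keywords="usability, website, feedback",
    Reward="0.50",                     # USD per assignment; illustrative
    MaxAssignments=20,                 # number of participants to recruit
    LifetimeInSeconds=3 * 24 * 3600,   # how long the HIT stays visible
    AssignmentDurationInSeconds=1800,  # time allowed per participant
    Question=external_question,
)
print("HIT created:", hit["HIT"]["HITId"])
```

Hosting the test on an external page of one's own is also one way to address the observation problem the abstract mentions, since that page can log task completion times and interaction events even though the experimenter cannot watch participants directly.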

Keywords

Harness 

Acknowledgments

The work described in this chapter was substantially supported by a research grant from City University of Hong Kong (Project No. 7200147).

Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

Department of Information Systems, City University of Hong Kong, Kowloon, Hong Kong SAR
