The Future of Usability Evaluation: Increasing Impact on Value

  • Chapter
Maturing Usability

Part of the book series: Human-Computer Interaction Series ((HCIS))

Abstract

What does the future of usability evaluation hold? To gain insight, this chapter first surveys past and current usability practices, including laboratory usability testing, heuristic evaluation, methods with roots in anthropology (such as contextual inquiry and ethnographic research), rapid iterative testing, benchmarking with large population samples, and multiple-method usability programs. Surveying this history is valuable because individual usability practitioners and organizations have attained different levels of usability sophistication and maturity: evaluation methods long employed by major corporations may still lie in the future for smaller or younger organizations. The chapter begins by discussing 20th-century usability evaluation, continues with an overview of usability evaluation today, and concludes with a discussion of what to expect in usability evaluation over the coming years. For each period in the history, and the future, of usability evaluation, the chapter addresses how its impact on software value is increasing.




Copyright information

© 2008 Springer-Verlag London Limited

About this chapter

Cite this chapter

Rosenbaum, S. (2008). The Future of Usability Evaluation: Increasing Impact on Value. In: Law, E. L.-C., Hvannberg, E. T., & Cockton, G. (Eds.), Maturing Usability. Human-Computer Interaction Series. Springer, London. https://doi.org/10.1007/978-1-84628-941-5_15

Download citation

  • DOI: https://doi.org/10.1007/978-1-84628-941-5_15

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-84628-940-8

  • Online ISBN: 978-1-84628-941-5

  • eBook Packages: Computer Science (R0)
