
Remote Usability Evaluation: Discussion of a General Framework and Experiences from Research with a Specific Tool

Chapter in Maturing Usability

Part of the book series: Human-Computer Interaction Series (HCIS)

Abstract

The goal of this chapter is to present a design space for tools and methods that support remote usability evaluation of interactive applications. This type of approach is gaining importance because it allows usability to be evaluated while users remain in their daily environments. Several techniques have been developed in this area to address the various types of applications that can be used in different contexts. We discuss them within a unifying framework that can be used to compare the strengths and weaknesses of the various approaches and to identify areas that require further research in order to exploit all the possibilities opened up by remote evaluation.

Copyright information

© 2008 Springer-Verlag London Limited

About this chapter

Cite this chapter

Paternò, F., Santoro, C. (2008). Remote Usability Evaluation: Discussion of a General Framework and Experiences from Research with a Specific Tool. In: Law, E.L.-C., Hvannberg, E.T., Cockton, G. (eds) Maturing Usability. Human-Computer Interaction Series. Springer, London. https://doi.org/10.1007/978-1-84628-941-5_9

  • DOI: https://doi.org/10.1007/978-1-84628-941-5_9

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-84628-940-8

  • Online ISBN: 978-1-84628-941-5

  • eBook Packages: Computer Science, Computer Science (R0)
