User Study Techniques in the Design and Evaluation of a Ubicomp Environment

  • Sunny Consolvo
  • Larry Arnstein
  • B. Robert Franza
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2498)


To be successful, ubicomp applications must be designed with their environment and users in mind and evaluated to confirm that they do not disrupt the users’ natural workflow. Well-established techniques for understanding users and their environment exist, but are not specifically designed to assess how well the computing and physical task environments blend. We present strengths and weaknesses of several qualitative and quantitative user study techniques for ubicomp. We applied these techniques to the design and evaluation of a ubicomp application for cell biology laboratories (Labscape). We describe how these techniques helped identify design considerations that were crucial for Labscape’s adoption and demonstrate their ability to measure how effectively applications blend into an environment.


Keywords: Cognitive Load · Ubiquitous Computing · Contrived Task · Representative User · Authentic Setting





Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • Sunny Consolvo (1)
  • Larry Arnstein (2)
  • B. Robert Franza (3)

  1. Intel Research Seattle, USA
  2. Department of Computer Science & Engineering, University of Washington, USA
  3. Cell Systems Initiative, Department of Bioengineering, University of Washington, USA
