Usability Reporting with UsabML

  • Johannes Feiner
  • Keith Andrews
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7623)


Usability practitioners conduct formative evaluations, such as heuristic evaluations and thinking-aloud tests, to identify potential problems in a user interface as part of the iterative design cycle. The findings of a formative evaluation (in essence, a list of potential problems) are usually compiled into written reports and typically delivered as a PDF or Word document. A written report is convenient to read, but makes it difficult to reuse the findings electronically. The Usability Markup Language (UsabML) defines a structured reporting format for the results of usability evaluations. In agile software development, the direct handover of usability findings to software engineers can speed up development cycles and improve software quality.
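To give a flavour of what a structured usability report looks like, a minimal UsabML document might resemble the sketch below. The element and attribute names here are illustrative assumptions for this summary, not the published UsabML schema:

```xml
<!-- Hypothetical UsabML fragment (element names are illustrative
     assumptions, not the official schema). -->
<usability-report>
  <evaluation method="heuristic-evaluation">
    <finding id="f-001" severity="3">
      <title>Search button label is ambiguous</title>
      <description>Users expected "Go" to submit the form.</description>
    </finding>
  </evaluation>
</usability-report>
```

The key point is that each finding is an addressable, machine-readable record (identifier, severity, title, description) rather than a paragraph buried in a PDF.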

Usability managers can now enter the findings of formative evaluations into a new, web-based system called the Usability Reporting Manager (URM). Findings can be exported in UsabML format, which in turn can easily be imported by software engineers into an issue-tracking system connected to a source code repository. UsabML can also be transformed into other formats, such as HTML and PDF, via XSL stylesheets.
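Because UsabML is plain XML, the import step into an issue tracker needs only a few lines of code. The sketch below parses a hypothetical UsabML fragment (the element and attribute names are illustrative assumptions, not the published schema) and maps each finding to a generic issue payload, of the kind a tracker's REST API would accept:

```python
import xml.etree.ElementTree as ET

# Hypothetical UsabML fragment; element names are illustrative
# assumptions, not the official UsabML schema.
USABML = """
<usability-report>
  <evaluation method="heuristic-evaluation">
    <finding id="f-001" severity="3">
      <title>Search button label is ambiguous</title>
      <description>Users expected "Go" to submit the form.</description>
    </finding>
    <finding id="f-002" severity="1">
      <title>Low contrast on footer links</title>
      <description>Footer links are hard to read on the grey background.</description>
    </finding>
  </evaluation>
</usability-report>
"""

def findings_to_issues(xml_text):
    """Map each UsabML <finding> to a generic issue-tracker payload."""
    root = ET.fromstring(xml_text)
    issues = []
    for finding in root.iter("finding"):
        issues.append({
            "external_id": finding.get("id"),
            "title": finding.findtext("title"),
            "body": finding.findtext("description"),
            "labels": ["usability", "severity-" + finding.get("severity")],
        })
    return issues

issues = findings_to_issues(USABML)
```

Keeping the finding's original `id` in the issue (here as `external_id`) lets the tracker link each bug back to the evaluation report it came from.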


Keywords: formative evaluation, usability findings exchange, XML, reporting format



Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Johannes Feiner (1)
  • Keith Andrews (2)
  1. FH JOANNEUM, Internet Technology, Kapfenberg, Austria
  2. Graz University of Technology, Graz, Austria
