Abstract
In Part II, we discussed research efforts aimed at developing or refining automated methodologies for evaluating web interfaces. Some of these efforts have produced commercial or research software (e.g., WatchFire Bobby [WatchFire, 2002], the HTML Validation Service [World Wide Web Consortium, 2001c], the Web Static Analyzer Tool [Scholtz and Laskowski, 1998], and many others) that practitioners can use to evaluate web site usability, accessibility, coding, performance, and so on. In the remainder of this book (Part III), we examine the use of automated evaluation tools, and we discuss the role and efficacy of existing tools.
Notes
We use the terms W3C HTML Validation Service, W3C HTML Validator, and W3C Validator interchangeably throughout this book.
We use the terms 508 Accessibility Suite and 508 Suite interchangeably throughout this book.
Researchers at the Center for Applied Special Technology originally developed Bobby; WatchFire acquired it in July 2002.
Copyright information
© 2003 Springer Science+Business Media Dordrecht
Cite this chapter
Ivory, M.Y. (2003). Automated Evaluation Tools. In: Automated Web Site Evaluation. Human-Computer Interaction Series, vol 4. Springer, Dordrecht. https://doi.org/10.1007/978-94-017-0375-8_10
DOI: https://doi.org/10.1007/978-94-017-0375-8_10
Publisher Name: Springer, Dordrecht
Print ISBN: 978-90-481-6446-2
Online ISBN: 978-94-017-0375-8
eBook Packages: Springer Book Archive