Benchmarking Models and Tools for Distributed Web-Server Systems

  • Mauro Andreolini
  • Valeria Cardellini
  • Michele Colajanni
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2459)


This tutorial reviews benchmarking tools and techniques for evaluating the performance and scalability of highly accessed Web-server systems. The focus is on the design and testing of locally and geographically distributed architectures, where performance is evaluated through workload generators and analyzers in a laboratory environment. The tutorial identifies the strengths and limitations of existing tools with respect to the main features that characterize a benchmarking tool (workload representation, load generation, data collection, output analysis and reporting) and discusses their applicability to the analysis of distributed Web-server systems.
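The four features named above (workload representation, load generation, data collection, output analysis) can be illustrated with a minimal, self-contained sketch. This is not any of the tools surveyed in the tutorial; all names here are illustrative, the workload is a trivial sequence of identical GET requests, and the "server under test" is a throwaway local HTTP server started just so the example runs on its own.

```python
# Minimal sketch of the stages a Web benchmarking tool combines:
# workload representation, load generation, data collection, and report.
# Illustrative only; not one of the tools surveyed in the tutorial.
import threading
import time
import urllib.request
from http.server import HTTPServer, BaseHTTPRequestHandler
from statistics import mean

class _Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"x" * 128                     # fixed-size response for the demo
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):             # silence per-request logging
        pass

def run_benchmark(n_requests=20):
    # Start a throwaway local server so the sketch is self-contained.
    server = HTTPServer(("127.0.0.1", 0), _Handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    url = f"http://127.0.0.1:{server.server_port}/"

    # Workload representation: here just n identical GET requests;
    # real tools model think times, sessions, and file-size distributions.
    latencies = []
    for _ in range(n_requests):
        t0 = time.perf_counter()              # load generation ...
        with urllib.request.urlopen(url) as resp:
            resp.read()
        latencies.append(time.perf_counter() - t0)  # ... and data collection
    server.shutdown()

    # Output analysis and report: mean and 90th percentile of response time.
    latencies.sort()
    p90 = latencies[int(0.9 * (len(latencies) - 1))]
    return {"mean_s": mean(latencies), "p90_s": p90, "requests": n_requests}

stats = run_benchmark()
print(stats)
```

A real benchmarking tool differs mainly in the first stage: instead of identical back-to-back requests, it draws requests from an empirical or analytical workload model, which is precisely where the tools surveyed here diverge most.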





Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • Mauro Andreolini (1)
  • Valeria Cardellini (1)
  • Michele Colajanni (2)
  1. Dept. of Computer, Systems and Production, University of Roma “Tor Vergata”, Roma, Italy
  2. Dept. of Information Engineering, University of Modena, Modena, Italy
