AWPS – An Architecture for Pro-active Web Performance Management

  • Gabriele Kotsis
  • Martin Pinzger
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6821)


Quality and performance have become discriminating factors in the field of software applications. In the area of web applications in particular, performance is a key success factor, creating the need for new kinds of performance evaluation models and methods capable of representing the dynamic characteristics of web environments.

In this paper we recall seminal work in this area and present AWPS, a tool for automatic web performance simulation and prediction. AWPS is capable of (a) automatically creating a web performance simulation and (b) conducting trend analysis of the system under test. The operation and usage of the tool are demonstrated in a case study of a two-tier architecture system.
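To make the two-tier scenario concrete, the following is a minimal, hypothetical sketch of the kind of simulation such a tool might generate: a tandem queue in which each request passes through a web tier and then a database tier, both modeled as single-server FIFO stations with exponential service times. This is an illustration only, not the AWPS implementation; all rates and names are assumptions.

```python
import random

def simulate_two_tier(n_requests=10_000, arrival_rate=50.0,
                      web_rate=120.0, db_rate=90.0, seed=42):
    """Simulate a two-tier tandem queue (web server -> database) and
    return the mean end-to-end response time in seconds.

    All parameters are illustrative, not taken from the paper:
    arrival_rate, web_rate, db_rate are in requests per second.
    """
    rng = random.Random(seed)
    t_arrival = 0.0   # arrival time of the current request
    web_free = 0.0    # time at which the web server next becomes idle
    db_free = 0.0     # time at which the DB server next becomes idle
    total_resp = 0.0
    for _ in range(n_requests):
        t_arrival += rng.expovariate(arrival_rate)
        # Web tier: service starts once the request has arrived
        # and the server is free.
        start_web = max(t_arrival, web_free)
        web_free = start_web + rng.expovariate(web_rate)
        # Database tier: the request joins the DB queue after
        # finishing at the web tier.
        start_db = max(web_free, db_free)
        db_free = start_db + rng.expovariate(db_rate)
        total_resp += db_free - t_arrival
    return total_resp / n_requests

mean_rt = simulate_two_tier()
print(f"mean response time: {mean_rt * 1000:.1f} ms")
```

With the assumed rates, the result should be close to the Jackson-network prediction 1/(120-50) + 1/(90-50) ≈ 39 ms; comparing simulated and predicted values like this is the kind of cross-check a model-generation tool needs for validation.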


Keywords: Web Performance, Simulation, Automation



Copyright information

© IFIP International Federation for Information Processing 2011

Authors and Affiliations

  • Gabriele Kotsis (1)
  • Martin Pinzger (1)
  1. Department of Telecooperation, Johannes Kepler University, Linz, Austria
