Improved Feedback for Architectural Performance Prediction Using Software Cartography Visualizations

  • Klaus Krogmann
  • Christian M. Schweda
  • Sabine Buckl
  • Michael Kuperberg
  • Anne Martens
  • Florian Matthes
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5581)

Abstract

Software performance engineering provides techniques to analyze and predict the performance (e.g., response time or resource utilization) of software systems in order to avoid implementations with insufficient performance. These techniques operate on models of software, often at an architectural level, to enable early, design-time predictions for evaluating design alternatives. Current software performance engineering approaches allow the prediction of performance at design time, but often provide cryptic results (e.g., lengths of queues). Humans can hardly map such prediction results back to the software architecture, which makes it difficult to derive the right design decisions. In this paper, we integrate software cartography (a map technique) with software performance engineering to overcome the limited interpretability of raw performance prediction results. Our approach is based on model transformations and a general software visualization approach. It provides an intuitive mapping of prediction results to the software architecture, which simplifies design decisions. We successfully evaluated our approach in a quasi-experiment with 41 participants, comparing the correctness of performance-improving design decisions and the participants’ time effort when using our novel approach versus an existing software performance visualization.
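The core idea of the paper, transforming raw prediction results (e.g., response times, resource utilization) into annotations on an architectural map, can be illustrated with a minimal sketch. This is not the authors' implementation, which is based on model transformations over architectural models; the component names, utilization thresholds, and SVG layout below are assumptions chosen purely for illustration.

```python
# Minimal, hypothetical sketch: turn per-component performance predictions into a
# simple map-style visualization layer (colored rectangles), instead of presenting
# raw numbers such as queue lengths. Thresholds and layout are illustrative only.

from dataclasses import dataclass

@dataclass
class Component:
    name: str
    utilization: float        # predicted resource utilization, 0.0 .. 1.0
    mean_response_ms: float   # predicted mean response time in milliseconds

def color_for(utilization: float) -> str:
    """Map a utilization value onto a traffic-light color (hypothetical thresholds)."""
    if utilization < 0.5:
        return "#4caf50"   # green: uncritical
    if utilization < 0.8:
        return "#ffc107"   # yellow: approaching saturation
    return "#f44336"       # red: likely bottleneck

def to_svg(components: list[Component]) -> str:
    """Render components as colored boxes annotated with response times."""
    boxes = []
    for i, c in enumerate(components):
        x = 10 + i * 160
        boxes.append(
            f'<rect x="{x}" y="10" width="150" height="60" fill="{color_for(c.utilization)}"/>'
            f'<text x="{x + 5}" y="45">{c.name}: {c.mean_response_ms:.0f} ms</text>'
        )
    return (f'<svg xmlns="http://www.w3.org/2000/svg" width="600" height="80">'
            f'{"".join(boxes)}</svg>')

if __name__ == "__main__":
    # Hypothetical prediction results for three architectural components.
    prediction = [
        Component("WebFrontend", 0.35, 120.0),
        Component("BookingService", 0.72, 310.0),
        Component("Database", 0.91, 480.0),
    ]
    print(to_svg(prediction))  # save as architecture_map.svg and open in a browser
```

In this toy form, the "map" is just a row of colored boxes; the paper's approach generates such layers for the actual software architecture via model transformations, so that bottlenecks are visible in the architects' familiar structural view.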

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Klaus Krogmann (1)
  • Christian M. Schweda (2)
  • Sabine Buckl (2)
  • Michael Kuperberg (1)
  • Anne Martens (1)
  • Florian Matthes (2)

  1. Software Design and Quality Group, Universität Karlsruhe (TH), Germany
  2. Software Engineering for Business Information Systems, Technische Universität München, Germany
