Model Interoperability for Performance Engineering: Survey of Milestones and Evolution

  • Connie U. Smith
  • Catalina M. Lladó
Chapter
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6821)

Abstract

Next-generation Software Performance Engineering tools will exploit a model interoperability paradigm that selects the performance modeling tool best suited to the software/hardware architecture issues and the life-cycle stage of the assessment. The paradigm allows existing tools to be used to the extent possible, without requiring extensive changes to them, and the performance model solution should be transparent to the user. Significant milestones have been reached in the evolution of this paradigm. This paper covers key results in the areas of Model Interchange Formats, model transformations, tools, specification of experiments and results, and extensions for real-time and component-based systems. It then offers conclusions on next steps and the future of the model interoperability paradigm.
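The core of the interchange paradigm described above is that one tool exports a performance model (e.g., a queueing network) in a common XML format and another tool imports and solves it. The following sketch illustrates that round trip in miniature; the element and attribute names (`QueueingNetworkModel`, `Server`, `Workload`, etc.) are illustrative stand-ins, not the actual PMIF 2.0 schema.

```python
# Illustrative round trip of a queueing network through a PMIF-style
# XML interchange document: one "tool" exports, another imports.
import xml.etree.ElementTree as ET

def export_model(servers, workload):
    """Serialize a tiny queueing network to interchange XML (exporting tool)."""
    root = ET.Element("QueueingNetworkModel")
    for name, rate in servers:
        ET.SubElement(root, "Server", Name=name, ServiceRate=str(rate))
    ET.SubElement(root, "Workload", Name=workload["name"],
                  ArrivalRate=str(workload["arrival_rate"]))
    return ET.tostring(root, encoding="unicode")

def import_model(xml_text):
    """Parse the interchange document back into model objects (importing tool)."""
    root = ET.fromstring(xml_text)
    servers = [(s.get("Name"), float(s.get("ServiceRate")))
               for s in root.findall("Server")]
    wl = root.find("Workload")
    return servers, {"name": wl.get("Name"),
                     "arrival_rate": float(wl.get("ArrivalRate"))}

doc = export_model([("CPU", 100.0), ("Disk", 40.0)],
                   {"name": "web", "arrival_rate": 25.0})
servers, workload = import_model(doc)
```

In the full paradigm, the exporting and importing tools are independent products, the format is validated against a published schema, and the solver's results flow back through a companion results format, so the solution remains transparent to the user.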

Keywords

IEEE Computer Society, Eclipse Modeling Framework, Schema Extension, Interchange Format, Queue Network Model
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

Copyright information

© IFIP International Federation for Information Processing 2011

Authors and Affiliations

  • Connie U. Smith (1)
  • Catalina M. Lladó (2)
  1. Performance Engineering Services, Santa Fe, USA
  2. Departament de Ciències Matemàtiques i Informàtica, Universitat de les Illes Balears, Palma de Mallorca, Spain
