Software & Systems Modeling

Volume 18, Issue 4, pp. 2609–2632

SQME: a framework for modeling and evaluation of software architecture quality attributes

  • Ali Sedaghatbaf
  • Mohammad Abdollahi Azgomi
Regular Paper


Designing a software architecture that satisfies all quality requirements is a difficult task. To determine whether the requirements are met, the quality attributes must be evaluated quantitatively on the architecture model. A good evaluation process should provide proper answers to three questions: (1) how to feed the evaluation results back into the architecture model (i.e., improve the architecture based on the evaluation results), (2) how to analyze uncertainties in the calculations, and (3) how to handle conflicts that may exist among the quality preferences of stakeholders. In this paper, we introduce SQME, a framework for the automatic evaluation of software architecture models. The framework uses evolutionary algorithms for architecture improvement, evidence theory for uncertainty handling, and EV/TOPSIS for making trade-off decisions. To validate the applicability of the framework, a case study is performed, and a software tool is developed to support the evaluation process.
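As a rough illustration of the trade-off step mentioned above, the sketch below ranks candidate architectures with the classic TOPSIS procedure (the framework pairs it with eigenvector-based weighting, which is omitted here). The alternatives, criteria, weights, and scores are invented for the example and are not taken from the paper:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    matrix  -- rows are alternatives, columns are raw criterion scores
    weights -- one weight per criterion (normalized to sum to 1)
    benefit -- True if higher is better for that criterion, False for cost
    Returns a closeness coefficient in [0, 1] per alternative; higher is better.
    """
    m, n = len(matrix), len(matrix[0])
    # 1. Vector-normalize each column, then apply the criterion weights.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    # 2. Ideal-best and ideal-worst value per criterion.
    best = [max(v[i][j] for i in range(m)) if benefit[j]
            else min(v[i][j] for i in range(m)) for j in range(n)]
    worst = [min(v[i][j] for i in range(m)) if benefit[j]
             else max(v[i][j] for i in range(m)) for j in range(n)]
    # 3. Euclidean distance to both ideals, then relative closeness.
    scores = []
    for i in range(m):
        d_best = math.sqrt(sum((v[i][j] - best[j]) ** 2 for j in range(n)))
        d_worst = math.sqrt(sum((v[i][j] - worst[j]) ** 2 for j in range(n)))
        scores.append(d_worst / (d_best + d_worst))
    return scores

# Three hypothetical candidate architectures scored on response time (cost),
# reliability (benefit), and monetary cost (cost).
matrix = [[120.0, 0.98, 40.0],
          [ 90.0, 0.95, 55.0],
          [150.0, 0.99, 30.0]]
scores = topsis(matrix, weights=[0.4, 0.4, 0.2], benefit=[False, True, False])
ranking = sorted(range(len(scores)), key=lambda i: -scores[i])
```

With these illustrative weights the second candidate wins: its response-time advantage outweighs its slightly lower reliability. Swapping in different weights (e.g., from stakeholder preference elicitation) reorders the ranking, which is exactly the conflict-handling role TOPSIS plays in the evaluation loop.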


Software architecture · Software quality attributes · Evolutionary algorithms · Evidence theory · EV/TOPSIS



Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2018

Authors and Affiliations

  1. Trustworthy Computing Laboratory, School of Computer Engineering, Iran University of Science and Technology, Tehran, Iran
  2. School of Computer Engineering, Iran University of Science and Technology, Tehran, Iran
