Abstract
Microservices have emerged as an architectural style for developing distributed applications. Assessing the performance of architectural deployment alternatives is challenging and must be aligned with how the system is used in the production environment. In this paper, we introduce an approach that uses operational profiles to generate load tests that automatically assess scalability pass/fail criteria for several microservice deployment alternatives. We have evaluated our approach on different architectural deployment alternatives through extensive lab studies in a large bare-metal host environment and in a virtualized environment. The data presented in this paper support the need to carefully evaluate the impact of increasing the level of computing resources on performance. Specifically, for the case study presented in this paper, we observed that the evaluated performance metric is a non-increasing function of the number of CPUs for one of the environments under study.
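The core idea sketched in the abstract (sampling a request mix from an operational profile and checking a pass/fail scalability criterion against measured response times) can be illustrated with a minimal Python sketch. The operation names, their frequencies, and the "95% of requests under 500 ms" criterion below are illustrative assumptions, not the paper's actual configuration.

```python
import random

# Hypothetical operational profile: each operation's relative frequency
# as observed in production (names and weights are illustrative).
OPERATIONAL_PROFILE = {
    "browse_catalog": 0.55,
    "add_to_cart": 0.25,
    "checkout": 0.15,
    "admin_report": 0.05,
}

def generate_load_test(profile, n_requests, seed=42):
    """Sample a sequence of operations according to the operational profile."""
    rng = random.Random(seed)
    ops, weights = zip(*profile.items())
    return rng.choices(ops, weights=weights, k=n_requests)

def passes_scalability_criterion(response_times_ms,
                                 threshold_ms=500,
                                 required_fraction=0.95):
    """Pass/fail check: at least `required_fraction` of requests must
    complete within `threshold_ms` (an illustrative criterion)."""
    ok = sum(1 for t in response_times_ms if t <= threshold_ms)
    return ok / len(response_times_ms) >= required_fraction

workload = generate_load_test(OPERATIONAL_PROFILE, n_requests=1000)
```

In practice, the generated workload would be replayed against each deployment alternative (e.g., different CPU allocations per container), and the criterion evaluated on the measured response times of each run.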
Acknowledgements
This work has been partly supported by EsulabSolutions, Inc., the German Federal Ministry of Education and Research (grant no. 01IS17010, ContinuITy), German Research Foundation (HO 5721/1-1, DECLARE), the GAUSS national research project, which has been funded by the MIUR under the PRIN 2015 program (Contract 2015KWREMX), and by the Swiss National Science Foundation (project no. 178653). The authors would like to thank the HPI Future SOC Lab (period fall 2017) for providing the infrastructure.
Copyright information
© 2018 Springer Nature Switzerland AG
About this paper
Cite this paper
Avritzer, A., Ferme, V., Janes, A., Russo, B., Schulz, H., van Hoorn, A. (2018). A Quantitative Approach for the Assessment of Microservice Architecture Deployment Alternatives by Automated Performance Testing. In: Cuesta, C., Garlan, D., Pérez, J. (eds) Software Architecture. ECSA 2018. Lecture Notes in Computer Science(), vol 11048. Springer, Cham. https://doi.org/10.1007/978-3-030-00761-4_11
Print ISBN: 978-3-030-00760-7
Online ISBN: 978-3-030-00761-4