Abstract
Current research on performance awareness evaluates approaches primarily for their functional correctness but does not assess to what extent developers are supported in improving software implementations. This article presents the evaluation of an existing approach for supporting developers of Java Enterprise Edition (EE) applications with response time estimations, based on a controlled human-oriented experiment. The main goal of the experiment is to quantify the effectiveness of employing the approach while optimizing the response time of an implementation. Subjects’ optimizations are quantified by the number of fixed performance bugs. Having employed the approach, subjects fixed on average over three times more performance bugs. The results further indicate that in the absence of a performance awareness aid, the success of optimizing a previously unknown implementation is far more dependent on the behavior and skill level of the developer.
Copyright information
© 2018 Springer Nature Switzerland AG
Cite this paper
Danciu, A., Krcmar, H. (2018). To What Extent Does Performance Awareness Support Developers in Fixing Performance Bugs? In: Bakhshi, R., Ballarini, P., Barbot, B., Castel-Taleb, H., Remke, A. (eds.) Computer Performance Engineering. EPEW 2018. Lecture Notes in Computer Science, vol. 11178. Springer, Cham. https://doi.org/10.1007/978-3-030-02227-3_2
DOI: https://doi.org/10.1007/978-3-030-02227-3_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-02226-6
Online ISBN: 978-3-030-02227-3
eBook Packages: Computer Science, Computer Science (R0)