Empirical Software Engineering, Volume 20, Issue 2, pp 336–373

Understanding the impact of rapid releases on software quality: the case of Firefox

  • Foutse Khomh
  • Bram Adams
  • Tejinder Dhaliwal
  • Ying Zou


Many software companies are shifting from the traditional multi-month release cycle to shorter release cycles. For example, Google Chrome and Mozilla Firefox release new versions every six weeks. These shorter release cycles reduce users' waiting time for a new release and offer companies better feedback and marketing opportunities, but it is unclear whether the quality of the software product improves as well, since developers and testers are under more pressure. In this paper, we extend our previous empirical study of Mozilla Firefox on the impact of rapid releases on quality assurance with feedback from Mozilla project members. The study compares crash rates, median uptime, and the proportion of pre- and post-release bugs in traditional releases with those in rapid releases, and analyzes the source code changes made by developers to identify potential changes in the development process. We found that (1) with shorter release cycles, users do not experience significantly more pre- or post-release bugs (percentage-wise), (2) bugs are fixed faster, yet (3) users experience these bugs earlier during software execution (the program crashes earlier). Increased integration activity and the propagation of harder bugs to later versions account for some of these findings. Overall, our case study suggests that a clear release engineering process with thorough automation is one of the major challenges when switching to rapid releases.
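As an illustration of the kind of comparison the abstract describes, the sketch below contrasts median uptime (time until a crash) between a traditional-release and a rapid-release sample using a Mann-Whitney U statistic, a standard nonparametric test for such comparisons. This is not the authors' code, and the uptime values are invented for demonstration only.

```python
# Illustrative sketch (not the paper's analysis code): comparing uptime
# samples from two release models with a pure-Python Mann-Whitney U statistic.

def mann_whitney_u(xs, ys):
    """U statistic for sample xs vs. ys: count of pairs where x > y, ties counted as 0.5."""
    u = 0.0
    for x in xs:
        for y in ys:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

def median(values):
    """Median of a list of numbers."""
    s = sorted(values)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

# Hypothetical uptimes (minutes until crash) per release model.
traditional = [90, 120, 45, 200, 150, 80]
rapid = [30, 60, 25, 90, 40, 55]

print("median uptime (traditional):", median(traditional))  # 105.0
print("median uptime (rapid):", median(rapid))              # 47.5
print("U statistic:", mann_whitney_u(traditional, rapid))   # 31.5
```

A lower median uptime in the rapid-release sample would mirror the paper's finding (3) that users hit bugs earlier during execution; a full analysis would convert U into a p-value and effect size.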


Keywords: Software release · Release cycle · Software quality · Testing · Bugs



Acknowledgments

We would like to thank the two Mozilla QA engineers who provided feedback on our findings. Their statements are accounts of personal experience and opinion, and are in no way an official statement from Mozilla.


Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

  • Foutse Khomh (1)
  • Bram Adams (2)
  • Tejinder Dhaliwal (3)
  • Ying Zou (3)
  1. SWAT, Polytechnique Montréal, Québec, Canada
  2. MCIS, Polytechnique Montréal, Québec, Canada
  3. Department of Electrical and Computer Engineering, Queen's University, Kingston, Canada