
Journal of Computer Science and Technology, Volume 34, Issue 2, pp. 416–436

Decomposing Composite Changes for Code Review and Regression Test Selection in Evolving Software

  • Bo Guo
  • Young-Woo Kwon
  • Myoungkyu Song
Regular Paper

Abstract

Inspecting and testing code changes typically requires significant developer effort. As a system evolves, developers often create composite changes that mix multiple development issues, rather than addressing one independent issue in an atomic change. Inspecting composite changes is often time-consuming and error-prone, and rerunning the entire regression test suite to exercise the unrelated edits in a composite change may require excessive time. To address this problem, we present ChgCutter, an interactive change-decomposition technique that supports code review and regression test selection. When a developer selects a code change within a diff patch, ChgCutter partitions the composite change into a set of related atomic changes, each of which is more cohesive and self-contained with respect to the issue being addressed. To support inspection of a composite change, it uses program dependence relationships to generate an intermediate program version that includes only a subset of related changes. To reduce the cost of regression testing, it safely selects only the tests affected by the changes in an intermediate version. In the evaluation, we apply ChgCutter to 28 composite changes in four open source projects. ChgCutter partitions these changes with 95.7% accuracy and selects affected tests with 89.0% accuracy. We also conduct a user study with professional software engineers at PayPal and find that ChgCutter helps developers understand and validate composite changes while scaling to industry projects.
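To make the idea in the abstract concrete, the sketch below illustrates (in a heavily simplified, hypothetical form that is not ChgCutter's actual implementation) how edited statements of a composite change might be grouped into candidate atomic changes by clustering dependence-connected edits, and how only the regression tests whose coverage touches one group would then be selected. All names, edit identifiers, and coverage data here are invented for illustration.

```java
import java.util.*;

/**
 * Hypothetical sketch of change decomposition and affected-test selection:
 * edits connected by dependence edges are clustered with union-find, and a
 * test is selected only if its coverage overlaps the chosen edit group.
 */
public class ChangePartitionSketch {

    /** Union-find over edit identifiers, used to cluster dependence-connected edits. */
    static final class UnionFind {
        private final Map<String, String> parent = new HashMap<>();

        String find(String x) {
            parent.putIfAbsent(x, x);
            String root = parent.get(x);
            if (!root.equals(x)) {
                root = find(root);
                parent.put(x, root); // path compression
            }
            return root;
        }

        void union(String a, String b) {
            parent.put(find(a), find(b));
        }
    }

    /** Group edits that are connected by data/control dependence edges. */
    static Collection<Set<String>> partition(Set<String> edits, List<String[]> dependenceEdges) {
        UnionFind uf = new UnionFind();
        edits.forEach(uf::find); // register every edit as its own group
        for (String[] edge : dependenceEdges) {
            if (edits.contains(edge[0]) && edits.contains(edge[1])) {
                uf.union(edge[0], edge[1]); // dependence-connected edits belong together
            }
        }
        Map<String, Set<String>> groups = new HashMap<>();
        for (String e : edits) {
            groups.computeIfAbsent(uf.find(e), k -> new TreeSet<>()).add(e);
        }
        return groups.values();
    }

    /** Select tests whose recorded coverage overlaps the chosen group of edits. */
    static Set<String> selectAffectedTests(Set<String> group, Map<String, Set<String>> testCoverage) {
        Set<String> selected = new TreeSet<>();
        testCoverage.forEach((test, covered) -> {
            if (!Collections.disjoint(covered, group)) {
                selected.add(test);
            }
        });
        return selected;
    }

    public static void main(String[] args) {
        // Hypothetical composite change: four edited statements from two unrelated concerns.
        Set<String> edits = new TreeSet<>(Arrays.asList("s1", "s2", "s3", "s4"));
        // Dependence edges among edited statements (e.g., s2 uses a variable defined at s1).
        List<String[]> deps = Arrays.asList(new String[]{"s1", "s2"}, new String[]{"s3", "s4"});
        // Per-test statement coverage from a prior run (also invented).
        Map<String, Set<String>> coverage = new HashMap<>();
        coverage.put("testFoo", new HashSet<>(Arrays.asList("s1", "s2")));
        coverage.put("testBar", new HashSet<>(Collections.singletonList("s4")));
        coverage.put("testBaz", new HashSet<>(Collections.singletonList("s9")));

        for (Set<String> group : partition(edits, deps)) {
            System.out.println("atomic change " + group
                    + " -> affected tests " + selectAffectedTests(group, coverage));
        }
    }
}
```

Running this sketch splits the four edits into two groups, {s1, s2} and {s3, s4}, and selects only testFoo for the first group and only testBar for the second, mirroring at a toy scale the partitioning and safe test selection the paper describes.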

Keywords

software maintenance and evolution; code review; test selection; program slicing


Supplementary material

ESM 1: 11390_2019_1917_MOESM1_ESM.pdf (PDF, 431 kB)


Copyright information

© Springer Science+Business Media, LLC & Science Press, China 2019

Authors and Affiliations

  1. Department of Global Operations, PayPal, Omaha, U.S.A.
  2. School of Computer Science and Engineering, Kyungpook National University, Daegu, South Korea
  3. Department of Computer Science, University of Nebraska at Omaha, Omaha, U.S.A.
