Behavior-Derived Variability Analysis: Mining Views for Comparison and Evaluation

  • Iris Reinhartz-Berger
  • Ilan Shimshoni
  • Aviva Abdal
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11483)

Abstract

The large variety of computerized solutions (software and information systems) calls for a systematic approach to their comparison and evaluation. Different methods have been proposed over the years for analyzing the similarity and variability of systems. These methods take as input artifacts, such as requirements, design models, or code, of different systems (commonly in the same domain), identify and quantify their similarities, and represent the variability in models such as feature diagrams. Most methods rely on implementation considerations of the input systems and generate outcomes based on predefined, fixed comparison strategies (referred to as variability views). In this paper, we introduce an approach for mining views that are relevant for comparison and evaluation directly from the input artifacts. In particular, we equip SOVA – a Semantic and Ontological Variability Analysis method – with data mining techniques in order to identify views that highlight the variability or similarity of the input artifacts (natural language requirement documents). The views are compared using entropy and Rand index measures. The method and its outcomes are evaluated on a case of three photo sharing applications.
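As a concrete illustration of the two comparison measures named above, the following minimal sketch (not the SOVA implementation; the requirement groupings and names are hypothetical) compares two candidate variability views, each represented as a cluster assignment of requirement statements, using the Rand index, and computes the Shannon entropy of a single view.

    # Illustrative sketch only: Rand index between two groupings of
    # requirement statements, and entropy of one grouping's cluster sizes.
    from collections import Counter
    from itertools import combinations
    from math import log2

    def rand_index(labels_a, labels_b):
        """Fraction of statement pairs on which the two groupings agree
        (both place the pair in the same cluster, or both separate it)."""
        pairs = list(combinations(range(len(labels_a)), 2))
        agreements = sum(
            (labels_a[i] == labels_a[j]) == (labels_b[i] == labels_b[j])
            for i, j in pairs
        )
        return agreements / len(pairs)

    def entropy(labels):
        """Shannon entropy (in bits) of the cluster-size distribution."""
        counts = Counter(labels)
        n = len(labels)
        return -sum((c / n) * log2(c / n) for c in counts.values())

    # Hypothetical cluster assignments of six requirement statements
    # under two different variability views.
    view_1 = [0, 0, 1, 1, 2, 2]
    view_2 = [0, 0, 1, 2, 2, 2]
    print(rand_index(view_1, view_2))  # agreement between the two views
    print(entropy(view_1))             # spread of clusters within view 1

A Rand index close to 1 indicates that the two views group the requirements in nearly the same way, while higher entropy indicates that a view spreads the requirements more evenly across its clusters.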

Keywords

Software Product Line Engineering · Variability analysis · Requirements specifications · Feature diagrams

References

  1. Assunção, W.K., Lopez-Herrejon, R.E., Linsbauer, L., Vergilio, S.R., Egyed, A.: Reengineering legacy applications into software product lines: a systematic mapping. Empirical Softw. Eng. 22(6), 2972–3016 (2017)
  2. Bakar, N.H., Kasirun, Z.M., Salleh, N., Jalab, H.A.: Extracting features from online software reviews to aid requirements reuse. Appl. Soft Comput. 49, 1297 (2016)
  3. Bakar, N.H., Kasirun, Z.M., Salleh, N., Jalab, H.A.: Feature extraction approaches from natural language requirements for reuse in software product lines: a systematic literature review. J. Syst. Softw. 106, 132–149 (2015)
  4. Ben Nasr, S., et al.: Automated extraction of product comparison matrices from informal product descriptions. J. Syst. Softw. 124, 82–103 (2017)
  5. Berger, T., et al.: A survey of variability modeling in industrial practice. In: Proceedings of the Seventh International Workshop on Variability Modelling of Software-intensive Systems, p. 7. ACM, January 2013
  6. Bonin, F., Dell’Orletta, F., Montemagni, S., Venturi, G.: A contrastive approach to multi-word extraction from domain-specific corpora. In: Proceedings of the Seventh Conference on International Language Resources and Evaluation (LREC 2010) (2010)
  7. Clements, P., Northrop, L.: Software Product Lines: Practices and Patterns, vol. 3. Addison-Wesley, Reading (2002)
  8. Davril, J.M., Delfosse, E., Hariri, N., Acher, M., Cleland-Huang, J., Heymans, P.: Feature model extraction from large collections of informal product descriptions. In: Proceedings of the 2013 9th Joint Meeting on Foundations of Software Engineering, pp. 290–300. ACM, August 2013
  9. Deerwester, S., Dumais, S.T., Furnas, G.W., Landauer, T.K., Harshman, R.: Indexing by latent semantic analysis. J. Am. Soc. Inf. Sci. 41(6), 391–407 (1990)
  10. Ferrari, A., Spagnolo, G.O., Dell’Orletta, F.: Mining commonalities and variabilities from natural language documents. In: Proceedings of the 17th International Software Product Line Conference, pp. 116–120. ACM, August 2013
  11. Itzik, N., Reinhartz-Berger, I.: SOVA – a tool for semantic and ontological variability analysis. In: CAiSE (Forum/Doctoral Consortium), pp. 177–184 (2014)
  12. Itzik, N., Reinhartz-Berger, I.: Generating feature models from requirements: structural vs. functional perspectives. In: Proceedings of the 18th International Software Product Line Conference: Companion, Workshops, Demonstrations and Tools, vol. 2, pp. 44–51. ACM, September 2014
  13. Itzik, N., Reinhartz-Berger, I., Wand, Y.: Variability analysis of requirements: considering behavioral differences and reflecting stakeholders’ perspectives. IEEE Trans. Softw. Eng. 42(7), 687–706 (2016)
  14. Kang, K.C., Cohen, S.G., Hess, J.A., Novak, W.E., Peterson, A.S.: Feature-oriented domain analysis (FODA) feasibility study (No. CMU/SEI-90-TR-21). Software Engineering Institute, Carnegie-Mellon University, Pittsburgh, PA (1990)
  15. Martinez, J., Ziadi, T., Bissyandé, T.F., Klein, J., Le Traon, Y.: Bottom-up adoption of software product lines: a generic and extensible approach. In: Proceedings of the 19th International Conference on Software Product Line, pp. 101–110. ACM, July 2015
  16. Mihalcea, R., Corley, C., Strapparava, C.: Corpus-based and knowledge-based measures of text semantic similarity. In: AAAI, vol. 6, pp. 775–780, July 2006
  17. Niu, N., Savolainen, J., Niu, Z., Jin, M., Cheng, J.R.C.: A systems approach to product line requirements reuse. IEEE Syst. J. 8(3), 827–836 (2014)
  18. OMG: The Requirements Interchange Format Specification – Version 1.2. https://www.omg.org/spec/ReqIF/
  19. Pohl, K., Böckle, G., van der Linden, F.J.: Software Product Line Engineering: Foundations, Principles and Techniques. Springer, Heidelberg (2005). https://doi.org/10.1007/3-540-28901-1
  20. Rand, W.M.: Objective criteria for the evaluation of clustering methods. J. Am. Stat. Assoc. 66(336), 846–850 (1971)
  21. Reinhartz-Berger, I., Itzik, N., Wand, Y.: Analyzing variability of software product lines using semantic and ontological considerations. In: Jarke, M., et al. (eds.) CAiSE 2014. LNCS, vol. 8484, pp. 150–164. Springer, Cham (2014). https://doi.org/10.1007/978-3-319-07881-6_11
  22. Steinbach, M., Karypis, G., Kumar, V.: A comparison of document clustering techniques. In: KDD Workshop on Text Mining, vol. 400, no. 1, pp. 525–526, August 2000
  23. Steinley, D.: Properties of the Hubert-Arabie adjusted Rand index. Psychol. Methods 9(3), 386 (2004)
  24. Wu, Z., Palmer, M.: Verb semantics and lexical selection. In: Proceedings of the 32nd Annual Meeting on Association for Computational Linguistics, pp. 133–138. Association for Computational Linguistics, June 1994

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Iris Reinhartz-Berger (1)
  • Ilan Shimshoni (1)
  • Aviva Abdal (1)

  1. Department of Information Systems, University of Haifa, Haifa, Israel