
On the Impact of the Model-Based Representation of Inconsistencies to Manual Reviews

Results from a Controlled Experiment
  • Marian Daun
  • Jennifer Brings
  • Thorsten Weyer
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10650)

Abstract

To ensure that stakeholder wishes are fulfilled, it is crucial to validate the documented requirements. This is often complicated by the fact that the wishes and intentions of different stakeholders contradict each other to some degree, which manifests itself in inconsistent requirements. To aid requirements engineers in identifying and resolving inconsistent requirements, we investigated the usefulness of two different model-based representation formats for inconsistent requirements in manual reviews: one that represents the inconsistent requirements in separate diagrams and one that integrates them into a single diagram using annotations. The results from a controlled experiment show that the use of such integrated review diagrams can significantly increase the efficiency of manual reviews without sacrificing effectiveness.

Keywords

Requirements validation · Inconsistencies · Controlled experiment

Notes

Acknowledgment

This research was partly funded by the German Federal Ministry of Education and Research (grant nos. 01IS16043V and 01IS12005C). We thank Stefan Beck and Arnaud Boyer (Airbus Defence and Space), Jens Höfflinger (Bosch), and Karsten Albers (INCHRON) for their support in adapting industrial specifications for use as experiment material.


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. paluno – The Ruhr Institute for Software Technology, University of Duisburg-Essen, Essen, Germany
