On the Impact of the Model-Based Representation of Inconsistencies to Manual Reviews
Validating the documented requirements is crucial to ensuring that stakeholder wishes are fulfilled. This is often complicated by the fact that the wishes and intentions of different stakeholders contradict one another, which manifests itself in inconsistent requirements. To aid requirements engineers in identifying and resolving inconsistent requirements, we investigated the usefulness for manual reviews of two model-based representation formats for inconsistent requirements: one that represents the inconsistent requirements in separate diagrams and one that integrates them into a single diagram using annotations. The results of a controlled experiment show that the use of such integrated review diagrams can significantly increase the efficiency of manual reviews without sacrificing effectiveness.
Keywords: Requirements validation · Inconsistencies · Controlled experiment
This research was partly funded by the German Federal Ministry of Education and Research (grant no. 01IS16043V and grant no. 01IS12005C). We thank Stefan Beck and Arnaud Boyer (Airbus Defence and Space), Jens Höfflinger (Bosch), and Karsten Albers (inchron) for their support regarding the adoption of industrial specifications to fit as experiment material.