
Expert Sourcing to Support the Identification of Model Elements in System Descriptions

  • Conference paper

Part of the book series: Lecture Notes in Business Information Processing (LNBIP, volume 302)

Abstract

Context. Expert sourcing is a novel approach to supporting quality assurance: it applies methods and tooling from crowdsourcing research to split model quality assurance tasks and parallelize task execution across several expert users. Typical quality assurance tasks focus on checking an inspection object, e.g., a model, against a reference document, e.g., a requirements specification, that is considered correct. For example, given a text-based system description and a corresponding model such as an Extended Entity-Relationship (EER) diagram, experts are guided towards inspecting the model based on so-called expected model elements (EMEs). EMEs are entities, attributes, and relations that appear in the text and are reflected in the corresponding model. In common inspection tasks, EMEs are not explicitly expressed but only implicitly available via textual descriptions. A key improvement is therefore to make EMEs explicit by using crowdsourcing mechanisms to drive model quality assurance among experts.

Objective and Method. In this paper, we investigate the effectiveness of identifying EMEs through expert sourcing. To that end, we perform a feasibility study in which we compare EMEs identified through expert sourcing with EMEs provided by a task owner who has deep knowledge of the entire system specification text.

Conclusions. Results of the data analysis show that the effectiveness of crowdsourcing-style EME acquisition is influenced by the complexity of the EMEs: entity EMEs can be harvested with high recall and precision, whereas the lexical and semantic variations of attribute EMEs hamper their automatic aggregation and consensus building (these EMEs are harvested with high precision but limited recall). Based on these lessons learned, we propose a new task design for expert sourcing EMEs.
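The evaluation described above can be sketched as a set comparison: expert-sourced EMEs are matched against the task owner's gold-standard EMEs, and precision and recall are computed over the match. The following is a minimal illustrative sketch; the function name, the library-domain entity names, and the exact-match assumption are hypothetical, not taken from the study (the paper notes that lexical and semantic variation makes exact matching inadequate for attribute EMEs).

```python
def precision_recall(harvested: set[str], gold: set[str]) -> tuple[float, float]:
    """Precision: fraction of harvested EMEs that are in the gold standard.
    Recall: fraction of gold-standard EMEs that were harvested."""
    true_positives = len(harvested & gold)
    precision = true_positives / len(harvested) if harvested else 0.0
    recall = true_positives / len(gold) if gold else 0.0
    return precision, recall

# Illustrative entity EMEs for a hypothetical library domain model
gold_entities = {"Book", "Author", "Member", "Loan"}
harvested_entities = {"Book", "Author", "Member", "Publisher"}

p, r = precision_recall(harvested_entities, gold_entities)
print(f"precision={p:.2f}, recall={r:.2f}")  # precision=0.75, recall=0.75
```

For attribute EMEs, the exact string match used here would have to be replaced by a normalization or semantic-similarity step, which is precisely where the study reports aggregation and consensus difficulties.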




Acknowledgments

We would like to thank the participants of the software quality course at Vienna University of Technology in the winter term 2016/2017 for participating in the study.

Author information


Correspondence to Marta Sabou.


Copyright information

© 2018 Springer International Publishing AG

About this paper


Cite this paper

Sabou, M., Winkler, D., Petrovic, S. (2018). Expert Sourcing to Support the Identification of Model Elements in System Descriptions. In: Winkler, D., Biffl, S., Bergsmann, J. (eds) Software Quality: Methods and Tools for Better Software and Systems. SWQD 2018. Lecture Notes in Business Information Processing, vol 302. Springer, Cham. https://doi.org/10.1007/978-3-319-71440-0_5


  • DOI: https://doi.org/10.1007/978-3-319-71440-0_5


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-71439-4

  • Online ISBN: 978-3-319-71440-0

  • eBook Packages: Computer Science, Computer Science (R0)
