Development of a Web Based Framework to Objectively Compare and Evaluate Software Solutions

Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 787)


We created a web-based framework to objectively evaluate different software solutions that take different approaches to the same problem. Solutions can be single algorithms, software modules, or entire programs. The evaluation strongly relies on the human component. In this paper we describe the conceptual structure of the developed framework and evaluate its first field test, using the implementation of two distinct georouting algorithms and their validation with test users. The tested software requires no modification, as the state of each software component is saved and synchronized by the framework. The aim of investigating our implementation is to gather usage data, which allows us to derive proposals for improvements in usability and user experience, to identify properties that increase user productivity, and to find bottlenecks. The framework is validated by assessing the collected metrics and the user feedback produced by the test, as distinctions in usage, user experience, and usability between the two test algorithms were identified.
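The paper does not publish its data model, but the usage-data collection described above can be illustrated with a minimal sketch. The event shape, the `UsageLog` class, and all field names below are illustrative assumptions, not the authors' implementation; the sketch only shows the kind of per-solution aggregation that allows two competing algorithms to be compared from logged interactions.

```typescript
// Hypothetical sketch of usage-metric collection; all names are assumed.
type UsageEvent = {
  userId: string;    // anonymised test-user identifier (assumption)
  solution: string;  // which tested solution handled the interaction
  action: string;    // e.g. "route-request", "map-pan"
  timestamp: number; // milliseconds since epoch
};

class UsageLog {
  private events: UsageEvent[] = [];

  // Record one user interaction with one of the tested solutions.
  record(userId: string, solution: string, action: string): void {
    this.events.push({ userId, solution, action, timestamp: Date.now() });
  }

  // Aggregate interaction counts per solution: a simple example of the
  // kind of metric that lets two test algorithms be compared objectively.
  countsBySolution(): Map<string, number> {
    const counts = new Map<string, number>();
    for (const e of this.events) {
      counts.set(e.solution, (counts.get(e.solution) ?? 0) + 1);
    }
    return counts;
  }
}
```

In practice such a log would be persisted server-side alongside the synchronized component state, so that usage can be analysed after the field test without modifying the tested software.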


Algorithm comparison · Evaluation framework · Human factors engineering · Systems engineering · User centered · Web technologies



Copyright information

© Springer International Publishing AG, part of Springer Nature 2019

Authors and Affiliations

Human Factors Engineering, Bergische Universität Wuppertal, Wuppertal, Germany
