Eliminating Over-Confidence in Software Development Effort Estimates
Previous studies show that software development projects strongly underestimate the uncertainty of their effort estimates. This overconfidence in estimation accuracy may lead to poor project planning and execution. In this paper, we investigate whether the use of estimation error information from previous projects improves the realism of uncertainty assessments. As far as we know, there have been no empirical software studies on this topic before. Nineteen realistically composed estimation teams provided minimum–maximum effort intervals for the same software project. Ten of the teams (Group A) received no instructions about how to complete the uncertainty assessment process. The remaining nine teams (Group B) were instructed to apply a history-based uncertainty assessment process. The main result is that software professionals seem willing to consider the error of previous effort estimates as relevant information when assessing the minimum effort of a new project, but much less so when assessing the maximum effort.
Keywords: Actual Effort, Maximum Effort, Minimum Effort, Effort Estimate, Similar Project