
Radiological Physics and Technology, Volume 11, Issue 4, pp 406–414

Verification of modified receiver-operating characteristic software using simulated rating data

  • Junji Shiraishi
  • Daisuke Fukuoka
  • Reimi Iha
  • Haruka Inada
  • Rie Tanaka
  • Takeshi Hara

Abstract

ROCKIT, a receiver-operating characteristic (ROC) curve-fitting software package, was developed by Metz et al. in the early 1990s and remains one of the most frequently used ROC software packages worldwide. In addition to ROCKIT, the DBM-MRMC software was developed for multi-reader multi-case analysis of differences in the average area under the ROC curve (AUC). Because this older software cannot run on a PC with Windows 7 or a more recent operating system, we developed new software that employs the same basic algorithms with minor modifications. In this study, we verified our modified software by testing for differences in the estimated indices of diagnostic accuracy using simulated rating data. In our simulation model, all data were generated from target AUCs and a binormal parameter b. In ROC curve fitting with simulated rating data, we varied four factors: the total number of case samples, the ratio of positive to negative cases, the binormal parameter b, and the preset AUC. To investigate differences between the statistical test results of our software and those of the existing software, we generated simulated rating data sets with three levels of case difficulty and three degrees of difference between the AUCs of two modalities. In the simulations, the AUCs estimated by the new and existing software were highly correlated (R > 0.98), and the statistical test results showed high agreement (85% or more). We therefore conclude that our modified software is as capable as the existing software.
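As a rough illustration of the simulation model described above (a minimal sketch, not the authors' actual software), the following code generates continuous rating data under the conventional binormal model, in which negative cases follow N(0, 1) and positive cases follow N(a/b, 1/b), with the intercept a chosen so that the population AUC equals a preset target via AUC = Φ(a / √(1 + b²)). The function names `simulate_ratings` and `empirical_auc` are illustrative, not taken from the software under discussion.

```python
import random
from statistics import NormalDist


def simulate_ratings(target_auc, b, n_neg, n_pos, seed=0):
    """Generate continuous rating data under the conventional binormal model.

    Negatives ~ N(0, 1); positives ~ N(a/b, 1/b), where the intercept a
    is set so the population AUC equals target_auc:
        AUC = Phi(a / sqrt(1 + b^2))  =>  a = Phi^-1(AUC) * sqrt(1 + b^2)
    """
    rng = random.Random(seed)
    a = NormalDist().inv_cdf(target_auc) * (1.0 + b * b) ** 0.5
    neg = [rng.gauss(0.0, 1.0) for _ in range(n_neg)]
    pos = [rng.gauss(a / b, 1.0 / b) for _ in range(n_pos)]
    return neg, pos


def empirical_auc(neg, pos):
    """Nonparametric (Mann-Whitney) AUC estimate from rank sums."""
    scores = [(x, 0) for x in neg] + [(x, 1) for x in pos]
    scores.sort()
    # 1-indexed ranks of the positive cases in the combined ordering
    rank_sum = sum(i + 1 for i, (_, label) in enumerate(scores) if label == 1)
    n_pos, n_neg = len(pos), len(neg)
    return (rank_sum - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
```

With large samples, the empirical AUC converges to the preset target, which is how simulated data sets with known ground-truth accuracy can be produced for verifying curve-fitting software; the study's parametric (maximum-likelihood) fitting itself is a separate step not shown here.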

Keywords

Receiver-operating characteristic (ROC) analysis · Observer study · Computer software · Simulation data · Binormal distribution · Multi-reader multi-case

Notes

Acknowledgements

We gratefully acknowledge the support of a Japanese Society of Radiological Technology (JSRT) research grant (2016 and 2017). This work was also partially supported by JSPS KAKENHI Grant number 15K09898.

Compliance with ethical standards

Ethical approval

This article does not contain any studies with human participants, and thus no informed consent was obtained from any individuals. In addition, this article does not contain any studies with animals.

Conflict of interest

The authors declare that they have no conflict of interest regarding this article.

References

  1. Green DM, Swets JA. Signal detection theory and psychophysics. New York: Wiley; 1966 (reprinted with updated topical bibliographies by Krieger, New York, 1974).
  2. Metz CE, Herman BA, Shen J-H. Maximum likelihood estimation of receiver operating characteristic (ROC) curves from continuously-distributed data. Stat Med. 1998;17:1033–53.
  3. Lusted LB. Logical analysis in Roentgen diagnosis. Radiology. 1960;74:178–93.
  4. Lusted LB. Introduction to medical decision making. Springfield: Charles C Thomas; 1968.
  5. Swets JA. Measuring the accuracy of diagnostic systems. Science. 1988;240:1285–93.
  6. Goodenough DJ, Rossmann K, Lusted LB. Radiographic applications of receiver operating characteristic (ROC) curves. Radiology. 1974;110:89–95.
  7. Metz CE. ROC methodology in radiologic imaging. Invest Radiol. 1986;21:720–33.
  8. Metz CE. Basic principles of ROC analysis. Semin Nucl Med. 1978;8:283–98.
  9. Obuchowski NA. Receiver operating characteristic curves and their use in radiology. Radiology. 2003;229:3–8.
  10. ICRU Report 79. Receiver operating characteristic analysis in medical imaging. J ICRU. 2008;8(1). Oxford: Oxford University Press.
  11. Metz CE. ROC analysis in medical imaging: a tutorial review of the literature. Radiol Phys Technol. 2008;1:2–12.
  12. Shiraishi J, Pesce L, Metz CE, Doi K. Experimental design and data analysis in receiver operating characteristic studies: lessons learned from reports in Radiology from 1997 to 2006. Radiology. 2009;253:822–30.
  13. Dorfman DD, Alf E. Maximum likelihood estimation of parameters of signal detection theory and determination of confidence intervals—rating method data. J Math Psychol. 1969;6:487–96.
  14. Metz CE, Pan X. "Proper" binormal ROC curves: theory and maximum-likelihood estimation. J Math Psychol. 1999;43(1):1–33.
  15. Dorfman DD, Berbaum KS, Metz CE. Receiver operating characteristic rating analysis: generalization to the population of readers and patients with the jackknife method. Invest Radiol. 1992;27:723–31.
  16. Metz CE, Roe CA. Dorfman-Berbaum-Metz method for statistical analysis of multireader, multimodality receiver operating characteristic data: validation with computer simulation. Acad Radiol. 1997;4(4):298–303.
  17. Shiraishi J, Fukuoka D, Hara T, Abe H. Basic concepts and development of an all-purpose computer interface for ROC/FROC observer study. Radiol Phys Technol. 2013;6(1):35–41.
  18. Waldrop MM. More than Moore. Nature. 2016;530:144–7.
  19. Metz CE. Receiver operating characteristic analysis: a tool for the quantitative evaluation of observer performance and imaging systems. J Am Coll Radiol. 2006;3:413–22.
  20. Dorfman DD, Berbaum KS, Metz CE, Lenth RV, Hanley JA, Dagga HA. Proper receiver operating characteristic analysis: the Bigamma model. Acad Radiol. 1996;4:138–49.
  21. Roe CA, Metz CE. Dorfman-Berbaum-Metz method for statistical analysis of multireader, multimodality receiver operating characteristic data: validation with computer simulation. Acad Radiol. 1997;4:298–303.
  22. Roe CA, Metz CE. Variance-component modeling in the analysis of receiver operating characteristic index estimates. Acad Radiol. 1997;4:587–600.
  23. Pan X, Metz CE. The "Proper" binormal model: parametric receiver operating characteristic curve estimation with degenerate data. Acad Radiol. 1997;4:380–9.
  24. Wagner RF, Beiden SV, Metz CE. Continuous versus categorical data for ROC analysis: some quantitative considerations. Acad Radiol. 2001;8(4):328–34.
  25. Pesce LL, Horsch K, Drukker K, Metz CE. Semiparametric estimation of the relationship between ROC operating points and the test-result scale: application to the proper binormal model. Acad Radiol. 2011;18:1537–48.
  26. Hillis SL, Berbaum KS. Monte Carlo validation of the Dorfman-Berbaum-Metz method using normalized pseudovalues and less data-based model simplification. Acad Radiol. 2005;12:1534–41.
  27. Hillis SL, Berbaum KS, Metz CE. Recent developments in the Dorfman-Berbaum-Metz procedure for multireader ROC study analysis. Acad Radiol. 2008;15:647–61.
  28. Shiraishi J, Katsuragawa S, Ikezoe J, Matsumoto T, Kobayashi T, Komatsu K, Matsui M, Fujita H, Kodera Y, Doi K. Development of a digital image database for chest radiographs with and without a lung nodule: receiver operating characteristic analysis of radiologists' detection of pulmonary nodules. AJR. 2000;174:71–4.

Copyright information

© Japanese Society of Radiological Technology and Japan Society of Medical Physics 2018

Authors and Affiliations

  1. Faculty of Life Sciences, Kumamoto University, Kumamoto, Japan
  2. Faculty of Education, Gifu University, Gifu, Japan
  3. School of Health Sciences, Kumamoto University, Kumamoto, Japan
  4. College of Medical, Pharmaceutical and Health Sciences, Kanazawa University, Kanazawa, Japan
  5. Faculty of Engineering, Gifu University, Gifu, Japan
