European Radiology, Volume 28, Issue 11, pp 4783–4791

Computer-based self-training for CT colonography with and without CAD

  • Lapo Sali (corresponding author)
  • Silvia Delsanto
  • Daniela Sacchetto
  • Loredana Correale
  • Massimo Falchini
  • Andrea Ferraris
  • Giovanni Gandini
  • Giulia Grazzini
  • Franco Iafrate
  • Gabriella Iussich
  • Lia Morra
  • Andrea Laghi
  • Mario Mascalchi
  • Daniele Regge



Objectives

To determine whether (1) computer-based self-training for CT colonography (CTC) improves the interpretation performance of novice readers and (2) computer-aided detection (CAD) use during training affects learning.


Methods

Institutional review board approval and patients’ informed consent were obtained for all cases included in this study. Twenty readers (17 radiology residents, 3 radiologists) with no experience in CTC interpretation were recruited at three centres. After an introductory course, readers performed a baseline assessment test (37 cases) using CAD as second reader. They were then randomized (1:1) to either a computer-based self-training program (150 cases verified at colonoscopy) with CAD as second reader or the same training without CAD. The same assessment test was repeated after completion of the training programs. The main outcome was per lesion sensitivity (≥ 6 mm). A generalized estimating equation (GEE) model was applied to evaluate readers’ performance and the impact of CAD use during training.


Results

After training, average per lesion sensitivity improved significantly in the unassisted phase, from 74% (356/480) to 83% (396/480) (p < 0.001), and in the CAD-assisted phase, from 83% (399/480) to 87% (417/480) (p = 0.021). Average per patient sensitivity, from 93% (390/420) to 94% (395/420) (p = 0.41), and specificity, from 81% (260/320) to 86% (276/320) (p = 0.15), did not change significantly. CAD use during training had no significant effect on per lesion sensitivity or on per patient sensitivity and specificity.


Conclusions

A computer-based self-training program for CTC improves readers’ per lesion sensitivity. CAD as second reader does not have a significant impact on learning if used during training.

Key Points

• Computer-based self-training for CT colonography improves per lesion sensitivity of novice readers.

• Self-training program does not increase per patient specificity of novice readers.

• CAD used during training does not have significant impact on learning.


Keywords: CT colonography · Virtual colonoscopy · Learning · Education





Abbreviations

CAD: Computer-aided detection
CI: Confidence interval
CTC: CT colonography
ESGAR: European Society of Gastrointestinal and Abdominal Radiology
GEE: Generalized estimating equation
OR: Odds ratio



Acknowledgements

im3D (Turin, Italy) provided the CTC training software, six workstations with CAD and technical support for the study.

We acknowledge the CTC readers of the study who are radiologists and radiology residents from Radiology Units of Florence, Rome and Turin: Lina Bartolini, Rosanna Candreva, Federica Ciolina, Giacomo Gabbani, Marco Gatti, Angela Grasso, Maria Luisa Grognardi, Nicholas Landini, Simone Liberali, Viorica Maldur, Simona Martinello, Antonella Masserelli, Maria Antonietta Napoli, Vincenzo Noce, Giulia Scarpini, Giulia Schivazappa, Gian Giacomo Taliani, Virginia Vegni, Andrea Wlderk, Stefania Zuccherelli.


Funding

The authors state that this work has not received any funding.

Compliance with ethical standards


Guarantor

The scientific guarantor of this publication is Professor Daniele Regge.

Conflict of interest

Four authors of this manuscript (Loredana Correale, Silvia Delsanto, Lia Morra, Daniela Sacchetto) declare relationships with the following company: im3D, Turin, Italy.

All other authors of this manuscript declare no relationships with any companies whose products or services may be related to the subject matter of the article.

Statistics and biometry

One of the authors has significant statistical expertise.

Informed consent

Written informed consent was obtained from all subjects in this study.

Ethical approval

Institutional review board approval was obtained.

Study subjects or cohorts overlap

CTC cases for the assessment tests in this study were extracted from a previously published study (Iussich G, et al. Computer-aided detection for computed tomographic colonography screening: a prospective comparison of a double-reading paradigm with first-reader computer-aided detection against second-reader computer-aided detection. Invest Radiol. 2014;49:173–182).



Methodology

multicentre study

Supplementary material

ESM 1 (DOCX 30 kb)

Figure (GIF 187 kb)

High-resolution image (TIF 6750 kb)


  1. Fletcher JG, Chen M-H, Herman BA et al (2010) Can radiologist training and testing ensure high performance in CT colonography? Lessons from the National CT Colonography Trial. AJR Am J Roentgenol 195:117–125
  2. Taylor SA, Halligan S, Burling D et al (2004) CT colonography: effect of experience and training on reader performance. Eur Radiol 14:1025–1033
  3. European Society of Gastrointestinal and Abdominal Radiology CT Colonography Group Investigators (2007) Effect of directed training on reader performance for CT colonography: multicenter study. Radiology 242:152–161
  4. Taylor PM (2007) A review of research into the development of radiologic expertise: implications for computer-based training. Acad Radiol 14:1252–1263
  5. Dachman AH, Kelly KB, Zintsmaster MP et al (2008) Formative evaluation of standardized training for CT colonographic image interpretation by novice readers. Radiology 249:167–177
  6. Liedenbaum MH, Bipat S, Bossuyt PMM et al (2011) Evaluation of a standardized CT colonography training program for novice readers. Radiology 258:477–487
  7. Regge D, Della Monica P, Galatola G et al (2013) Efficacy of computer-aided detection as a second reader for 6–9-mm lesions at CT colonography: multicenter prospective trial. Radiology 266:168–176
  8. Mang T, Bogoni L, Anand VX et al (2014) CT colonography: effect of computer-aided detection of colonic polyps as a second and concurrent reader for general radiologists with moderate experience in CT colonography. Eur Radiol 24:1466–1476
  9. Neri E, Halligan S, Hellström M et al (2013) The second ESGAR consensus statement on CT colonography. Eur Radiol 23:720–729
  10. Neri E, Faggioni L, Regge D et al (2011) CT colonography: role of a second reader CAD paradigm in the initial training of radiologists. Eur J Radiol 80:303–309
  11. Boone D, Mallett S, McQuillan J et al (2015) Assessment of the incremental benefit of computer-aided detection (CAD) for interpretation of CT colonography by experienced and inexperienced readers. PLoS One 10:e0136624
  12. Fisichella VA, Jäderling F, Horvath S et al (2009) Computer-aided detection (CAD) as a second reader using perspective filet view at CT colonography: effect on performance of inexperienced readers. Clin Radiol 64:972–982
  13. Baker ME, Bogoni L, Obuchowski NA et al (2007) Computer-aided detection of colorectal polyps: can it improve sensitivity of less-experienced readers? Preliminary findings. Radiology 245:140–149
  14. Pickhardt PJ (2004) Differential diagnosis of polypoid lesions at CT colonography (virtual colonoscopy). Radiographics 24:1535–1556
  15. Mang T, Maier A, Plank C et al (2007) Pitfalls in multi-detector row CT colonography: a systematic approach. Radiographics 27:431–454
  16. Delsanto S, Morra L, Campanella et al (2008) Computer aided detection of polyps in virtual colonoscopy with same-day fecal tagging. Medical Imaging, San Diego, California, 16–21 Feb 2008
  17. Iussich G, Correale L, Senore C et al (2013) CT colonography: preliminary assessment of a double-read paradigm that uses computer-aided detection as the first reader. Radiology 268:743–751
  18. Yan J (2002) geepack: yet another package for generalized estimating equations. R-News 2:12–14
  19. Yan J, Fine J (2004) Estimating equations for association structures. Stat Med 23:859–874
  20. Gallas BD, Bandos A, Samuelson FW, Wagner RF (2009) A framework for random-effects ROC analysis: biases with the bootstrap and other variance estimators. Commun Stat Theory Methods 38:2586–2603
  21. Obuchowski NA, Gallas BD, Hillis SL (2012) Multi-reader ROC studies with split-plot designs. Acad Radiol 19:1508–1517
  22. Tonidandel S, Overall JE, Smith F (2004) Use of resampling to select among alternative error structure specifications for GLMM analyses of repeated measurements. Int J Methods Psychiatr Res 13:24–33
  23. Obuchowski NA, Hillis SL (2011) Sample size tables for computer-aided detection studies. AJR Am J Roentgenol 197:W821–W828
  24. Auffermann WF, Henry TS, Little BP, Tigges S, Tridandapani S (2015) Simulation for teaching and assessment of nodule perception on chest radiography in nonradiology health care trainees. J Am Coll Radiol 12:1215–1222
  25. Zafar S, Safdar S, Zafar AN (2014) Evaluation of use of e-Learning in undergraduate radiology education: a review. Eur J Radiol 83:2277–2287
  26. ACR (2014) ACR practice guideline for the performance of computed tomography (CT) colonography in adults. American College of Radiology. Accessed 13 Nov 2017

Copyright information

© European Society of Radiology 2018

Authors and Affiliations

  • Lapo Sali (1) (corresponding author)
  • Silvia Delsanto (2)
  • Daniela Sacchetto (2)
  • Loredana Correale (2)
  • Massimo Falchini (1)
  • Andrea Ferraris (3)
  • Giovanni Gandini (3)
  • Giulia Grazzini (1)
  • Franco Iafrate (4)
  • Gabriella Iussich (5)
  • Lia Morra (2)
  • Andrea Laghi (6)
  • Mario Mascalchi (1)
  • Daniele Regge (3, 7)

  1. Department of Biomedical, Experimental and Clinical Sciences “Mario Serio”, University of Florence, Florence, Italy
  2. im3D S.p.A., Turin, Italy
  3. Department of Surgical Science, University of Turin, Turin, Italy
  4. Radiology Unit, Department of Radiological Sciences, Oncology and Pathology, University of Rome “Sapienza”, Rome, Italy
  5. Radiology Unit, Sant’Anna Hospital, Sorengo, Switzerland
  6. Department of Radiological Sciences, Oncology and Pathology, University of Rome “Sapienza”, Sant’Andrea University Hospital, Rome, Italy
  7. Imaging Unit, Candiolo Cancer Institute FPO-IRCCS, Candiolo, Turin, Italy
