Künstliche Intelligenz in der Hybridbildgebung

Artificial intelligence in hybrid imaging

Abstract

Clinical issue

Hybrid imaging enables the precise visualization of cellular metabolism by combining anatomical and metabolic information. Advances in artificial intelligence (AI) offer new methods for processing and evaluating this data.

Methodological innovations

This review summarizes current developments and applications of AI methods in hybrid imaging. Applications in image processing as well as methods for disease-related evaluation are presented and discussed.

Materials and methods

This article is based on a selective literature search in the PubMed and arXiv databases.

Assessment

Currently, there are only a few AI applications that use hybrid imaging data, and none are yet established in clinical routine. Although promising first approaches are emerging, they still need to be evaluated prospectively. In the future, AI applications will support radiologists and nuclear medicine physicians in diagnosis and therapy.
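
To make the combination of anatomical and molecular information more concrete, the following minimal sketch (not taken from the article) illustrates one common pattern in this field: early fusion of co-registered PET and CT volumes as input channels of a small 3D convolutional neural network for lesion classification. The framework choice (PyTorch), the class name HybridFusionCNN, the patch size, and all layer dimensions are illustrative assumptions.

    import torch
    import torch.nn as nn

    class HybridFusionCNN(nn.Module):
        """Toy 3D CNN that fuses PET and CT at the input (channel) level."""

        def __init__(self, n_classes: int = 2):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv3d(2, 16, kernel_size=3, padding=1),  # channel 0: PET, channel 1: CT
                nn.ReLU(),
                nn.MaxPool3d(2),
                nn.Conv3d(16, 32, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.AdaptiveAvgPool3d(1),                      # global pooling -> one feature vector per case
            )
            self.classifier = nn.Linear(32, n_classes)

        def forward(self, pet: torch.Tensor, ct: torch.Tensor) -> torch.Tensor:
            x = torch.cat([pet, ct], dim=1)                   # early fusion along the channel axis
            x = self.features(x).flatten(1)
            return self.classifier(x)

    # Example call with a single co-registered 64^3 patch (batch, channel, depth, height, width)
    pet = torch.randn(1, 1, 64, 64, 64)
    ct = torch.randn(1, 1, 64, 64, 64)
    logits = HybridFusionCNN()(pet, ct)                       # shape: (1, 2)

Early fusion at the channel level is only one possible design; late fusion of separately encoded PET and CT (or PET and MR) feature representations is a common alternative.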


Author information

Correspondence to Dr. Dr. Jens Kleesiek.

Ethics declarations

Conflict of interest

C. Strack, R. Seifert, and J. Kleesiek declare that they have no conflict of interest.

For this article, the authors did not perform any studies involving humans or animals. The ethical guidelines stated in the cited studies apply to those studies.



Cite this article

Strack, C., Seifert, R. & Kleesiek, J. Künstliche Intelligenz in der Hybridbildgebung. Radiologe (2020). https://doi.org/10.1007/s00117-020-00646-w


Keywords

  • Diagnostic imaging
  • Deep neural networks
  • Machine learning
  • Deep learning
  • Cellular metabolism