Abstract
Compared to fully automated systems, interactive image segmentation offers the advantage of corrective updates to the current segmentation mask. Especially in intra-operative medical image processing for a single patient, where high accuracy is a non-negotiable requirement, a human operator guiding a system towards an optimal segmentation result is a time-efficient arrangement that benefits the patient. Recent categories of neural networks can incorporate human-computer interaction (HCI) data as additional input for segmentation. In this work, we simulate this HCI data during training with state-of-the-art user models, also called robot users, which aim to act similarly to real users in interactive image segmentation tasks. We analyze the influence of the chosen robot users, which mimic different types of users and scribble patterns, on segmentation quality. We conclude that networks trained with the robot users exhibiting the most spread-out seeding patterns generalize well during inference with other robot users.
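The abstract does not specify how a robot user places its simulated interactions. A minimal sketch of one common heuristic from the interactive-segmentation literature (placing a corrective seed at the mislabelled pixel deepest inside the current error region) is given below; the function name `robot_user_click` and the specific heuristic are illustrative assumptions, not the method of this paper.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def robot_user_click(ground_truth, prediction):
    """Simulate one corrective user interaction (illustrative sketch).

    Heuristic: pick the mislabelled pixel with the largest Euclidean
    distance to any correctly labelled pixel, i.e. the point deepest
    inside the current error region. Returns (row, col, label) with
    the true class at that point, or None if the prediction already
    matches the ground truth.
    """
    error = ground_truth != prediction          # mislabelled pixels
    if not error.any():
        return None
    # Distance of every error pixel to the nearest non-error pixel.
    depth = distance_transform_edt(error)
    r, c = np.unravel_index(np.argmax(depth), depth.shape)
    return int(r), int(c), int(ground_truth[r, c])
```

During training, such a simulated click (or a scribble grown around it) would be rendered into an extra input channel for the network, and the loop of predicting and clicking would be repeated for a few iterations per sample.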
© 2019 Springer Fachmedien Wiesbaden GmbH, ein Teil von Springer Nature
Cite this paper
Amrehn, M., Strumia, M., Kowarschik, M., Maier, A. (2019). Interactive Neural Network Robot User Investigation for Medical Image Segmentation. In: Handels, H., Deserno, T., Maier, A., Maier-Hein, K., Palm, C., Tolxdorff, T. (eds) Bildverarbeitung für die Medizin 2019. Informatik aktuell. Springer Vieweg, Wiesbaden. https://doi.org/10.1007/978-3-658-25326-4_16
Print ISBN: 978-3-658-25325-7
Online ISBN: 978-3-658-25326-4