Extract Bone Parts Without Human Prior: End-to-end Convolutional Neural Network for Pediatric Bone Age Assessment
Pediatric bone age assessment (BAA) is a common clinical practice for investigating endocrine, genetic, and growth disorders in children. The morphological characteristics of specific bone parts, such as the wrist and phalanges, are important references in BAA. Previous deep learning approaches fall into two branches: (1) single-stage structures ignore attention to specific bone parts, so they can be trained end-to-end but suffer from low accuracy; (2) multi-stage structures extract the bone parts using human prior knowledge, so they achieve high accuracy but suffer from poor model generalization and high resource consumption. To enable an end-to-end training method that extracts discriminative bone parts automatically without human prior, in this paper we propose a novel single-stage Attention-Recognition Convolutional Neural Network (AR-CNN). The AR-CNN consists of an attention agent that proposes discriminative bone parts and a recognition agent for feature learning and age assessment. The attention agent discovers and extracts bone parts automatically, while the recognition agent learns features from the proposed bone parts and assesses the bone age. Furthermore, the assessment result is fed back to the attention agent to optimize bone part extraction. Therefore, the two agents reinforce each other mutually, and the overall network can be trained end-to-end without human prior. To the best of our knowledge, this is the first end-to-end structure to extract bone parts for BAA without segmentation, detection, or human prior. Experimental results show that our approach achieves state-of-the-art accuracy on the public RSNA dataset with a mean absolute error (MAE) of 4.38 months.
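The two-agent loop described above can be sketched in a few lines. This is a minimal toy illustration, not the paper's network: the attention agent is stood in for by a hypothetical linear scorer over sliding-window patches, and the recognition agent by a global-average "feature" with a hypothetical linear regression head; all weights and sizes are made-up assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def attention_agent(image, num_parts=3, patch=16):
    """Score sliding-window patches and return the top-k crops.
    (Stands in for the learned attention agent; the scoring weights
    here are a toy random linear map, not a trained network.)"""
    H, W = image.shape
    w = rng.standard_normal(patch * patch)  # hypothetical learned weights
    candidates = []
    for y in range(0, H - patch + 1, patch):
        for x in range(0, W - patch + 1, patch):
            crop = image[y:y + patch, x:x + patch]
            score = float(crop.ravel() @ w)
            candidates.append((score, crop))
    # keep the highest-scoring patches as the proposed "bone parts"
    candidates.sort(key=lambda c: c[0], reverse=True)
    return [crop for _, crop in candidates[:num_parts]]

def recognition_agent(crops):
    """Pool a scalar feature from each proposed part and regress an age.
    (Toy global-average pooling plus a hypothetical linear head.)"""
    feats = np.array([c.mean() for c in crops])
    head = np.array([12.0, 8.0, 4.0])  # hypothetical regression weights
    return float(feats @ head[: len(feats)])

image = rng.random((64, 64))      # stand-in for a hand radiograph
parts = attention_agent(image)    # attention agent proposes bone parts
age = recognition_agent(parts)    # recognition agent assesses bone age
print(len(parts), age)
```

In the actual AR-CNN, the recognition agent's assessment error would also flow back as a training signal for the attention agent, so that part proposal and age regression are optimized jointly end-to-end.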
Keywords: Bone age assessment · Deep learning · Object detection
This work is supported by the Huawei-USTC Joint Innovation Project on Machine Vision Technology (FA2018111122). We would also like to thank Brain-inspired Technology Corporation (http://www.leinao.ai/) for its computational support.