Abstract
Recently, a body of computer vision research has studied the task of high-throughput plant phenotyping (measurement of plant attributes). The goal is to estimate plant properties more rapidly and more accurately than conventional manual methods. In this work, we develop a method to measure two primary yield attributes of interest, stalk count and stalk width, which are important for many broad-acre annual crops (e.g., sorghum, sugarcane, and corn/maize). Prior work applying convolutional neural networks to plant analysis has focused on either object detection or dense image segmentation. In our work, we develop a novel pipeline that combines detected object regions with dense semantic segmentation to extract both stalk count and stalk width. A ground robot, the Robotanist, deploys a high-resolution stereo imager to capture dense image data of experimental plots of sorghum plants. We validate the extracted measurements against ground truth collected by two humans who assessed the traits independently, and we compare both the accuracy and the efficiency of human versus robotic measurements. Our method yields an R² correlation of 0.88 for stalk count and a mean absolute error of 2.77 mm for stalk width, where the average stalk width is 14.354 mm. Our approach is 30 times faster than manual measurement for stalk count and 270 times faster for stalk width.
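The two evaluation metrics reported above (R² correlation for stalk count, mean absolute error for stalk width) can be sketched as follows. This is a minimal illustration, not the paper's evaluation code, and the sample numbers in the usage comment are hypothetical; R² is computed here as the coefficient of determination, one common definition.

```python
import numpy as np

def r_squared(y_true, y_pred):
    """Coefficient of determination between ground-truth and predicted values."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)          # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)   # total sum of squares
    return 1.0 - ss_res / ss_tot

def mean_absolute_error(y_true, y_pred):
    """Mean absolute error, e.g. in mm for stalk width."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean(np.abs(y_true - y_pred)))

# Hypothetical per-plot stalk counts: human ground truth vs. pipeline output.
counts_gt = [12, 15, 9, 20, 17]
counts_pred = [11, 15, 10, 19, 18]
print(f"R^2 = {r_squared(counts_gt, counts_pred):.3f}")

# Hypothetical stalk widths in mm.
widths_gt = [14.2, 13.8, 15.1]
widths_pred = [13.0, 15.0, 14.5]
print(f"MAE = {mean_absolute_error(widths_gt, widths_pred):.2f} mm")
```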
Copyright information
© 2018 Springer International Publishing AG
About this paper
Cite this paper
Baweja, H.S., Parhar, T., Mirbod, O., Nuske, S. (2018). StalkNet: A Deep Learning Pipeline for High-Throughput Measurement of Plant Stalk Count and Stalk Width. In: Hutter, M., Siegwart, R. (eds) Field and Service Robotics. Springer Proceedings in Advanced Robotics, vol 5. Springer, Cham. https://doi.org/10.1007/978-3-319-67361-5_18
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-67360-8
Online ISBN: 978-3-319-67361-5