
StalkNet: A Deep Learning Pipeline for High-Throughput Measurement of Plant Stalk Count and Stalk Width

  • Conference paper
  • First Online:
Field and Service Robotics

Part of the book series: Springer Proceedings in Advanced Robotics (SPAR, volume 5)

Abstract

Recently, a body of computer vision research has studied the task of high-throughput plant phenotyping (measurement of plant attributes). The goal is to estimate plant properties more rapidly and more accurately than conventional manual methods. In this work, we develop a method to measure two primary yield attributes of interest, stalk count and stalk width, which are important for many broad-acre annual crops (for example, sorghum, sugarcane, and corn/maize). Prior work applying deep convolutional neural networks to plant analysis has focused on either object detection or dense image segmentation. We develop a novel pipeline that accurately extracts both detected object regions and dense semantic segmentation, from which stalk counts and stalk widths are computed. A ground robot called the Robotanist deploys a high-resolution stereo imager to capture dense image data of experimental plots of sorghum plants. We validate the extracted measurements against ground truth collected by two humans who assessed the traits independently, and we compare the accuracy and efficiency of human versus robotic measurements. Our method yields an R-squared correlation of 0.88 for stalk count and a mean absolute error of 2.77 mm for stalk width, where the average stalk width is 14.354 mm. Our approach is 30 times faster than manual measurement for stalk count and 270 times faster for stalk width.
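To make the counting and width-extraction step concrete, below is a minimal sketch (in Python with NumPy) of how per-stalk detections and a dense segmentation mask could be combined to produce a stalk count and per-stalk widths. The function name `measure_stalks`, the box and mask formats, and the `mm_per_pixel` scale factor (which the paper recovers from stereo imagery) are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: combining per-stalk detections with a dense segmentation mask
# to derive stalk count and per-stalk width. Names and formats are assumptions.
import numpy as np


def measure_stalks(boxes, seg_mask, mm_per_pixel):
    """Count stalks and estimate their widths.

    boxes        : (N, 4) array of [x_min, y_min, x_max, y_max] detections,
                   e.g. from an object detector.
    seg_mask     : (H, W) binary array where 1 marks stalk pixels,
                   e.g. from a semantic segmentation network.
    mm_per_pixel : scale factor assumed known here (recovered from stereo data
                   in the paper's setup).
    """
    stalk_count = len(boxes)
    widths_mm = []
    for x_min, y_min, x_max, y_max in boxes.astype(int):
        region = seg_mask[y_min:y_max, x_min:x_max]
        # Width of the stalk at each image row = number of stalk pixels in that row.
        row_widths = region.sum(axis=1)
        row_widths = row_widths[row_widths > 0]
        if row_widths.size:
            # Use the median row width as a robust per-stalk width estimate.
            widths_mm.append(float(np.median(row_widths)) * mm_per_pixel)
    return stalk_count, widths_mm


if __name__ == "__main__":
    # Toy example: one synthetic vertical stalk, 10 px wide, at 1.5 mm/px.
    mask = np.zeros((100, 100), dtype=np.uint8)
    mask[:, 45:55] = 1
    boxes = np.array([[40, 0, 60, 100]])
    count, widths = measure_stalks(boxes, mask, mm_per_pixel=1.5)
    print(count, widths)  # -> 1 [15.0]
```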



Author information


Corresponding author

Correspondence to Harjatin Singh Baweja.


Copyright information

© 2018 Springer International Publishing AG

About this paper


Cite this paper

Baweja, H.S., Parhar, T., Mirbod, O., Nuske, S. (2018). StalkNet: A Deep Learning Pipeline for High-Throughput Measurement of Plant Stalk Count and Stalk Width. In: Hutter, M., Siegwart, R. (eds) Field and Service Robotics. Springer Proceedings in Advanced Robotics, vol 5. Springer, Cham. https://doi.org/10.1007/978-3-319-67361-5_18


  • DOI: https://doi.org/10.1007/978-3-319-67361-5_18

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-67360-8

  • Online ISBN: 978-3-319-67361-5

  • eBook Packages: Engineering, Engineering (R0)
