Machine Vision System for Orchard Management

Chapter in Machine Vision and Navigation

Abstract

This chapter provides an overview of machine vision systems in agricultural applications. Several applications are presented, but a machine vision system that estimates fruit yield, an example of an orchard management application, is discussed at length. From the farmer’s perspective, an early yield prediction serves as an early revenue estimate. Based on this prediction, resources such as labor and storage space can be allocated more efficiently, and future seasons can be planned better. The yield estimate is obtained using a camera with a color filter that isolates the blossoms on a tree when the tree is in full bloom. The blossoms in the resulting image can be counted and the yield estimated. An estimate made during the blossom period, as opposed to after the fruit has begun to mature, provides a crop yield prediction several months in advance. This chapter also discusses a machine vision system that navigates a robot through orchard rows. This system can be used in conjunction with the yield estimation system, but it has additional applications, such as carrying a water or pesticide system that treats the trees as the robot passes by. To be effective, this type of system must account for the operating scene, which can limit or constrain its effectiveness. Such systems tend to be unique to the operating environment.
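
As a rough illustration of the blossom-counting approach described above, the sketch below thresholds an image of a tree in full bloom in HSV color space, counts the connected regions that remain, and converts the count to a yield figure. It is a minimal sketch in Python using OpenCV, not the chapter's implementation: the HSV thresholds, the minimum region area, the fruit-set rate, and the average fruit mass are all hypothetical placeholders that would need calibration for a real orchard.

```python
# Minimal sketch of blossom-based yield estimation via color thresholding.
# Assumptions (not from the chapter): OpenCV/NumPy, an image of a tree in
# full bloom, hypothetical HSV thresholds, and placeholder calibration factors.
import cv2
import numpy as np


def count_blossoms(image_path, min_area=20):
    """Count bright (white/pink) blossom regions in a tree image."""
    bgr = cv2.imread(image_path)
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)

    # Hypothetical threshold: low saturation and high value isolate white blossoms.
    mask = cv2.inRange(hsv, np.array([0, 0, 180]), np.array([180, 60, 255]))

    # Remove small noise with a morphological opening.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))

    # Each remaining connected region above min_area counts as one blossom cluster.
    _, _, stats, _ = cv2.connectedComponentsWithStats(mask)
    areas = stats[1:, cv2.CC_STAT_AREA]  # skip label 0 (background)
    return int(np.sum(areas >= min_area))


def estimate_yield_kg(blossom_count, fruit_per_blossom=0.3, fruit_mass_kg=0.2):
    """Scale the blossom count by placeholder fruit-set and fruit-mass factors."""
    return blossom_count * fruit_per_blossom * fruit_mass_kg


if __name__ == "__main__":
    blossoms = count_blossoms("tree_in_bloom.jpg")
    print(f"Blossom clusters detected: {blossoms}")
    print(f"Estimated yield: {estimate_yield_kg(blossoms):.1f} kg per tree")
```

In practice, the conversion factors would be fitted per orchard block against historical yield records; a prediction made at full bloom in this way precedes the harvest by several months, which is the advantage emphasized in the chapter.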

Abbreviations

2D: Two-dimensional
3D: Three-dimensional
Cov: Covariance
GPS: Global Positioning System
IR: Infrared
K_I: Integral gain
K_P: Proportional gain
LIDAR: Light detection and ranging
NIR: Near infrared
PI: Proportional-plus-integral
RGB: Red, Green, and Blue
RMS: Root mean square
UAV: Unmanned aerial vehicle
UGV: Unmanned ground vehicle
Var: Variance

Author information

Corresponding author: Duke M. Bulanon

Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Bulanon, D.M., Hestand, T., Nogales, C., Allen, B., Colwell, J. (2020). Machine Vision System for Orchard Management. In: Sergiyenko, O., Flores-Fuentes, W., Mercorelli, P. (eds) Machine Vision and Navigation. Springer, Cham. https://doi.org/10.1007/978-3-030-22587-2_7

  • DOI: https://doi.org/10.1007/978-3-030-22587-2_7

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-22586-5

  • Online ISBN: 978-3-030-22587-2

  • eBook Packages: Engineering, Engineering (R0)
