Integrating machine vision-based row guidance with GPS and compass-based routing to achieve autonomous navigation for a rice field weeding robot

Abstract

Autonomous weeding robots offer a more productive and sustainable alternative to traditional, labor-intensive weed control practices such as chemical weeding, which harms the environment when used excessively. For a fully autonomous weed control operation, the robot must be guided precisely along the crop rows without damaging the rice plants, and it must detect the end of each row and turn to enter the next one. This research integrated GNSS, a compass and machine vision into a rice field weeding robot to achieve fully autonomous navigation for the weeding operation. A novel crop row detection algorithm was developed to extract the four rows nearest the robot from the view of a front-mounted camera. The extracted rows were used to determine a guide-line for maneuvering the robot precisely along the crop rows, with an accuracy better than one hundred millimeters under varying weed intensities, plant growth stages and weather conditions. The GNSS receiver and compass were used to locate the robot within the field. A state-based control system integrated the GNSS, compass and vision guidance to navigate the weeding robot efficiently along a pre-determined route covering the entire field without damaging the rice plants. Experiments showed that the proposed system performed well at low weed concentrations, achieving a heading compensation accuracy better than 2.5° and an average deviation from the ideal path of 45.9 mm. Although this accuracy dropped as the weed concentration increased, the system was still able to navigate the robot without inflicting serious damage on the plants.
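The two sensing components summarized above lend themselves to short illustrations. The first sketch below shows a generic crop-row guide-line estimator of the kind common in this literature: excess-green (ExG) segmentation of the vegetation followed by a probabilistic Hough transform over near-vertical line segments. This is a minimal sketch, not the paper's algorithm (which specifically extracts the four rows nearest the camera); the function name, thresholds and Hough parameters are all illustrative assumptions.

```python
# Sketch of a generic vision-based guide-line estimator (illustrative,
# not the authors' algorithm): ExG segmentation + probabilistic Hough.
import cv2
import numpy as np

def guide_line_heading(bgr):
    """Estimate the heading error (degrees) between the camera's optical
    axis and the crop rows; returns None when no row line is found."""
    b, g, r = cv2.split(bgr.astype(np.float32) / 255.0)
    exg = 2.0 * g - r - b                       # excess-green index
    exg_u8 = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    _, mask = cv2.threshold(exg_u8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    segments = cv2.HoughLinesP(mask, rho=1, theta=np.pi / 180, threshold=80,
                               minLineLength=100, maxLineGap=30)
    if segments is None:
        return None
    angles = []
    for x1, y1, x2, y2 in segments[:, 0]:
        ang = np.degrees(np.arctan2(x2 - x1, y2 - y1))  # 0 deg = image vertical
        if ang > 90.0:                                  # fold into (-90, 90]
            ang -= 180.0
        elif ang < -90.0:
            ang += 180.0
        if abs(ang) < 45.0:                             # keep near-vertical rows
            angles.append(ang)
    return float(np.median(angles)) if angles else None
```

For the GNSS and compass side, routing between waypoints reduces to the standard great-circle ("forward azimuth") bearing calculation, with the steering command derived from the wrapped difference between that bearing and the compass heading. Again a minimal sketch, assuming WGS84 decimal-degree coordinates; both helper names are hypothetical.

```python
import math

def initial_bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2,
    in degrees clockwise from true north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def heading_error(compass_deg, target_bearing_deg):
    """Signed steering error in degrees, wrapped to [-180, 180)."""
    return (target_bearing_deg - compass_deg + 180.0) % 360.0 - 180.0
```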




Acknowledgements

This research was financially supported by the Agricultural Research Development Agency (ARDA), Thailand. We would like to thank Prof. Manukid Parnichkun and Assoc. Prof. Matthew N. Dailey of the Asian Institute of Technology, Thailand, for their insight and assistance in this research. We also thank our colleagues at the Asian Institute of Technology for their technical expertise and support, which greatly aided this research.

Author information


Corresponding author

Correspondence to Sabeethan Kanagasingham.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Cite this article

Kanagasingham, S., Ekpanyapong, M. & Chaihan, R. Integrating machine vision-based row guidance with GPS and compass-based routing to achieve autonomous navigation for a rice field weeding robot. Precision Agric 21, 831–855 (2020). https://doi.org/10.1007/s11119-019-09697-z


Keywords

  • Autonomous navigation
  • Rice weeding
  • Computer vision-based row guidance
  • Location using GNSS and compass