
Precision Agriculture, Volume 20, Issue 2, pp 423–444

A multi-sensor robotic platform for ground mapping and estimation beyond the visible spectrum

  • Annalisa Milella
  • Giulio Reina
  • Michael Nielsen

Abstract

Accurate soil mapping is critical for a highly automated agricultural vehicle to successfully accomplish important tasks, including seeding, ploughing, fertilising and controlled traffic, with limited human supervision while ensuring high safety standards. In this research, a multi-sensor ground mapping and characterisation approach is proposed, whereby data coming from heterogeneous but complementary sensors, mounted on board an unmanned rover, are combined to generate a multi-layer map of the environment and specifically of the supporting ground. The sensor suite comprises both exteroceptive and proprioceptive devices. Exteroceptive sensors include a stereo camera, a visible and near-infrared camera and a thermal imager. Proprioceptive data consist of the vertical acceleration of the vehicle sprung mass as acquired by an inertial measurement unit. The paper details the steps for integrating the different sensor data into a single multi-layer map and discusses a set of exteroceptive and proprioceptive features for soil characterisation and change detection. Experimental results obtained with an all-terrain vehicle operating on different ground surfaces are presented. It is shown that the proposed technologies could potentially be used to develop all-terrain self-driving systems in agriculture. In addition, multi-modal soil maps could be used to feed farm management systems that would present to the user various soil layers incorporating colour, geometric, spectral and mechanical properties.
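As a rough illustration of the multi-layer map idea described in the abstract, the sketch below accumulates georeferenced readings from several modalities into a per-layer 2D grid by incremental averaging. This is a minimal sketch, not the authors' implementation: the layer names, cell resolution and fusion rule are all assumptions made for the example.

```python
import numpy as np

class MultiLayerMap:
    """Hypothetical multi-layer ground map: one 2D grid per sensor modality."""

    def __init__(self, width_m, height_m, cell_size_m, layers):
        self.cell = cell_size_m
        shape = (int(height_m / cell_size_m), int(width_m / cell_size_m))
        # One array per layer; NaN marks cells not yet observed
        self.layers = {name: np.full(shape, np.nan) for name in layers}
        # Per-cell observation counts, used for incremental averaging
        self.counts = {name: np.zeros(shape, dtype=int) for name in layers}

    def update(self, layer, x_m, y_m, value):
        """Fuse a georeferenced reading into its grid cell (running mean)."""
        r, c = int(y_m / self.cell), int(x_m / self.cell)
        n = self.counts[layer][r, c]
        old = self.layers[layer][r, c]
        self.layers[layer][r, c] = value if n == 0 else old + (value - old) / (n + 1)
        self.counts[layer][r, c] = n + 1

    def query(self, layer, x_m, y_m):
        """Return the fused value of one layer at a metric position."""
        return self.layers[layer][int(y_m / self.cell), int(x_m / self.cell)]

# Example: a 10 m x 10 m map at 0.5 m resolution with four illustrative layers
m = MultiLayerMap(10.0, 10.0, 0.5,
                  ["elevation", "ndvi", "temperature", "vibration"])
m.update("elevation", 3.2, 4.7, 0.10)
m.update("elevation", 3.2, 4.7, 0.20)
print(m.query("elevation", 3.2, 4.7))  # running mean of the two readings: 0.15
```

In practice each modality would arrive in its own frame and rate, so a real system would also need extrinsic calibration and pose estimation before readings can be binned into common map cells, as the paper discusses.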

Keywords

Agricultural robotics · Intelligent vehicles · Soil mapping · Multi-spectral sensing · Vibration response

Notes

Acknowledgements

The financial support of the FP7 ERA-NET ICT-AGRI 2 through the grant Simultaneous Safety and Surveying for Collaborative Agricultural Vehicles (S3-CAV) (Id. 29839) is gratefully acknowledged. The authors would also like to thank the National Research Council (CNR), Italy, for supporting this work under the CNR 2016 Short Term Mobility (STM) program.

Author contributions

Annalisa Milella and Giulio Reina made significant contributions to the conception and design of the research. They mainly dealt with data analysis and interpretation, and writing of the manuscript. Michael Nielsen focused on the development of the multi-sensor system, the experimental activities and data analysis.


Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. Institute of Intelligent Industrial Technologies and Systems for Advanced Manufacturing, National Research Council, Bari, Italy
  2. Department of Engineering for Innovation, University of Salento, Lecce, Italy
  3. Danish Technological Institute (DTI), Odense, Denmark
