A multi-sensor robotic platform for ground mapping and estimation beyond the visible spectrum
Accurate soil mapping is critical for a highly automated agricultural vehicle to accomplish important tasks such as seeding, ploughing, fertilising and controlled traffic with limited human supervision, while ensuring high safety standards. In this research, a multi-sensor ground mapping and characterisation approach is proposed, whereby data from heterogeneous but complementary sensors, mounted on board an unmanned rover, are combined to generate a multi-layer map of the environment and, specifically, of the supporting ground. The sensor suite comprises both exteroceptive and proprioceptive devices. Exteroceptive sensors include a stereo camera, a visible and near-infrared camera and a thermal imager. Proprioceptive data consist of the vertical acceleration of the vehicle sprung mass, as acquired by an inertial measurement unit. The paper details the steps for integrating the different sensor data into a unique multi-layer map and discusses a set of exteroceptive and proprioceptive features for soil characterisation and change detection. Experimental results obtained with an all-terrain vehicle operating on different ground surfaces are presented. It is shown that the proposed technologies could potentially be used to develop all-terrain self-driving systems for agriculture. In addition, multi-modal soil maps could feed farm management systems, presenting the user with soil layers that incorporate colour, geometric, spectral and mechanical properties.
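The fusion of georeferenced readings from the different sensors into one multi-layer ground map can be illustrated with a minimal sketch. The layer names (elevation, NDVI, thermal, vibration RMS) and the running-average fusion rule below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

class MultiLayerGridMap:
    """2D grid map in which each cell accumulates readings for several
    soil layers (a simplified sketch of multi-sensor map fusion)."""

    def __init__(self, size_m=20.0, resolution_m=0.2,
                 layers=("elevation", "ndvi", "thermal", "vibration_rms")):
        n = int(size_m / resolution_m)
        self.res = resolution_m
        # One value grid and one observation counter per layer.
        self.layers = {name: np.full((n, n), np.nan) for name in layers}
        self.counts = {name: np.zeros((n, n), dtype=int) for name in layers}

    def _cell(self, x, y):
        """Map world coordinates (metres) to a grid cell index."""
        return int(y / self.res), int(x / self.res)

    def add_reading(self, layer, x, y, value):
        """Fuse a georeferenced reading into its cell by running average."""
        i, j = self._cell(x, y)
        grid, cnt = self.layers[layer], self.counts[layer]
        if cnt[i, j] == 0:
            grid[i, j] = value
        else:
            grid[i, j] += (value - grid[i, j]) / (cnt[i, j] + 1)
        cnt[i, j] += 1

    def cell_value(self, layer, x, y):
        """Return the fused value of a layer at a location (NaN if unseen)."""
        i, j = self._cell(x, y)
        return self.layers[layer][i, j]

# Example: two NDVI readings falling in the same 0.2 m cell are averaged.
m = MultiLayerGridMap()
m.add_reading("ndvi", 1.0, 1.0, 0.4)
m.add_reading("ndvi", 1.05, 1.05, 0.6)
```

Each exteroceptive or proprioceptive measurement would be projected to ground coordinates first; cells never observed stay NaN, which keeps map coverage explicit for downstream change detection.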
Keywords: Agricultural robotics · Intelligent vehicles · Soil mapping · Multi-spectral sensing · Vibration response
The financial support of the FP7 ERA-NET ICT-AGRI 2 through the grant Simultaneous Safety and Surveying for Collaborative Agricultural Vehicles (S3-CAV) (Id. 29839) is gratefully acknowledged. The authors would also like to thank the National Research Council (CNR), Italy, for supporting this work under the CNR 2016 Short Term Mobility (STM) program.
Annalisa Milella and Giulio Reina made significant contributions to the conception and design of the research. They were mainly responsible for data analysis and interpretation, and for writing the manuscript. Michael Nielsen focused on the development of the multi-sensor system, the experimental activities and data analysis.