Abstract
For smart and autonomous systems, accurate 3D positioning and measurement are essential, since the achievable precision directly determines whether a given technique is applicable to a particular task. In this paper, we summarize and compare the sensing techniques and sensors that can potentially be used for multimodal data analysis and integration. This comparison provides practical guidance for the design and implementation of such systems.
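As a minimal illustration of the kind of 3D measurement the surveyed sensors produce, the sketch below back-projects a depth image into a 3D point cloud using the pinhole camera model, a basic step shared by structured-light and time-of-flight depth cameras. This example is not from the paper; the intrinsic parameters (`fx`, `fy`, `cx`, `cy`) are hypothetical values chosen for illustration only.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Convert a depth map (in metres) into an (H*W, 3) array of XYZ points.

    Each pixel (u, v) with depth z maps to camera coordinates
    x = (u - cx) * z / fx,  y = (v - cy) * z / fy.
    """
    h, w = depth.shape
    # Pixel coordinate grids: u varies along columns, v along rows.
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# Example: a flat surface 2 m in front of a VGA-resolution depth camera,
# with example intrinsics (hypothetical, not calibrated values).
depth = np.full((480, 640), 2.0)
points = depth_to_points(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(points.shape)  # one 3D point per pixel
```

In practice the intrinsics would come from a calibration procedure (see the structured-light calibration methods compared in the paper), and invalid depth readings would be masked out before back-projection.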
Acknowledgement
This work was supported by the National Natural Science Foundation of China (61672008), Guangdong Provincial Application-oriented Technical Research and Development Special fund project (2016B010127006, 2015B010131017), the Natural Science Foundation of Guangdong Province (2016A030311013, 2015A030313672), and International Scientific and Technological Cooperation Projects of Education Department of Guangdong Province (2015KGJHZ021).
Copyright information
© 2019 Springer Nature Singapore Pte Ltd.
Cite this paper
Fang, Z. et al. (2019). 3D Sensing Techniques for Multimodal Data Analysis and Integration in Smart and Autonomous Systems. In: Liang, Q., Mu, J., Jia, M., Wang, W., Feng, X., Zhang, B. (eds) Communications, Signal Processing, and Systems. CSPS 2017. Lecture Notes in Electrical Engineering, vol 463. Springer, Singapore. https://doi.org/10.1007/978-981-10-6571-2_71
Publisher Name: Springer, Singapore
Print ISBN: 978-981-10-6570-5
Online ISBN: 978-981-10-6571-2
eBook Packages: Engineering (R0)