Abstract
Scene reconstruction, i.e. the process of creating a 3D representation (mesh) of some real-world scene, has recently become easier with the advent of cheap RGB-D sensors (e.g. the Microsoft Kinect).
Many such sensors use rolling shutter cameras, which produce geometrically distorted images when they are moving. To mitigate these rolling shutter distortions we propose a method that uses an attached gyroscope to rectify the depth scans. We also present a simple scheme to calibrate the relative pose and time synchronization between the gyro and a rolling shutter RGB-D sensor.
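The per-row rectification idea can be sketched as follows. The chapter's own implementation is not reproduced here; this is a minimal illustration, assuming gyro-derived camera orientations are available as unit quaternions at the start and end of a frame's readout, and that each image row is assigned a pose by spherical linear interpolation (SLERP) between them. The function names (`rectify_point` etc.) are hypothetical.

```python
import numpy as np

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions q0, q1."""
    dot = np.dot(q0, q1)
    if dot < 0.0:            # take the short path on the quaternion sphere
        q1, dot = -q1, -dot
    dot = min(dot, 1.0)
    theta = np.arccos(dot)
    if theta < 1e-8:         # nearly identical orientations
        return q0
    return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

def quat_to_rot(q):
    """3x3 rotation matrix from a unit quaternion (w, x, y, z)."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def rectify_point(p_cam, row, num_rows, q_start, q_end):
    """Rotate a back-projected 3D point from its row's exposure time
    to a common reference pose (here: the pose of the first row)."""
    t = row / float(num_rows)             # normalised readout time of this row
    q_row = slerp(q_start, q_end, t)      # orientation when the row was read out
    R_row = quat_to_rot(q_row)
    R_ref = quat_to_rot(q_start)
    # undo the row's rotation, then re-apply the reference rotation
    return R_ref @ R_row.T @ p_cam
```

Under a pure rotation model this moves every row's points into a single consistent camera pose, which is what removes the rolling shutter skew before the scan is fed to the reconstruction pipeline.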
For scene reconstruction we use the Kinect Fusion algorithm to produce meshes. We create meshes from both raw and rectified depth scans, and these are then compared to a ground truth mesh. The types of motion we investigate are: pan, tilt and wobble (shaking) motions.
As our method relies on gyroscope readings, the amount of computation required is negligible compared to the cost of running Kinect Fusion.
This chapter is an extension of a paper at the IEEE Workshop on Robot Vision [10]. Compared to that paper, we have improved the rectification to also correct for lens distortion, and use a coarse-to-fine search to find the time shift more quickly. We have extended our experiments to also investigate the effects of lens distortion, and to use more accurate ground truth. The experiments demonstrate that correction of rolling shutter effects yields a larger improvement of the 3D model than correction for lens distortion.
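A coarse-to-fine search of the kind mentioned above can be sketched generically: sample a candidate interval on a grid, keep the best candidate, and shrink the interval around it. The cost function here is a placeholder; in the gyro-to-camera setting it would score how well gyro rotations, shifted by the candidate time offset, agree with camera motion. This is an illustrative sketch, not the chapter's implementation.

```python
import numpy as np

def coarse_to_fine_shift(cost, lo, hi, levels=4, samples=11):
    """Find the time shift minimising `cost` over [lo, hi] by repeatedly
    evaluating a coarse grid and zooming in around the best candidate.

    cost    -- scalar function of a candidate time shift (seconds)
    lo, hi  -- initial search interval
    levels  -- number of refinement passes
    samples -- grid points evaluated per pass
    """
    best = lo
    for _ in range(levels):
        grid = np.linspace(lo, hi, samples)
        best = grid[int(np.argmin([cost(d) for d in grid]))]
        step = (hi - lo) / (samples - 1)
        lo, hi = best - step, best + step   # shrink around the best shift
    return best
```

Each pass narrows the interval by a constant factor, so a few passes reach sub-millisecond resolution far more cheaply than one dense search over the full interval.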
References
Baker, S., Bennett, E., Kang, S.B., Szeliski, R.: Removing rolling shutter wobble. In: IEEE Conference on Computer Vision and Pattern Recognition. IEEE Computer Society, San Francisco (2010)
Besl, P., McKay, H.: A method for registration of 3-D shapes. IEEE Transactions on Pattern Analysis and Machine Intelligence 14(2), 239–256 (1992)
Geyer, C., Meingast, M., Sastry, S.: Geometric models of rolling-shutter cameras. In: 6th OmniVis WS (2005)
Golub, G.H., van Loan, C.F.: Matrix Computations. Johns Hopkins University Press, Baltimore (1983)
Hanning, G., Forslöw, N., Forssén, P.E., Ringaby, E., Törnqvist, D., Callmer, J.: Stabilizing cell phone video using inertial measurement sensors. In: The Second IEEE International Workshop on Mobile Vision. IEEE, Barcelona (2011)
Hartley, R.I., Zisserman, A.: Multiple View Geometry in Computer Vision. Cambridge University Press (2004)
Hol, J.D., Schön, T.B., Gustafsson, F.: Modeling and calibration of inertial and vision sensors. International Journal of Robotics Research 29(2), 231–244 (2010)
Karpenko, A., Jacobs, D., Baek, J., Levoy, M.: Digital video stabilization and rolling shutter correction using gyroscopes. Tech. Rep. CSTR 2011-03, Stanford University Computer Science (2011)
Newcombe, R.A., Izadi, S., Hilliges, O., Molyneaux, D., Kim, D., Davison, A.J., Kohli, P., Shotton, J., Hodges, S., Fitzgibbon, A.: KinectFusion: Real-time dense surface mapping and tracking. In: IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2011, Basel, Switzerland (2011)
Ovrén, H., Forssén, P.E., Törnqvist, D.: Why would I want a gyroscope on my RGB-D sensor? In: Proceedings of IEEE Winter Vision Meetings, Workshop on Robot Vision (WoRV 2013). IEEE, Clearwater (2013)
Press, W.H., Teukolsky, S.A., Vetterling, W.T., Flannery, B.P.: Numerical recipes in C: the art of scientific computing, 2nd edn. Cambridge University Press, New York (1992)
Ringaby, E., Forssén, P.E.: Scan rectification for structured light range sensors with rolling shutters. In: IEEE International Conference on Computer Vision. IEEE Computer Society Press, Barcelona (2011)
Ringaby, E., Forssén, P.E.: Efficient video rectification and stabilisation for cell-phones. International Journal of Computer Vision 96(3), 335–352 (2012)
Roth, H., Vona, M.: Moving Volume KinectFusion. In: British Machine Vision Conference (BMVC 2012). BMVA, University of Surrey, UK (2012), http://dx.doi.org/10.5244/C.26.112
Rusu, R.B., Cousins, S.: 3D is here: Point Cloud Library (PCL). In: IEEE International Conference on Robotics and Automation (ICRA), Shanghai, China (2011)
Schönemann, P.: A generalized solution of the orthogonal procrustes problem. Psychometrika 31(1), 1–10 (1966)
Shoemake, K.: Animating rotation with quaternion curves. In: Int. Conf. on CGIT, pp. 245–254 (1985)
Sturm, J., Engelhard, N., Endres, F., Burgard, W., Cremers, D.: A benchmark for the evaluation of RGB-D SLAM systems. In: Proc. of the International Conference on Intelligent Robot Systems, IROS (2012)
Whelan, T., McDonald, J., Kaess, M., Fallon, M., Johannsson, H., Leonard, J.J.: Kintinuous: Spatially extended kinectfusion. In: RSS 2012 Workshop on RGB-D Cameras, Sydney (2012)
Zhang, Z.: A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence 22(11), 1330–1334 (2000)
Copyright information
© 2015 Springer-Verlag Berlin Heidelberg
Cite this chapter
Ovrén, H., Forssén, PE., Törnqvist, D. (2015). Improving RGB-D Scene Reconstruction Using Rolling Shutter Rectification. In: Sun, Y., Behal, A., Chung, CK. (eds) New Development in Robot Vision. Cognitive Systems Monographs, vol 23. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-43859-6_4
DOI: https://doi.org/10.1007/978-3-662-43859-6_4
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-662-43858-9
Online ISBN: 978-3-662-43859-6