Abstract
In this paper we present an approach for planning the next best sensor action based on the current knowledge about a scene and on a given task to be executed. The current knowledge about the world is modeled as a 3D grid. The state of each voxel is represented by two independent fuzzy membership functions indicating its degree of being occupied by an object and its degree of being free. The information about the scene still needed to execute a given task can be formulated in terms of these two fuzzy membership functions. The next view is then chosen so as to scan as many voxels as possible, thereby maximally reducing the amount of information the system still requires. The planning system can easily be adapted to nearly any range camera and has been evaluated in real-world scenes.
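The idea described above can be sketched in a few lines: each voxel carries two fuzzy memberships (occupied/free), the per-voxel information need is high where neither membership is established, and the next view is the candidate covering the largest total need. This is a minimal illustrative sketch, not the paper's exact formulation; the function names, the `1 - max(mu_occ, mu_free)` need measure, and the boolean visibility masks are assumptions made for the example.

```python
import numpy as np

def information_need(mu_occ, mu_free):
    """Per-voxel need for information: high where neither the 'occupied'
    nor the 'free' fuzzy membership is established yet (illustrative rule)."""
    return 1.0 - np.maximum(mu_occ, mu_free)

def next_best_view(mu_occ, mu_free, candidate_views):
    """Pick the candidate view whose visible voxels cover the largest
    total information need. candidate_views maps a view id to a boolean
    visibility mask over the voxel grid (hypothetical interface)."""
    need = information_need(mu_occ, mu_free)
    scores = {v: float(need[mask].sum()) for v, mask in candidate_views.items()}
    return max(scores, key=scores.get), scores

# Toy 4x4x4 grid: one corner already scanned (known free), the rest unknown.
mu_occ = np.zeros((4, 4, 4))
mu_free = np.zeros((4, 4, 4))
mu_free[:2, :2, :2] = 1.0  # already observed as free

views = {
    "left":  np.zeros((4, 4, 4), dtype=bool),
    "right": np.zeros((4, 4, 4), dtype=bool),
}
views["left"][:2] = True    # mostly re-scans the known corner
views["right"][2:] = True   # covers only unexplored voxels

best, scores = next_best_view(mu_occ, mu_free, views)
print(best)  # prints "right": it covers more unknown voxels
```

In a real system the visibility masks would come from ray casting through the grid for each reachable sensor pose, and the memberships would be updated from range measurements after each scan.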
© 1997 Springer-Verlag Berlin Heidelberg
Korn, B., Krebs, B., Wahl, F.M. (1997). Sensor Based View Planning Using Vague Scene Representation. In: Paulus, E., Wahl, F.M. (eds) Mustererkennung 1997. Informatik aktuell. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-60893-3_17
Print ISBN: 978-3-540-63426-3
Online ISBN: 978-3-642-60893-3