Abstract
This paper deals with the fusion of information obtained from different kinds of sensors placed on an autonomous mobile robot. The method is based on a Bayesian network. The paper gives implementation details and a verification simulation experiment that fuses three different means of determining robot orientation. The method is easily extendable to a larger number of sensors of different kinds, as well as to fused output data of higher dimension.
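As a rough illustration of the idea described in the abstract (not the authors' implementation, whose details are in the full chapter), the following sketch fuses three noisy orientation readings via Bayes' rule over a discretized orientation variable. The sensor names, noise levels, and one-degree discretization are all assumptions made for this example.

```python
import math

def ang_diff(a, b):
    """Smallest signed difference between two angles, in degrees."""
    return (a - b + 180.0) % 360.0 - 180.0

def fuse_orientation(readings, sigmas, bins=360):
    """Naive Bayesian fusion of independent orientation sensors.

    Orientation theta is discretized into `bins` cells; each sensor i
    reports an angle z_i with assumed Gaussian noise sigma_i (degrees):
        P(theta | z_1..z_n) ∝ P(theta) * prod_i P(z_i | theta)
    """
    # Uniform prior over the discretized orientation.
    posterior = [1.0 / bins] * bins
    for z, s in zip(readings, sigmas):
        for k in range(bins):
            theta = k * 360.0 / bins
            d = ang_diff(theta, z)
            posterior[k] *= math.exp(-0.5 * (d / s) ** 2)
    # Normalize and take the MAP estimate.
    total = sum(posterior)
    posterior = [p / total for p in posterior]
    k_map = max(range(bins), key=lambda k: posterior[k])
    return k_map * 360.0 / bins, posterior

# Hypothetical readings from odometry, a compass, and an integrated gyro,
# with per-sensor noise standard deviations in degrees.
est, post = fuse_orientation([88.0, 92.0, 90.0], [5.0, 3.0, 4.0])
```

The fused estimate lands near the precision-weighted mean of the three readings (about 91 degrees here); sensors with smaller assumed noise pull the posterior more strongly, which is the qualitative behavior a Bayesian-network fusion scheme provides.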
© 2010 Springer-Verlag Berlin Heidelberg
Cite this chapter
Věchet, S., Krejsa, J. (2010). Sensors Data Fusion via Bayesian Network. In: Brezina, T., Jablonski, R. (eds) Recent Advances in Mechatronics. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-05022-0_38
DOI: https://doi.org/10.1007/978-3-642-05022-0_38
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-05021-3
Online ISBN: 978-3-642-05022-0