Sensor Data Fusion Framework to Improve Holographic Object Registration Accuracy for a Shared Augmented Reality Mission Planning Scenario
Accurate 3D holographic object registration for a shared augmented reality (AR) application is challenging on the Microsoft HoloLens. We investigated a sensor data fusion framework that combines positional tracking data from an external tracking system with data from the HoloLens to reduce AR registration errors. In our setup, positional tracking data from an OptiTrack motion capture system was used to improve the registration of a 3D holographic object in a shared AR application running on three HoloLens displays. Compared with a shared AR application built on the HoloToolkit Sharing Service released by Microsoft, our application achieved more accurate 3D holographic object registration. A comparative study of the two applications also yielded participant responses consistent with our initial assessment of the improved registration accuracy. Using the sensor data fusion framework, we developed a shared AR application supporting a mission planning scenario, with multiple holographic displays illustrating details of the mission.
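The core idea described above, using an externally tracked ground-truth pose to correct the hologram pose estimated by each HoloLens, can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual pipeline: the function names, the purely translational correction, and the assumption that both systems already report positions in a common world frame are all simplifying assumptions for illustration.

```python
# Illustrative sketch: correct a hologram's pose on a HoloLens client by
# comparing an external tracker's (e.g., OptiTrack) measurement of a shared
# reference marker against the HoloLens's own estimate of that marker.
# Positions are (x, y, z) tuples in meters, assumed to be in a common frame.

def registration_offset(external_marker_pos, hololens_marker_pos):
    """Translational error between external ground truth and the HoloLens estimate."""
    return tuple(e - h for e, h in zip(external_marker_pos, hololens_marker_pos))

def correct_hologram(hologram_pos, offset):
    """Shift the hologram by the measured registration error."""
    return tuple(p + o for p, o in zip(hologram_pos, offset))

# The external tracker sees the marker at (1.00, 0.50, 2.00); this HoloLens
# believes the same marker is at (1.03, 0.48, 1.97) -> roughly 3 cm of drift.
offset = registration_offset((1.00, 0.50, 2.00), (1.03, 0.48, 1.97))
corrected = correct_hologram((0.0, 1.0, 1.5), offset)
```

In a multi-display setup, each HoloLens would apply its own offset so that all three clients render the shared hologram at a consistent physical location; a full implementation would correct rotation as well as translation.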
Keywords: Shared augmented reality · Sensor fusion
This work was supported in part by the DOD High Performance Computing Modernization Program at The Army Research Laboratory (ARL), Department of Defense Supercomputing Resource Center (DSRC).