AR Cultural Heritage Reconstruction Based on Feature Landmark Database Constructed by Using Omnidirectional Range Sensor
This paper describes an application of augmented reality (AR) techniques to virtual cultural heritage reconstruction on the real sites of defunct constructs. AR-based cultural heritage reconstruction requires extrinsic camera parameter estimation for geometric registration of the real and virtual worlds. To estimate extrinsic camera parameters, we use a pre-constructed feature landmark database of the target environment. Conventionally, such a database has been constructed in a large-scale environment by applying a structure-from-motion technique to omnidirectional image sequences. However, the accuracy of the estimated camera parameters is insufficient for applications such as AR-based cultural heritage reconstruction, which must overlay CG objects at positions close to the user's viewpoint. This is due to the difficulty of compensating for the appearance change of close landmarks using only the sparse 3-D information obtained by structure-from-motion. In this paper, the visual patterns of landmarks are compensated for by considering local shapes obtained by an omnidirectional range finder, so that corresponding landmarks close to the user can be found. By using these landmarks with local shapes, accurate geometric registration is achieved for AR sightseeing at historic sites.
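The geometric registration described above hinges on extrinsic camera parameters (rotation R, translation t) that minimize the reprojection error between the 3-D landmarks in the database and their detected 2-D positions in the user's camera image. The following is a minimal sketch of that reprojection-error computation under a standard pinhole camera model; the function names, the intrinsic matrix K, and the example landmark coordinates are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def project_landmarks(K, R, t, landmarks_3d):
    """Project 3-D landmark positions (N x 3, world frame) into the
    image using extrinsics (R, t) and the intrinsic matrix K."""
    cam = R @ landmarks_3d.T + t.reshape(3, 1)  # world -> camera frame
    uv = K @ cam                                # camera -> image plane
    return (uv[:2] / uv[2]).T                   # perspective division

def reprojection_error(K, R, t, landmarks_3d, observed_2d):
    """Mean pixel distance between projected landmarks and their
    detected image positions; small values indicate a good pose."""
    proj = project_landmarks(K, R, t, landmarks_3d)
    return float(np.mean(np.linalg.norm(proj - observed_2d, axis=1)))

# Hypothetical example: identity pose, two landmarks in front of the camera.
K = np.array([[800., 0., 320.],
              [0., 800., 240.],
              [0., 0., 1.]])
R = np.eye(3)
t = np.zeros(3)
pts = np.array([[0., 0., 4.], [0.5, -0.2, 5.]])
obs = project_landmarks(K, R, t, pts)  # perfect observations
print(reprojection_error(K, R, t, pts, obs))  # 0.0 for a perfect pose
```

In practice the pose (R, t) would be estimated by minimizing this error over 2-D/3-D landmark correspondences (e.g., with a PnP solver), which is where the appearance compensation of close landmarks matters: without it, correspondences near the user's viewpoint are easily mismatched.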
Keywords: Cultural Heritage; Augmented Reality; Camera Parameter; Target Environment; Image Template