Image-Based Localization for Facilitating Construction Field Reporting on Mobile Devices

  • Youyi Feng
  • Mani Golparvar-Fard
Conference paper


Current studies reveal the exceptional advantages of high-efficiency on-site information management for facilitating the design and construction progress of building projects. Nevertheless, the prevailing methods of construction field reporting still rely primarily on manual on-site documentation of project information. Fortunately, state-of-the-art computing technologies offer solutions with great potential for improving the efficiency of gathering and managing on-site information for field reporting. Providing on-demand access to such information in real time requires an autonomous method for localizing and tracking (i.e., calculating the position and orientation of) a construction field reporter on the job site. This, in turn, reduces both the time and effort needed to provide on-demand access to project information. Mobile devices such as smartphones and tablets enable on-site personnel to manage project information in a portable fashion, while cloud technology provides instant online access. In this paper, we propose a method for on-site localization that estimates and tracks the position and orientation of a hand-held device in near real time. The method is infrastructure-independent and marker-less, and consists of mapping, localization, and alignment modules. First, a video stream is acquired with the built-in camera of the mobile device while scanning the target building, and a 3D point cloud is reconstructed from the acquired video data. The localization algorithm then estimates the location and orientation of queried images through feature-based matching against this base 3D point-cloud map. Finally, the global pose of each frame is obtained by applying the alignment parameters between the 3D point cloud and a geo-referenced BIM model, transforming the localized frames into the global reference frame.
The proposed solution enables the field reporter to access, retrieve, save, and edit project information more efficiently on the construction site.
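The final alignment step described above can be illustrated with a small sketch. Assuming the point-cloud-to-BIM alignment is expressed as a similarity transform (scale `s`, rotation `R_align`, translation `t_align`) — a common parameterization, though the paper does not state its exact form — a camera pose localized in the point-cloud frame can be mapped into the geo-referenced BIM frame as follows. The function name and arguments are illustrative, not from the paper:

```python
import numpy as np

def to_global_pose(R_local, t_local, s, R_align, t_align):
    """Map a camera pose localized in the point-cloud frame into the
    geo-referenced BIM frame, given a similarity transform
    X_bim = s * R_align @ X_pc + t_align between the two frames.

    R_local, t_local: rotation (3x3) and translation (3,) of the camera
                      in the point-cloud frame (world-to-camera convention).
    Returns the camera orientation and camera center in the BIM frame.
    """
    # Camera center in the local (point-cloud) frame: C = -R^T t
    C_local = -R_local.T @ t_local
    # Apply the point-cloud -> BIM similarity transform to the camera center
    C_global = s * (R_align @ C_local) + t_align
    # Orientations compose; uniform scale does not affect orientation
    R_global = R_local @ R_align.T
    return R_global, C_global
```

With an identity local pose and an alignment that only translates, the global camera center is simply the alignment translation, which is a quick sanity check on the composition.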


Keywords: Field reporting · Localization · SLAM · Point cloud · BIM



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Department of Civil and Environmental Engineering, University of Illinois at Urbana-Champaign, Urbana, USA
