
Mobile System for Determining Geographical Coordinates for Needs of Air Support in Cases of GPS Signals Loss

  • Karol Jędrasiak
  • Aleksander Nawrat
  • Przemysław Recha
  • Dawid Sobel
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 1129)

Abstract

The article presents the concept of a mobile system for determining the geographical coordinates of an observed point using a network of tethered unmanned aerial vehicles, referred to as virtual masts. The concept was developed with the possible loss of GPS signal in mind and the resulting need to determine the geographical position of observed points efficiently, e.g. for emergency purposes. The publication reports the current status of work on the system: a unique tethered unmanned aerial platform, called a virtual mast, has been implemented that is capable of controlled hover of unlimited duration at an altitude of 100 m and of observing its surroundings with high-quality stabilized cameras. The second part of the publication presents a concept for determining the location of objects of interest using computer vision algorithms. The concept was tested in a simulation environment, and the promising results obtained will form the basis for further work on the mobile virtual mast system for the needs of the Border Guard.
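The geolocation procedure itself is not reproduced in this abstract, but the general idea behind locating an observed ground point from a hovering platform can be sketched as follows. The snippet below is a minimal illustration under stated assumptions, not the authors' algorithm: it assumes the virtual mast's own geographic position and height above ground are known (e.g. from the tether anchor point), that the camera gimbal reports the azimuth and depression angle of the line of sight to the target, and that the terrain around the target is flat; the function name locate_ground_point and all parameter names are hypothetical.

import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, metres

def locate_ground_point(mast_lat_deg, mast_lon_deg, mast_height_m,
                        azimuth_deg, depression_deg):
    """Project the camera line of sight from a hovering platform onto flat
    ground and return the geographic coordinates of the intersection.

    azimuth_deg    -- bearing of the line of sight, clockwise from true north
    depression_deg -- angle of the line of sight below the horizontal
    """
    if depression_deg <= 0:
        raise ValueError("line of sight must point below the horizon")

    # Horizontal distance from the point directly beneath the platform
    # to the observed ground point.
    ground_range_m = mast_height_m / math.tan(math.radians(depression_deg))

    # Split the ground range into north and east offsets.
    north_m = ground_range_m * math.cos(math.radians(azimuth_deg))
    east_m = ground_range_m * math.sin(math.radians(azimuth_deg))

    # Convert metric offsets to degrees (small-offset, spherical-Earth approximation).
    dlat_deg = math.degrees(north_m / EARTH_RADIUS_M)
    dlon_deg = math.degrees(east_m / (EARTH_RADIUS_M * math.cos(math.radians(mast_lat_deg))))

    return mast_lat_deg + dlat_deg, mast_lon_deg + dlon_deg

if __name__ == "__main__":
    # Platform hovering 100 m above ground, camera looking 15 degrees below
    # the horizon on a bearing of 45 degrees (north-east).
    lat, lon = locate_ground_point(50.0, 19.0, 100.0,
                                   azimuth_deg=45.0, depression_deg=15.0)
    print(f"estimated target position: {lat:.6f}, {lon:.6f}")

The small-offset, spherical-Earth conversion is adequate only for the short ranges reachable from a 100 m mast; a fielded system would typically work in a projected coordinate system and account for terrain elevation.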

Keywords

Virtual mast · Unmanned aerial vehicle · Positioning


Acknowledgment

This work has been supported by the National Centre for Research and Development under project ID DOBBIO9/24/02/2018, “The Virtual Interactive Center for Improving the Professional Competences of Border Guard Officers”.


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Karol Jędrasiak (1)
  • Aleksander Nawrat (2)
  • Przemysław Recha (2)
  • Dawid Sobel (2)

  1. The University of Dabrowa Gornicza, Dąbrowa Górnicza, Poland
  2. Silesian University of Technology, Institute of Automatic Control, Gliwice, Poland
