
Vision Ship Information Overlay and Navigation “VISION” System

  • Conference paper
  • In: Advances in Human Factors in Robots and Unmanned Systems (AHFE 2019)

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 962)


Abstract

Modern naval vessels, marvels of systems engineering, combine a myriad of complex systems into a single, sophisticated machine. Personnel responsible for the safe operation of these ships must parse, filter, and process a large array of information in order to make key decisions. Safe navigation therefore requires processing and interpreting significant quantities of data and ensuring that this information is made available in an easily consumable form.

Augmented reality provides a means to address this problem: customizable interfaces overlay information on the operator's view of the world, helping crews make intelligent decisions in dangerous, congested situations. The team's Optimal Trajectory and other path-planning algorithms can refine these navigational aids, accounting for changes in weather, current, and tides.

Our team successfully demonstrated an augmented reality solution that provides additional situational awareness during transit to sea and into port.
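
The abstract above describes overlaying geo-referenced navigational information on the operator's view of the world. As a rough illustration of the kind of computation such an overlay implies, the Python sketch below projects a hypothetical waypoint into the horizontal field of view of a head-mounted display, given own ship's position and heading. It is not drawn from the paper: the flat-earth approximation, the 35-degree field of view, the 1268-pixel display width, and the function name project_waypoint are all assumptions made for illustration.

    # Illustrative sketch only (not the authors' implementation): project a
    # geo-referenced waypoint into the horizontal field of view of a
    # head-mounted display, given own ship's position and heading.
    import math

    def project_waypoint(ship_lat, ship_lon, ship_heading_deg,
                         wp_lat, wp_lon, fov_deg=35.0, screen_width_px=1268):
        """Return the horizontal pixel position of the waypoint in the overlay,
        or None if it falls outside the assumed horizontal field of view."""
        # Local flat-earth approximation: metres east/north of own ship.
        d_north = (wp_lat - ship_lat) * 111_320.0
        d_east = (wp_lon - ship_lon) * 111_320.0 * math.cos(math.radians(ship_lat))

        # Bearing to the waypoint, then bearing relative to the ship's heading,
        # wrapped into the range [-180, 180) degrees.
        bearing = math.degrees(math.atan2(d_east, d_north)) % 360.0
        relative = (bearing - ship_heading_deg + 540.0) % 360.0 - 180.0

        if abs(relative) > fov_deg / 2.0:
            return None  # outside the display's horizontal field of view

        # Map the relative bearing linearly onto the screen width.
        return (relative / fov_deg + 0.5) * screen_width_px

    # Example: a buoy roughly 8 degrees off the starboard bow lands right of centre.
    print(project_waypoint(47.600, -122.340, 0.0, 47.605, -122.339))

A fielded system would replace this with the full 3-D pose and projection pipeline of the display hardware; the sketch only shows the shape of the computation, from sensed position and attitude to screen-space overlay coordinates.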

Author information

Corresponding author

Correspondence to Jessica Reichers.

Copyright information

© 2020 This is a U.S. government work and not under copyright protection in the U.S.; foreign copyright protection may apply

About this paper

Cite this paper

Reichers, J., Brannon, N., Rubini, J., Hillis, N., Estabridis, K., Hewer, G. (2020). Vision Ship Information Overlay and Navigation “VISION” System. In: Chen, J. (eds) Advances in Human Factors in Robots and Unmanned Systems. AHFE 2019. Advances in Intelligent Systems and Computing, vol 962. Springer, Cham. https://doi.org/10.1007/978-3-030-20467-9_1
