Technology Issues

Abstract

This chapter discusses some of the key technical issues in building an augmented reality device. The number of sensors, the technology associated with them, and the need to operate at low power in a lightweight package pose an ongoing challenge for device builders.

Augmented reality systems (headsets, helmets, HUDs, etc.) must interact with our eyes and brain, and the eye-brain connection is a remarkably powerful, complex, and capable system.

Knowing where you are is one of the most critical functions in augmented reality. How can a system identify things and deliver potentially mission-critical information in a timely manner if it doesn’t know where you are? But how does it know where you are?
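One common answer is sensor fusion: combine a fast-but-drifting gyroscope with a noisy-but-absolute accelerometer to estimate head pose. The sketch below is a deliberately minimal 1-D complementary filter with made-up readings; it is an illustration of the general idea, not the method any particular AR device uses (real trackers fuse many more sensors in full 3-D, typically with Kalman-style filters).

```python
# Complementary filter: a minimal sketch of how a headset can estimate
# head pitch by fusing two imperfect sensors.
# (Illustrative only; real AR systems use full 3-D filters such as an EKF.)

def complementary_filter(gyro_rates, accel_angles, dt=0.01, alpha=0.98):
    """Fuse gyro rates (deg/s, drift over time) with accelerometer
    angles (deg, noisy but absolute). Returns the angle estimates."""
    angle = accel_angles[0]  # start from the absolute reading
    estimates = []
    for rate, accel_angle in zip(gyro_rates, accel_angles):
        # Trust the integrated gyro over short intervals (alpha),
        # and let the accelerometer correct drift long-term (1 - alpha).
        angle = alpha * (angle + rate * dt) + (1 - alpha) * accel_angle
        estimates.append(angle)
    return estimates

# Head held still at 10 degrees: gyro reports ~0 deg/s, accel ~10 deg.
est = complementary_filter([0.0] * 100, [10.0] * 100)
```

With a motionless head the estimate stays pinned at the accelerometer's absolute angle, which is exactly the drift-correction behavior the fusion buys you.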

Voice control, like augmented and virtual reality, is a term that has been in our vocabulary for so long that many people think they know what it is and how it works.

The use of hand gestures to communicate with and control the information provided by augmented reality systems offers an attractive alternative to cumbersome interface devices for human-computer interaction (HCI); hand gestures can help achieve ease and naturalness in the interface.

Eye tracking is the process of measuring either the point of gaze (where one is looking) or the motion of an eye relative to the head. It is an old concept, developed in the 1800s using direct observation.
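Modern video-based trackers commonly use the pupil-center/corneal-reflection technique, which at its simplest reduces to calibrating a mapping from the measured pupil-to-glint offset to a point of gaze on the display. The toy 1-D sketch below assumes a linear mapping and uses entirely hypothetical calibration numbers; commercial trackers fit 2-D polynomial maps from multi-point calibrations.

```python
# Toy sketch of gaze calibration: fit a linear map from the pupil-to-
# corneal-reflection offset (pixels in the eye camera) to a horizontal
# point of gaze on the display. (Hypothetical data; real trackers use
# 2-D polynomial mappings and many calibration targets.)

def fit_linear(offsets, screen_xs):
    """Least-squares fit of screen_x = a * offset + b."""
    n = len(offsets)
    mean_o = sum(offsets) / n
    mean_s = sum(screen_xs) / n
    cov = sum((o - mean_o) * (s - mean_s) for o, s in zip(offsets, screen_xs))
    var = sum((o - mean_o) ** 2 for o in offsets)
    a = cov / var
    b = mean_s - a * mean_o
    return a, b

def gaze_x(offset, a, b):
    """Map a measured eye-camera offset to a screen x-coordinate."""
    return a * offset + b

# Calibration: user fixates three known screen positions (made-up values).
a, b = fit_linear([-5.0, 0.0, 5.0], [0.0, 640.0, 1280.0])
x = gaze_x(2.5, a, b)  # a later measured offset, mapped to the screen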

If ever the statement “one size does not fit all” were appropriate, it would be in the case of a user interface.


Notes

  1. The vestibulo-ocular reflex (VOR) is a reflex in which activation of the vestibular system causes eye movement. This reflex stabilizes images on the retinas during head movement by producing eye movements in the direction opposite to head movement, thus preserving the image at the center of the visual field.

  2. Chromatic aberration, also known as “color fringing” or “purple fringing”, is a common optical problem that occurs when a lens is unable to bring all wavelengths of color to the same focal plane, or when wavelengths of color are focused at different positions in the focal plane.

  3. FLCOS was contemplated for holography due to its high speed, but never implemented because the pixels were too big. To display a holographic interference pattern, you need large arrays of very small pixels.

  4. Nit (nt) is a unit of luminance; it is a non-SI name for the candela per square meter (1 nt = 1 cd/m²). The candela per square meter (cd/m²) is the derived SI unit of luminance, based on the candela, the SI unit of luminous intensity, and the square meter, the SI unit of area.

  5. In optics, the exit pupil is a virtual aperture in an optical system. Only rays which pass through this virtual aperture can exit the system.
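The compensation described in note 1 matters to display designers because the same idea underlies latency-sensitive image stabilization: the eye counter-rotates against the head so that gaze in world coordinates stays fixed. A minimal sketch with made-up angles (the real reflex has gain and phase dynamics that this deliberately ignores):

```python
# Minimal sketch of VOR-style compensation: the eye rotates opposite
# to the head, so the gaze direction in world coordinates stays fixed.
# (Illustrative only; the physiological reflex has gain/phase dynamics.)

def vor_eye_rotation(head_rotation, gain=1.0):
    """Eye-in-head rotation (deg) that cancels a head rotation (deg)."""
    return -gain * head_rotation

def gaze_in_world(head_rotation, eye_rotation):
    """World gaze direction = head rotation + eye-in-head rotation."""
    return head_rotation + eye_rotation

# Head turns 15 degrees to the right; the eye counter-rotates;
# the gaze direction in the world remains at 0 degrees.
head = 15.0
eye = vor_eye_rotation(head)
gaze = gaze_in_world(head, eye)
```

With unity gain the cancellation is exact; a gain below 1.0 models an imperfect reflex where some retinal image slip remains.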



Copyright information

© 2017 Springer International Publishing AG

About this chapter

Cite this chapter

Peddie, J. (2017). Technology Issues. In: Augmented Reality. Springer, Cham. https://doi.org/10.1007/978-3-319-54502-8_8

  • DOI: https://doi.org/10.1007/978-3-319-54502-8_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-54501-1

  • Online ISBN: 978-3-319-54502-8

  • eBook Packages: Computer Science, Computer Science (R0)
