Abstract
This section discusses some of the key technical issues in building an augmented reality device. The number of sensors, and the technologies associated with them, combined with the need to operate at low power in a lightweight package, poses an ongoing challenge for device builders.
Augmented reality systems (headsets, helmets, HUDs, etc.) must interact with our eyes and brain, and the eye-brain connection is a powerful, remarkably complex, and capable system.
Knowing where you are is one of the most critical functions in augmented reality. If a system does not know where you are, how can it identify things and deliver potentially mission-critical information in a timely manner? And how does the system determine where you are?
Voice control, like augmented and virtual reality, is a term that has been in our vocabulary for so long that many people think they know what it is and how it works.
The use of hand gestures to communicate with, and control the information provided by, augmented reality systems offers an attractive alternative to cumbersome interface devices for human-computer interaction (HCI); hand gestures can help achieve ease and naturalness of interaction.
Eye tracking is the process of measuring either the point of gaze (where one is looking) or the motion of an eye relative to the head. It is an old concept: the earliest eye-tracking studies, in the 1800s, were made using direct observation.
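As a rough, hypothetical sketch of how the first kind of measurement can become a gaze point (not a method described in this chapter): video-based trackers commonly map a pupil-to-corneal-reflection vector to screen coordinates through a per-user calibration. The function names, the affine model, and the three-point calibration below are illustrative assumptions; real trackers use more calibration points and richer (often polynomial) fits.

```python
# Sketch: calibrate an affine map (ex, ey) -> (sx, sy) from three
# eye-vector/screen-point pairs, then apply it to new measurements.

def _solve3(a, b):
    """Solve a 3x3 linear system a @ x = b by Cramer's rule."""
    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det(a)
    x = []
    for j in range(3):
        m = [[a[i][k] if k != j else b[i] for k in range(3)] for i in range(3)]
        x.append(det(m) / d)
    return x

def calibrate(eye_vectors, screen_points):
    """Fit the affine map exactly from three calibration pairs."""
    A = [[ex, ey, 1.0] for ex, ey in eye_vectors]
    cx = _solve3(A, [p[0] for p in screen_points])  # coefficients for sx
    cy = _solve3(A, [p[1] for p in screen_points])  # coefficients for sy
    return cx, cy

def gaze_point(calib, eye_vector):
    """Map a new pupil-glint vector to an on-screen gaze point."""
    (cx, cy), (ex, ey) = calib, eye_vector
    return (cx[0] * ex + cx[1] * ey + cx[2],
            cy[0] * ex + cy[1] * ey + cy[2])
```

In practice the calibration is overdetermined (nine or more fixation targets) and solved by least squares, but the exact three-point version above keeps the idea visible without any numerical library.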
If ever the statement, “one size does not fit all,” was appropriate, it would be in the case of a user interface.
Notes
- 1.
The vestibulo-ocular reflex (VOR) is a reflex in which activation of the vestibular system causes eye movement. It functions to stabilize images on the retinas during head movement by producing eye movements in the direction opposite to the head movement, thus keeping the image at the center of the visual field.
- 2.
Chromatic aberration, also known as “color fringing” or “purple fringing,” is a common optical problem that occurs when a lens is unable to bring all wavelengths of light to the same focal plane, or when different wavelengths are focused at different positions within the focal plane.
- 3.
FLCOS was contemplated for holography due to its high speed, but never implemented because the pixels were too big. To display a holographic interference pattern, you need large arrays of very small pixels.
- 4.
Nit (nt) is a unit of luminance; it is a non-SI name for the candela per square meter (1 nt = 1 cd/m²). The candela per square meter (cd/m²) is the derived SI unit of luminance, based on the candela, the SI unit of luminous intensity, and the square meter, the SI unit of area.
- 5.
In optics, the exit pupil is a virtual aperture in an optical system. Only rays which pass through this virtual aperture can exit the system.
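The compensatory behavior described in note 1 can be summarized in a tiny illustrative model (an idealization, not from the chapter; real VOR gain is close to, but not exactly, 1):

```python
# Sketch of the vestibulo-ocular reflex: the eyes counter-rotate against
# head motion so the gaze direction stays fixed in the world.

def vor_eye_velocity(head_velocity_deg_s, gain=1.0):
    """Compensatory eye velocity: opposite to head velocity, scaled by gain."""
    return -gain * head_velocity_deg_s

def gaze_direction(head_angle_deg, eye_angle_deg):
    """Gaze in world coordinates: head orientation plus eye-in-head angle."""
    return head_angle_deg + eye_angle_deg

# With ideal unity gain, a 15 deg/s head turn drives a -15 deg/s eye movement,
# and the world-referenced gaze direction does not change.
```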
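The unit relationship in note 4 can be made concrete with a small helper (a sketch; the division follows directly from the definitions quoted in the note, assuming a flat, uniformly emitting surface viewed head-on):

```python
def luminance_nits(intensity_cd, area_m2):
    """Luminance in nits (cd/m^2) from luminous intensity and emitting area."""
    return intensity_cd / area_m2

# Since 1 nt = 1 cd/m^2, a surface emitting 250 cd over 0.5 m^2 is 500 nits.
```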
Copyright information
© 2017 Springer International Publishing AG
Cite this chapter
Peddie, J. (2017). Technology Issues. In: Augmented Reality. Springer, Cham. https://doi.org/10.1007/978-3-319-54502-8_8
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-54501-1
Online ISBN: 978-3-319-54502-8