
EHealth Applications for Those in Need: Making Novel Interaction Technologies Accessible

  • Martin Böcker
  • Matthias Schneider
Chapter
Part of the Human–Computer Interaction Series (HCIS) book series

Abstract

Barrier-free usability is a precondition for eHealth applications for all. While novel and innovative user-interface technologies have the potential to increase the usability of a device, they may also create barriers for users with physical impairments. Such barriers are often removed through later updates and workarounds, but in other cases they are never addressed. ETSI, the European Telecommunications Standards Institute, has published an ETSI Guide (EG) [1] that anticipates new interaction technologies [2] in the form of technology roadmaps, pointing out potential barriers that individual technologies may create for specific user groups. The document also lists measures that can be taken to overcome these barriers, so that the technologies can be introduced to the market in barrier-free versions from their first implementation onwards. In this chapter, ETSI's approach to identifying relevant technology areas and the resulting technology roadmaps are presented and illustrated with examples and scenarios.

Keywords

Interaction modality, Smart home, Service component, Interaction technology, Blind user

Notes

Acknowledgments

The work reported here was carried out by a team of experts. We thank the members of this team for their effort and valuable contributions: Michael Pluke, Erik Zetterström, Helge Hüttenrauch, and Alejandro Rodriguez-Ascaso. Funding for this work was provided by the European Commission.

References

  1. ETSI EG 202 848 V1.1.1 (2011-02). Human factors; Inclusive eServices for all: Optimizing the accessibility and the use of upcoming user-interaction technologies.
  2. Kortum, P. (Ed.). (2008). HCI beyond the GUI: Design for haptic, speech, olfactory, and other nontraditional interfaces. Burlington, MA: Morgan Kaufmann Publishers.
  3. ETSI EG 202 116 (2002). Human factors (HF); Guidelines for ICT products and services; Design for all.
  4. Henderson, R. M., & Clark, K. B. (1990). Architectural innovation: The reconfiguration of existing product technologies and the failure of established firms. Administrative Science Quarterly, 35(1), 9–30.
  5. Rogers, E. M. (1962). Diffusion of innovations. Florence, MA: Free Press.
  6.
  7. ETSI TR 102 849 V1.1.1 (2010-11). Human factors (HF); Inclusive eServices for all; Background analysis of future interaction technologies and supporting information.
  8. ISO 9241-20: Ergonomics of human-system interaction. Accessibility guidelines for information/communication technology (ICT) equipment and services.
  9. The Center for Universal Design, NC State University. Retrieved December 15, 2013, from http://www.ncsu.edu/ncsu/design/cud/about_ud/udprinciplestext.htm
  10. ETSI TR 102 068: Human factors (HF); Requirements for assistive technology devices in ICT.
  11. ETSI EG 202 417: Human factors (HF); User education guidelines for mobile terminals and services.
  12. Ahmaniemi, T. T., & Lantz, V. T. (2009). Augmented reality target finding based on tactile cues. Proceedings of the International Conference on Multimodal Interfaces (pp. 335–342).
  13. Belt, S., Greenblatt, D., Häkkilä, J., & Mäkelä, K. (2006). User perceptions on mobile interaction with visual and RFID tags. In E. Rukzio, M. Paolucci, T. Finin, P. Wisner, & T. Payne (Eds.), Proceedings of the 8th Conference on Human–Computer Interaction with Mobile Devices and Services—MobileHCI ’06 (p. 295). New York: ACM Press. doi: 10.1145/1152215.1152296
  14. Bolzmacher, C., Hafez, M., Khoudjaa, M., Bernardonia, P., & Dubowsky, S. (2004). Polymer based actuators for virtual reality devices. Proceedings of SPIE, 5385, 281–289.
  15. Boverie, S. (2004). Driver fatigue monitoring technologies and future ideas. AWAKE Road Safety Workshop, Balocco, Italy.
  16. Bravo, J., Hervas, R., Chavira, G., Nava, S. W., & Villarreal, V. (2008). From implicit to touching interaction: RFID and NFC approaches. 2008 Conference on Human System Interactions (pp. 743–748), Krakow, Poland. IEEE. doi: 10.1109/HSI.2008.4581534
  17. Brugnoli, M. C., Rowland, D., Morabito, F., Davide, F., & Doughty, M. (2006). Gaming and social interaction in mediated environments: The PASION project. eChallenges e2006, Barcelona, Spain.
  18. Callaghan, M. J., Gormley, P., McBride, M., Harkin, J., & McGinnity, T. M. (2006). Internal location based services using wireless sensor networks and RFID technology. Journal of Computer Science, 6(4), 108–113.
  19. Campbell, A. T., Eisenman, S. B., Fodor, K., Lane, N. D., Lu, H., Miluzzo, E., et al. (2008). Transforming the social networking experience with sensing presence from mobile phones. Proceedings of the 6th ACM Conference on Embedded Network Sensor Systems—SenSys ’08 (p. 367). New York: ACM Press. doi: 10.1145/1460412.1460455
  20. Ferris, D. P. (2009). The exoskeletons are here. Journal of NeuroEngineering and Rehabilitation, 6(17). Retrieved from http://www.jneuroengrehab.com/content/6/1/17
  21. Furmanski, C., Azuma, R., & Daily, M. (2002). Augmented-reality visualizations guided by cognition: Perceptual heuristics for combining visible and obscured information. Proceedings of the International Symposium on Mixed and Augmented Reality (ISMAR ’02) (pp. 215–224).
  22. Gabbard, J. L., Swan, J. E., Hix, D., Si-Jung, K., & Fitch, G. (2007). Active text drawing styles for outdoor augmented reality: A user-based study and design implications. Proceedings of the Virtual Reality Conference (pp. 35–42).
  23. Haans, A., IJsselsteijn, W. A., & de Kort, Y. A. W. (2008). The effect of similarities in skin texture and hand shape on perceived ownership of a fake limb. Body Image, 5(4), 389–394. doi: 10.1016/j.bodyim.2008.04.003
  24. Hage, J. (2011). Restoring the innovative edge: Driving the evolution of science and technology. Stanford, CA: Stanford Business Books.
  25. Herr, H. (2009). Exoskeletons and orthoses: Classification, design challenges and future directions. Journal of NeuroEngineering and Rehabilitation, 6(21). Retrieved January 19, 2014, from http://www.jneuroengrehab.com/content/6/1/21
  26. Hightower, J., Vakili, C., Borriello, C., & Want, R. (2001). Design and calibration of the SpotON ad-hoc location sensing system. Seattle: Department of Computer Science and Engineering, University of Washington.
  27. Hong, Z. T., & Pentland, A. (2001). Tactual displays for sensory substitution and wearable computers. In W. Barfield & T. Caudell (Eds.), Fundamentals of wearable computers and augmented reality (pp. 579–598). Mahwah, NJ: Lawrence Erlbaum Associates.
  28. Hoshi, T., Iwamoto, T., & Shinoda, H. (2009). Non-contact tactile sensation synthesized by ultrasound transducers. Proceedings of the Third Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (pp. 256–260).
  29. ISO/IEC TR 24714-1. Information technology—Biometrics—Jurisdictional and societal considerations for commercial applications. Part 1: General guidance (E).
  30. Jaynes, C., Webb, S., & Steele, R. M. (2004). Camera-based detection and removal of shadows from interactive multiprojector displays. IEEE Transactions on Visualization and Computer Graphics, 10(3), 290–301.
  31. Jones, L. A., & Sarter, N. B. (2008). Tactile displays: Guidance for their design and application. Human Factors: The Journal of the Human Factors and Ergonomics Society, 50(1), 90–111.
  32. Konomi, S. (2004). Personal privacy assistants for RFID users. International Workshop Series on RFID (pp. 1–6).
  33. Kooper, R., & MacIntyre, B. (2003). Browsing the real-world wide web: Maintaining awareness of virtual information in an AR information space. International Journal of Human-Computer Interaction, 16(3), 425–446.
  34. Kraft, C. (2012). User experience innovation: User-centered design that works. New York: Apress.
  35. Langheinrich, M. (2005). Personal privacy in ubiquitous computing: Tools and system support. PhD Thesis No. 16100, ETH Zurich, Zurich, Switzerland, May 2005.
  36. Laycock, S. D., & Day, A. M. (2003). Recent developments and applications of haptic devices. Computer Graphics Forum, 22(2), 117–132.
  37. Liu, Y. C., & Wen, H. C. (2004). Comparison of head-up display (HUD) versus head-down display (HDD): Driving performance of commercial vehicle operators in Taiwan. International Journal of Human-Computer Studies, 61, 679–697.
  38. Milgram, P., Takemura, H., Utsumi, A., & Kishino, F. (1994). Augmented reality: A class of displays on the reality-virtuality continuum. Proceedings of SPIE, 2351, 282–292.
  39. Orr, R., & Abowd, G. (2000). The smart floor: A mechanism for natural user identification and tracking. In G. Szwillus & T. Turner (Eds.), CHI 2000 extended abstracts on human factors in computing systems (pp. 275–276). The Hague, Netherlands: ACM Press.
  40. Parviz, B. A. (2009). Augmented reality in a contact lens. IEEE Spectrum. Retrieved from http://spectrum.ieee.org/biomedical/bionics/augmented-reality-in-a-contact-lens/0
  41. Rantala, J., Raisamo, R., Lylykangas, J., Surakka, V., Raisamo, J., Salminen, K., et al. (2009). Methods for presenting braille characters on a mobile device with a touchscreen and tactile feedback. IEEE Transactions on Haptics, 2(1), 28–39.
  42. Ruffini, G., Dunne, S., Farrés, E., Cester, I., Watts, P. C. P., Silva, S. R. P., et al. (2007). ENOBIO dry electrophysiology electrode: First human trial plus wireless electrode system. Proceedings of the 29th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Lyon, France (pp. 6690–6694). IEEE.
  43. Saffer, D. (2009). Designing for interaction: Creating innovative applications and devices (Voices That Matter). Berkeley, CA: New Riders.
  44. Stephanidis, C. (2007). Universal access in human-computer interaction: Ambient interaction. 4th International Conference on Universal Access in Human-Computer Interaction, UAHCI 2007. Heidelberg: Springer.
  45. Stephanidis, C. (2009). The universal access handbook (Human factors and ergonomics). Boca Raton, FL: CRC Press.
  46. Sun, W., Sobel, I., Culbertson, B., Gelb, D., & Robinson, I. (2008). Calibrating multi-projector cylindrically curved displays for wallpaper projection. Proceedings of the 5th ACM/IEEE International Workshop on Projector Camera Systems (pp. 1–8).
  47. Tilton, C. (2002). Biometric standards—An overview. Information Security Technical Report, 7(4), 36–48. doi: 10.1016/S1363-4127(02)00405-3
  48. Vertegaal, R., & Poupyrev, I. (2008). Organic user interfaces: Introduction. Communications of the ACM, 51(6), 26–30.
  49. Want, R., Fishkin, K. P., Gujar, A., & Harrison, B. L. (1999). Bridging physical and virtual worlds with electronic tags. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: The CHI Is the Limit—CHI ’99 (pp. 370–377). New York: ACM Press. doi: 10.1145/302979.303111
  50. Wunschmann, W., & Fourney, D. (2005). Guidance on tactile human-system interaction: Some statements. Proceedings of Guidelines on Tactile and Haptic Interactions (GOTHI ’05) (pp. 6–9).
  51. Xueyan, L., & Shuxu, G. (2008). The fourth biometric—vein recognition. In P.-Y. Yin (Ed.), Pattern recognition techniques, technology and applications (pp. 537–546). Retrieved January 19, 2014, from http://sciyo.com/articles/show/title/the_fourth_biometric_-_vein_recognition
  52. Yousefi, A., Jalili, R., & Niamanesh, M. (2006). Multi-determiner protection of private data in pervasive computing environments. IJCSNS International Journal of Computer Science and Network Security, 6(12), 239–248.
  53. Zhou, Z., Cheok, A. D., Yang, X., & Qiu, Y. (2004). An experimental study on the role of 3D sound in augmented reality environment. Interacting with Computers, 16(6), 1043–1068.

Copyright information

© Springer-Verlag London 2014

Authors and Affiliations

  1. Böcker und Schneider GbR, Konstanz, Germany
  2. Böcker und Schneider GbR, München, Germany
