Abstract
This research addresses the lack of a comprehensive method to help developers gather and evaluate requirements for the design of technological solutions for visually impaired people. Drawing on interviews with experts and with visually impaired people, this paper focuses on the use of the “Technology Enhanced Accessible Interaction Framework Method”.
1 Introduction
The Technology Enhanced Interaction Framework (TEIF) Method was developed to technologically enhance accessible interactions between people, technology, and objects, particularly in face-to-face situations involving people with disabilities. It was successfully validated by three developer experts, three accessibility experts, and an HCI professor for use with hearing impaired people [1]. The TEIF Method supports other design methods by providing multiple-choice questions to help identify requirements, the answers to which help provide technological suggestions that support the design stage. This paper describes how the TEIF Method has been extended for use with visually impaired people. Ten experts, 20 visually impaired students, and 10 visually impaired adults were interviewed in order to create scenarios, investigate the problems of visual impairment, and identify subsequent technological solutions.
2 Literature Review of Visually Impaired People’s Requirements
The Individuals with Disabilities Education Act (IDEA) defines the term “visual impairment” as impairment in vision that, even with correction, adversely affects a child’s educational performance. The term includes both partial sight and blindness [2]. Nearly 11% of Thailand’s registered disabled population in 1996 had a visual impairment, and National Statistics Office data from 2007 estimate that nearly two million women and men in Thailand, approximately 3% of the population (2,209,000 people), had a registered visual impairment disability [3].
Reducing discrimination in access requires accessible technology solutions, accessible environments, accessible documents, and accessibility awareness.
2.1 How Can Blind People Get Information?
Golledge [4] analyzed the four senses involved in a navigation task:
1. Touch is tactile perception, the ability to acquire information from objects pressed against the skin using mechanoreceptors: neural receptors that detect pressure when the skin is touched, e.g. on the hands, feet, hair follicles, tongue, and body.
2. Sight is visual perception: focusing on, interpreting, and detecting visible light reflected from objects into the eyes. It provides information such as images, colours, brightness, and contrast.
3. Audition is sound perception, the ability to detect vibrations and interpret them as noise of various frequencies in the inner ears. Hearing with both ears also enables echolocation, that is, detecting the orientation of a sound source (Milne et al. [5]; Wallmeier and Wiegrebe [6]).
4. Olfaction is odour perception, the ability to smell objects in the environment using olfactory neural receptors.
Visual impairment obstruction detection at different levels is shown in Table 1. For example, visually impaired people use a white cane to detect obstructions at ground level, and use guide dogs and/or sighted guides to avoid obstructions.
2.2 Problems and Solutions of Visual Impairment
Problems and solutions experienced by visually impaired and blind people are shown in Table 2.
4 Developing the TEIF Method for Blind People
4.1 Interviews
The research analyses the information gathered from the experts and visually impaired people to develop requirement questions and five possible scenarios with actions, interaction issues, and possible technologies.
4.2 Transforming Requirements into Questions and Multiple Choices
The TEIF Method helps developers gather and evaluate requirements by using TEIF-based multiple-choice questions. The questions help identify issues for which a technology solution is required.
In the following requirement questions, ☐ means that more than one answer can be chosen and ◯ means that only one answer can be chosen. The example requirement questions shown below include only those questions for which correct answers in the given scenario are provided.
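This question-and-answer structure can be modelled directly in code. The following is a minimal sketch of the idea, assuming a simple data model of our own devising; the class name, question text, and choices are illustrative and are not taken from the TEIF question set:

```python
from dataclasses import dataclass

@dataclass
class RequirementQuestion:
    """A TEIF-style requirement question.

    multi=True corresponds to the checkbox symbol (more than one
    answer may be chosen); multi=False corresponds to the radio
    symbol (exactly one answer may be chosen).
    """
    text: str
    choices: list
    multi: bool = True

    def validate(self, answers):
        # Every chosen answer must be one of the offered choices,
        # and single-choice questions accept exactly one answer.
        if not set(answers) <= set(self.choices):
            return False
        return self.multi or len(answers) == 1

# Hypothetical example question, for illustration only.
q = RequirementQuestion(
    text="Which senses can the user rely on?",
    choices=["touch", "audition", "olfaction"],
    multi=True,
)
print(q.validate(["touch", "audition"]))  # several answers are acceptable here
```

A developer's answers to such questions would then drive the technology suggestions that the method produces at the design stage.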
4.3 Developing Scenarios to Test the Requirement Questions and Multiple Choices
In order to ensure that the TEIF has broad applicability, five scenarios and technology solutions were considered during the development process: a blind person shops for groceries, crosses the road, finds rooms and buildings, studies at the university, and visits the Shadow Puppet Museum. The process illustrated the TEIF’s suitability in these complex situations involving visual impairment, and addressed the specific aspects of these technologically enhanced interactions.
Table 4 shows how the questions can be applied to the relationship between the multiple-choice requirement questions and answers for these five scenarios.
Scenario 4: Problems of a blind student studying at the university
Space limitations allow only one of the scenarios to be described in detail. “Golf” is the only blind student in his law faculty class. Golf normally sits at the front of the class as he wants to record the lectures. However, (1) there is a lot of noise, as teachers do not use microphones and other students talk during the class, so the sound quality of the media files he records is poor. Golf sometimes uses Braille to take notes from the lecture, but not often, because he is not very familiar with Braille. During the class, the teacher speaks Thai, as all class members are Thai. (2) When the teacher writes notes on the blackboard, Golf does not know what the teacher has written, and sometimes asks a friend to read it for him. Also, (3) when the teacher refers to material by pointing at the board, Golf does not know what the teacher is pointing at. Sometimes, (4) the teacher asks questions related to information on the board, but Golf is not able to answer, as he cannot see the board and so does not understand the question. Sometimes (5) the teacher gives students a hard-copy case study to read and analyse in class individually. Golf cannot read it, so the teacher allows him to work in a pair. Golf mentions to the teacher that if she provides a Word file or information on the web he will be able to read it, but the teacher tells him that she only has a .pdf file.
At the end of the class, (6) the teacher shows an important book that every student needs to read. Golf is not sure what the book looks like, so he asks the teacher if he can touch the book and feel its size and thickness. Despite not receiving any financial support from friends or the university, he normally incurs considerable personal expense by having friends or professionals type the books and convert them into text files. While expensive, this is necessary, as otherwise the lack of accessible materials would result in him failing the course. (7) Golf finds it particularly difficult when pictures, graphs or multimedia are required.
In this scenario Golf requires online or offline problem-solving mobile devices that he can use in class and at home. The following are potentially appropriate technology-based adaptations, an analysis of which assists the developer in choosing practical solutions.
Action 1: Golf records teacher voice in the noisy environment
Interaction issues: (P-T-P) Golf is unable to hear the recording properly because of background noise
Possible solutions:
i. (P-T-P) Teacher uses a microphone when talking to students, which can reduce the noise.
Action 2: Teacher writes/draws on board
Interaction issues: (P-T-P) Golf is unable to see/understand what is being written/drawn on the board
Possible solutions:
i. (P-T-P) Teacher only uses pre-prepared accessible slides, which Golf has access to before the lecture
ii. (P-P) Teacher, another student, or a helper reads the information aloud/explains it for Golf
iii. (P-T-P) Helper annotates the drawing on screen with text information
iv. (P-T-P) Golf uses a camera focused on the board, with Optical Character Recognition (OCR) and Screen Reading Technology (SRT) used to read the text
v. (P-T-P) Teacher and Golf use an electronic whiteboard with OCR and SRT to read the text
vi. (P-T-P) Golf uses a pre-prepared tactile diagram
vii. (P-T-P) Golf uses an electronic tactile display
viii. (P-T-P) Golf uses OrCam MyEye, an intuitive wearable device with a smart camera that reads from any surface
ix. (P-T-P) Golf uses Assisted Vision Smart Glasses, a wearable device by the University of Oxford
x. Technologies for the blind listed by Digital Trends (http://www.digitaltrends.com/mobile/blind-technologies)
xi. Handwriting Recognition (HWR) and SRT
Changes required:
i. Teacher behaviour
ii. Teacher or other students’ behaviour, or additional helper
iii. Technology with in-class helper
iv. Technology
v. Technology
vi. Technology pre-prepared by helper
vii. Technology
viii. Technology
ix. Technology
x. Technology
xi. Technology
Action 3: Teacher points to writing/drawing on board
Interaction issues: (P-T-P with deixis) Golf is unable to see/understand what is being pointed at
Possible solutions:
i. (P-P) Teacher, another student, or a helper explains what the teacher is pointing at
ii. (P-T-P) Teacher provides a pre-prepared tactile diagram with camera tracking of the teacher’s pointing and a haptic glove (further development is required before this can be a feasible and affordable solution) (Quek and Oliveira [24])
iii. (P-T-P) Teacher uses a camera focused on the board, with OCR used to read the text
iv. (P-T-P) Teacher uses an electronic tactile display with camera tracking of the teacher’s pointing and a haptic glove (further development is required before this can be a feasible and affordable solution)
v. (P-T-P) Golf uses OrCam MyEye, an intuitive wearable device with a smart camera that reads from any surface
vi. (P-T-P) Golf uses Assisted Vision Smart Glasses, a wearable device by the University of Oxford
Changes required:
i. Teacher/other students: behaviour or additional helper
ii. Technology pre-prepared by helper
iii. Technology
iv. Technology
v. Technology
vi. Technology
Action 4: Teacher asks a question related to the information on the board
Interaction issues: (P-T-P with deixis) Golf is unable to see/understand what is being referred to
Possible solutions:
i. (P-P) Teacher, another student, or a helper explains what the teacher is referring to.
ii. (P-T-P) Teacher provides a pre-prepared tactile diagram with camera tracking of the teacher’s referring gesture and a haptic glove (further development is required before this can be a feasible and affordable solution)
iii. (P-T-P) Teacher uses a camera focused on the board, with OCR used to read the text.
iv. (P-T-P) Teacher uses an electronic tactile display with camera tracking of the teacher’s pointing and a haptic glove (further development is required before this can be a feasible and affordable solution)
v. (P-T-P) Golf uses OrCam MyEye, an intuitive wearable device with a smart camera that reads from any surface
vi. (P-T-P) Golf uses Assisted Vision Smart Glasses, a wearable device by the University of Oxford
Changes required:
i. Teacher/other students: behaviour or additional helper
ii. Technology pre-prepared by helper
iii. Technology
iv. Technology
v. Technology
vi. Technology
Action 5: Teacher gives Golf a hard-copy case study to read
Interaction issues: (P-T-P) Golf is unable to see/understand what is written
Possible solutions:
i. (P-P) Teacher, another student, or a helper reads it for Golf
ii. (P-T-P) Teacher uses a camera focused on the document, utilizing OCR for text recognition
iii. (P-T-P) Teacher uses an electronic tactile display with camera tracking of the teacher’s pointing and a haptic glove (further development is required before this can be a feasible and affordable solution)
iv. (P-T-P) Golf uses OrCam MyEye, an intuitive wearable device with a smart camera that reads from any surface
v. (P-T-P) Golf uses Assisted Vision Smart Glasses, a wearable device by the University of Oxford
vi. (P-T-P) Golf uses the MIT FingerReader device, which reads the text by scanning it with a finger (Follmer et al. [25])
Changes required:
i. Teacher/other students: behaviour or additional helper
ii. Technology pre-prepared by helper
iii. Technology
iv. Technology
v. Technology
vi. Technology
Action 6: Teacher shows a book to students
Interaction issues: (P-T-P) Golf is unable to see/understand what is being shown
Possible solutions:
i. (P-P) Teacher, another student, or a helper reads it for Golf
ii. (P-T-P) Camera focused on the book, with OCR used to read the text
iii. (P-T-P) Electronic tactile display with camera tracking of the teacher’s pointing and a haptic glove (further development is required before this can be a feasible and affordable solution)
iv. (P-T-P) OrCam MyEye, an intuitive wearable device with a smart camera that reads from any surface
v. (P-T-P) Assisted Vision Smart Glasses, a wearable device by the University of Oxford
vi. (P-T-P) FingerReader, which provides the ability to read the book by scanning text with a finger.
Changes required:
i. Teacher/other students: behaviour or additional helper
ii. Technology pre-prepared by helper
iii. Technology
iv. Technology
v. Technology
vi. Technology
Action 7: Teacher shows a graph/diagram to students
Interaction issues: (P-T-P) Golf is unable to see/understand what is being shown
Possible solutions:
i. (P-P) Teacher, another student, or a helper reads it for Golf
ii. (P-T-P) Electronic tactile display with camera tracking of the teacher’s pointing and a haptic glove (further development is required before this can be a feasible and affordable solution)
iii. (P-T-P) Electronic file with an alt tag containing a detailed explanation, read using an audible screen reader
Changes required:
i. Teacher/other students: behaviour or additional helper
ii. Technology pre-prepared by helper
iii. Technology
Table 5 shows a few of the suggested technologies that could be used to address these issues; a tick or cross indicates whether each could address the requirements identified. Only the first 11 columns are shown due to space restrictions. Some of the technology suggestions are still at the prototype stage, so further development would be required before they could be considered practical and feasible.
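The tick/cross matrix of Table 5 has a simple mechanical reading: each technology covers some subset of the identified requirements, and candidate technologies can be ranked by how many requirements they cover. The sketch below illustrates this idea; the requirement and technology labels are shortened paraphrases for illustration, not the exact column headings of Table 5:

```python
# Each technology maps to the set of requirements it addresses
# (a tick in the corresponding Table 5 cell). Labels are illustrative.
coverage = {
    "OCR + screen reader": {"read board text", "read printed handout"},
    "Pre-prepared tactile diagram": {"understand diagram"},
    "Electronic tactile display": {"understand diagram", "follow pointing"},
}

def rank_technologies(required, coverage):
    """Rank technologies by how many of the required items they cover."""
    scored = [
        (tech, len(required & covered)) for tech, covered in coverage.items()
    ]
    # Highest coverage first; ties broken alphabetically for stable output.
    return sorted(scored, key=lambda item: (-item[1], item[0]))

# Requirements identified from the developer's question answers.
required = {"read board text", "understand diagram", "follow pointing"}
for tech, score in rank_technologies(required, coverage):
    print(f"{tech}: covers {score} of {len(required)} requirements")
```

Coverage alone would not decide the final choice; as the scenario analysis above shows, cost, maturity (prototype vs. product), and required behaviour changes also weigh on the decision.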
5 Conclusion and Future Work
Interviews with experts and Thai visually impaired individuals permitted the extension of the TEIF Method, allowing developers to create technological solutions and thereby facilitating visually impaired individuals’ interactions with people, technologies and objects. Planned future research will evaluate its use with developers and visually impaired students at Suratthani Rajabhat University, Surat Thani, Thailand.
References
Angkananon, K., Wald, M., Gilbert, L.: Developing and evaluating a technology enhanced interaction framework and method that can enhance the accessibility of mobile learning. Themes Sci. Technol. Educ. 7(2), 99–118 (2014)
http://www.specialeducationguide.com/disability-profiles/visual-impairment
http://www.ilo.org/wcmsp5/groups/public/—ed_emp/—ifp_skills/documents/publication/wcms_112307.pdf
Golledge, R.G.: Wayfinding behavior: Cognitive mapping and other spatial processes. JHU Press (1999)
Milne, J.L., Goodale, M.A., Thaler, L.: The role of head movements in the discrimination of 2-D shape by blind echolocation experts. Atten. Percept. Psychophys. 76(6), 1828–1837 (2014)
Wallmeier, L., Wiegrebe, L.: Self-motion facilitates echo-acoustic orientation in humans. R. Soc. Open Sci. 1(3), 140185 (2014)
Watthanasak, J.: Unpublished interim PhD Report University of Southampton, UK (2016)
Williams, M.A., Hurst, A., Kane, S.K.: Pray before you step out: describing personal and situational blind navigation behaviors. In: Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility, p. 28. ACM (2013)
Williams, M.A., Galbraith, C., Kane, S.K., Hurst, A.: Just let the cane hit it: how the blind and sighted see navigation differently. In: Proceedings of the 16th International ACM SIGACCESS Conference on Computers & Accessibility, pp. 217–224. ACM (2014)
Guidedogs.org.au. Frequently asked questions - guidedogs SA/NT (2016). https://www.guidedogs.org.au/frequently-asked-questions. Accessed 5 Nov 2016
Guidedogs.org.uk. Are dogs allowed everywhere? - All access areas|guide dogs (2016). https://www.guidedogs.org.uk/supportus/campaigns/access-all-areas/are-dogs-allowed-everywhere. Accessed 5 Nov 2016
Zeng, L.: A survey: outdoor mobility experiences by the visually impaired. In: Mensch und Computer 2015–Workshopband (2015)
Finkel, M.: The blind man who taught himself to see (2012). http://www.mensjournal.com/magazine/the-blind-man-who-taught-himself-to-see-20120504. Accessed 13 Nov 2016
Google. Google - indoor maps (2016). https://www.google.co.uk/maps/about/partners/indoormaps/. Accessed 17 Dec 2016
Miao, M., Spindler, M., Weber, G.: Requirements of indoor navigation system from blind users. In: Holzinger, A., Simonic, K.-M. (eds.) USAB 2011. LNCS, vol. 7058, pp. 673–679. Springer, Heidelberg (2011). doi:10.1007/978-3-642-25364-5_48
Apple. Apple maps (2016). http://www.apple.com/ios/maps/. Accessed 17 Nov 2016
OpenStreetMap. Openstreetmap - indoor mapping (2016). http://wiki.openstreetmap.org/wiki/Indoor_Mapping. Accessed 17 Apr 2016
Kolbe, T.H., Gröger, G., Plümer, L.: CityGML: interoperable access to 3D city models. In: van Oosterom, P., Zlatanova, S., Fendel, E.M. (eds.) Geo-information for Disaster Management, pp. 883–899. Springer, Heidelberg (2015)
Li, K.J., Lee, J.Y.: Basic concepts of the indoor spatial information candidate standard IndoorGML and its applications. J. Korea Sp. Inf. Soc. 21(3), 1 (2013)
Lee, J., Li, K.J., Zlatanova, S., Kolbe, T.H., Nagel, C., Becker, T.: OGC IndoorGML (2014)
Ryu, H.-G., Kim, T., Li, K.-J.: Indoor navigation map for visually impaired people. In: Proceedings of the Sixth ACM SIGSPATIAL International Workshop on Indoor Spatial Awareness, pp. 32–35. ACM (2014)
Indoo.rs. indoo.rs guides blind travellers at san francisco international airport (2015). http://indoo.rs/sfo/. Accessed 13 Nov 2016
Wifarer. Wifarer - indoor positioning | indoor gps | location analytics (2016). http://wifarer.com. Accessed 13 March 2016
Quek, F., Oliveira, F.: Enabling the blind to see gestures. ACM Trans. Comput. Hum. Interact. 20(1), 4 (2013)
Follmer, S., Leithinger, D., Olwal, A., Hogge, A., Ishii, H.: inFORM: dynamic physical affordances and constraints through shape and object actuation. In: Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology (UIST 2013), pp. 417–426. ACM, New York (2013)
Acknowledgement
This research was funded by The Thailand Research Fund.
Copyright information
© 2017 Springer International Publishing AG
Cite this paper
Angkananon, K., Wald, M. (2017). Technology-Enhanced Accessible Interactions for Visually Impaired Thai People. In: Antona, M., Stephanidis, C. (eds) Universal Access in Human–Computer Interaction. Designing Novel Interactions. UAHCI 2017. Lecture Notes in Computer Science(), vol 10278. Springer, Cham. https://doi.org/10.1007/978-3-319-58703-5_17