1 Introduction

The Technology Enhanced Interaction Framework (TEIF) Method was developed to technologically enhance accessible interactions between people, technology, and objects, particularly in face-to-face situations involving people with disabilities. It was successfully validated for use with hearing-impaired people by three developer experts, three accessibility experts, and an HCI professor [1]. The TEIF Method supports other design methods by providing multiple-choice questions that help identify requirements; the answers help provide technological suggestions that support the design stage. This paper describes how the TEIF Method has been extended for use with visually impaired people. Ten experts, 20 visually impaired students and 10 visually impaired adults were interviewed in order to create scenarios, investigate visual impairment problems, and identify technological solutions to them.

2 Literature Review of Visually Impaired People’s Requirements

The Individuals with Disabilities Education Act (IDEA) defines the term “visual impairment” as impairment in vision that, even with correction, adversely affects a child’s educational performance; the term includes both partial sight and blindness [2]. Nearly 11% of Thailand’s registered disabled population in 1996 had a visual impairment, and National Statistics Office data from 2007 estimate that around two million women and men in Thailand (2,209,000 people, approximately 3% of the population) had a registered visual impairment disability [3].

Reducing discrimination in access requires accessible technology solutions, an accessible environment, accessible documents and accessibility awareness.

2.1 How Can Blind People Get Information?

Golledge [4] analyzed the four senses involved in a navigation task:

  1. Touch is tactile perception, the ability to acquire information from objects pressed against the skin using mechanoreceptors: neural receptors that detect pressure on the skin when it is touched, e.g. on the hands, feet, hair follicles, tongue and body.

  2. Sight is vision perception, which includes focusing on, interpreting, and detecting visible light reflected from objects into the eyes. It provides information such as images, colours, brightness, and contrast.

  3. Audition is sound perception, namely the ability to detect vibrations in the inner ear and interpret them as sound at various frequencies. Hearing with both ears also provides the ability to echolocate, that is, to detect the orientation of a sound source (Milne et al. [5]; Wallmeier and Wiegrebe [6]).

  4. Olfaction is odour perception, the ability to smell objects in the environment, utilizing olfactory neural receptors.

Visually impaired people detect obstructions at different levels, as shown in Table 1. For example, they use a white cane to detect obstructions at ground level, and use guide dogs and/or sighted guides to avoid obstructions.

Table 1. The relationship between activities and internal perceptions (Watthanasak [7] via Williams et al. [8])

2.2 Problems and Solutions of Visual Impairment

Problems and solutions experienced by visually impaired and blind people are shown in Table 2.

Table 2. Problems and solutions experienced by visually impaired and blind people

3 TEIF Interactions

Table 3 shows the five TEIF interactions, while Fig. 1 shows the TEIF architecture. Someone pointing at something while referring to it as “this” is an example of deixis.

Table 3. Interactions and Communication in the Technology Enhanced Interaction Framework
Fig. 1. The TEIF Architecture
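The interaction labels used in the scenario analyses below (P-P, P-T-P, and P-T-P with deixis) can be sketched as a small type. This is a hypothetical encoding for illustration only; it covers the three labels that appear in this paper's scenarios, not all five TEIF interaction types:

```python
from enum import Enum

class Interaction(Enum):
    """Interaction labels as used in the scenario analyses.
    Hypothetical encoding, not part of the TEIF itself."""
    P_P = "P-P"                          # person communicates directly with person
    P_T_P = "P-T-P"                      # person-to-person, mediated by technology
    P_T_P_DEIXIS = "P-T-P with deixis"   # mediated, and involves pointing ("this")

# Each possible solution can then be tagged with its interaction type:
solution = (Interaction.P_T_P, "Golf uses a camera focused on the board with OCR")
print(solution[0].value)  # prints "P-T-P"
```

Tagging solutions this way lets a developer tool group or filter them by the kind of change they require (behavioural, human-mediated, or purely technological).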

4 Developing the TEIF for Blind People

4.1 Interviews

The research analysed the information gathered from the experts and visually impaired people to develop requirement questions, and five possible scenarios with actions, interaction issues, and possible technologies.

4.2 Transforming Requirements into Questions and Multiple Choices

The TEIF Method helps developers gather and evaluate requirements by using TEIF-based multiple-choice questions. The questions help identify issues for which a technology solution is required.

In the following example requirement questions, ☐ means that more than one answer can be chosen and ◯ means that only one answer can be chosen. The example requirement questions shown below include only questions for which correct answers in the given scenario are provided.
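The ☐/◯ convention can be captured in a simple data structure. The sketch below is a hypothetical encoding, assuming a developer wants to validate answers against the single/multiple-choice rule; the question text and options are illustrative and not taken from the published TEIF question set:

```python
from dataclasses import dataclass

@dataclass
class RequirementQuestion:
    """One TEIF-style requirement question.
    multiple=True corresponds to the ☐ marker (choose one or more);
    multiple=False corresponds to ◯ (choose exactly one)."""
    text: str
    options: list
    multiple: bool = False

    def valid(self, chosen):
        """Check that the chosen answers respect the question type."""
        if not all(c in self.options for c in chosen):
            return False                 # an answer outside the option list
        if self.multiple:
            return len(chosen) >= 1      # ☐: one or more answers
        return len(chosen) == 1          # ◯: exactly one answer

# Illustrative question only, not from the TEIF question set
q = RequirementQuestion(
    text="Which senses can the user rely on?",
    options=["touch", "audition", "olfaction"],
    multiple=True,
)
print(q.valid(["touch", "audition"]))  # True: multi-choice allows several answers
print(q.valid([]))                     # False: at least one answer is required
```

Validated answers can then be matched against a technology-suggestion table to shortlist solutions for the design stage.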

4.3 Developing Scenarios to Test the Requirement Questions and Multiple Choices

In order to ensure that the TEIF has broad applicability, five scenarios and technology solutions were considered during the development process: a blind person shops for groceries, crosses the road, finds rooms and buildings, studies at the university, and visits the Shadow Puppet Museum. The process illustrated the TEIF’s suitability in these complex situations involving visual impairment, and addressed the specific aspects of these technologically-enhanced interactions.

Table 4 shows how the multiple-choice requirement questions and their answers apply to these five scenarios.

Table 4. How the questions can be applied to the five scenarios

Scenario 4: Problems of a blind student studying at the university

Space limitations allow only one of the scenarios to be described in detail. “Golf” is the only blind student in his law faculty class. Golf normally sits at the front of the class because he wants to record the lectures. However, (1) there is a lot of noise, as teachers do not use microphones and other students talk during the class, so the sound quality of the media file that he records is poor. Golf sometimes uses Braille to take notes from the lecture, but not often, because he is not very familiar with it. During the class, the teacher speaks Thai, as all class members are Thai. (2) When the teacher writes notes on the blackboard, Golf does not know what the teacher writes; he sometimes asks a friend to read it for him. Also, (3) when the teacher refers to material by pointing at the board, Golf does not know what the teacher is pointing towards. Sometimes, (4) the teacher asks questions related to information on the board, but Golf cannot answer because, unable to see the board, he does not understand the question. Sometimes (5) the teacher gives students a hard-copy case study to read and analyse in class individually. Golf cannot read it, so the teacher allows Golf to work in a pair. Golf mentions to the teacher that if she provides him a Word file or information on the web then he will be able to read it, but the teacher tells him that she only has a .pdf file.

At the end of the class, (6) the teacher shows an important book that every student needs to read. Golf is not sure what the book looks like, so he asks the teacher if he can touch the book and feel its size and thickness. Because he receives no financial support from friends or the university, he normally incurs considerable personal expense in having friends or professionals type the books and convert them into text files. While expensive, this is necessary, as otherwise the lack of accessible materials would result in him failing the course. (7) Golf finds it particularly difficult when pictures, graphs or multimedia are required.

In this scenario, Golf requires online or offline problem-solving mobile devices that he can use in class and at home. The following are potentially appropriate technology-based adaptations; analysing them helps the developer choose practical solutions.

Action 1: Golf records the teacher’s voice in a noisy environment

Interaction issues: (P-T-P) Golf is unable to hear the recording properly because of background noise

Possible solutions:

  i. (P-T-P) Teacher uses a microphone when talking to students, which reduces the background noise

Action 2: Teacher writes/draws on the board

Interaction issues: (P-T-P) Golf is unable to see/understand what is being written/drawn on the board

Possible solutions:

  i. (P-T-P) Teacher only uses pre-prepared accessible slides, to which Golf has access before the lecture
  ii. (P-P) Teacher, another student or a helper reads the information aloud/explains it for Golf
  iii. (P-T-P) Helper annotates the drawing on screen with text information
  iv. (P-T-P) Golf uses a camera focused on the board, with Optical Character Recognition (OCR) and Screen Reading Technology (SRT) used to read the text
  v. (P-T-P) Teacher and Golf use an electronic whiteboard, with OCR and SRT used to read the text
  vi. (P-T-P) Golf uses a pre-prepared tactile diagram
  vii. (P-T-P) Golf uses an electronic tactile display
  viii. (P-T-P) Golf uses OrCam MyEye, an intuitive wearable device with a smart camera that reads from any surface
  ix. (P-T-P) Golf uses Assisted Vision Smart Glasses, a wearable device developed by the University of Oxford
  x. Other technologies reviewed by Digital Trends (http://www.digitaltrends.com/mobile/blind-technologies)
  xi. Handwriting Recognition (HWR) and SRT

Changes required:

  i. Teacher behaviour
  ii. Teacher or other students’ behaviour, or an additional helper
  iii. Technology with an in-class helper
  iv. Technology
  v. Technology
  vi. Technology pre-prepared by a helper
  vii. Technology
  viii. Technology
  ix. Technology
  x. Technology

Action 3: Teacher points to writing/drawing on board

Interaction issues: (P-T-P with deixis) Golf is unable to see/understand what is being pointed at

Possible solutions:

  i. (P-P) The teacher, another student or a helper explains what the teacher is pointing at
  ii. (P-T-P) Teacher provides a pre-prepared tactile diagram, with camera tracking of the teacher’s pointing and a haptic glove (further development is required before this can be a feasible and affordable solution; see Quek and Oliveira)
  iii. (P-T-P) Teacher uses a camera focused on the board, with OCR used to read the text
  iv. (P-T-P) Teacher uses an electronic tactile display, with camera tracking of the teacher’s pointing and a haptic glove (further development is required before this can be a feasible and affordable solution)
  v. (P-T-P) Teacher uses OrCam MyEye, an intuitive wearable device with a smart camera that reads from any surface
  vi. (P-T-P) Golf uses Assisted Vision Smart Glasses, a wearable device developed by the University of Oxford

Changes required:

  i. Teacher/other students: behaviour, or an additional helper
  ii. Technology pre-prepared by a helper
  iii. Technology
  iv. Technology
  v. Technology
  vi. Technology

Action 4: Teacher asks a question related to the information on the board

Interaction issues: (P-T-P with deixis) Golf is unable to see/understand what the teacher is referring to

Possible solutions:

  i. (P-P) The teacher, another student or a helper explains what the teacher is referring to
  ii. (P-T-P) Teacher provides a pre-prepared tactile diagram, with camera tracking of the teacher’s referring and a haptic glove (further development is required before this can be a feasible and affordable solution)
  iii. (P-T-P) Teacher uses a camera focused on the board, with OCR used to read the text
  iv. (P-T-P) Teacher uses an electronic tactile display, with camera tracking of the teacher’s pointing and a haptic glove (further development is required before this can be a feasible and affordable solution)
  v. (P-T-P) Golf uses OrCam MyEye, an intuitive wearable device with a smart camera that reads from any surface
  vi. (P-T-P) Golf uses Assisted Vision Smart Glasses, a wearable device developed by the University of Oxford

Changes required:

  i. Teacher/other students: behaviour, or an additional helper
  ii. Technology pre-prepared by a helper
  iii. Technology
  iv. Technology
  v. Technology
  vi. Technology

Action 5: Teacher gives Golf a hard-copy case study to read

Interaction issues: (P-T-P) Golf is unable to see/understand what is written

Possible solutions:

  i. (P-P) The teacher, another student or a helper reads it for Golf
  ii. (P-T-P) Teacher uses a camera, utilizing OCR for text recognition
  iii. (P-T-P) Teacher uses an electronic tactile display, with camera tracking of the teacher’s pointing and a haptic glove (further development is required before this can be a feasible and affordable solution)
  iv. (P-T-P) Golf uses OrCam MyEye, an intuitive wearable device with a smart camera that reads from any surface
  v. (P-T-P) Golf uses Assisted Vision Smart Glasses, a wearable device developed by the University of Oxford
  vi. (P-T-P) Golf uses the MIT FingerReader device to read the text by scanning it with a finger (Follmer et al. [25])

Changes required:

  i. Teacher/other students: behaviour, or an additional helper
  ii. Technology pre-prepared by a helper
  iii. Technology
  iv. Technology
  v. Technology
  vi. Technology

Action 6: Teacher shows a book to students

Interaction issues: (P-T-P) Golf is unable to see/understand what is written

Possible solutions:

  i. (P-P) The teacher, another student or a helper reads it for Golf
  ii. (P-T-P) A camera focused on the book, with OCR used to read the text
  iii. (P-T-P) An electronic tactile display, with camera tracking of the teacher’s pointing and a haptic glove (further development is required before this can be a feasible and affordable solution)
  iv. (P-T-P) OrCam MyEye, an intuitive wearable device with a smart camera that reads from any surface
  v. (P-T-P) Assisted Vision Smart Glasses, a wearable device developed by the University of Oxford
  vi. (P-T-P) FingerReader, which provides the ability to read the book by scanning the text with a finger

Changes required:

  i. Teacher/other students: behaviour, or an additional helper
  ii. Technology pre-prepared by a helper
  iii. Technology
  iv. Technology
  v. Technology
  vi. Technology

Action 7: Teacher shows a graph/diagram to students

Interaction issues: (P-T-P) Golf is unable to see/understand what is being shown

Possible solutions:

  i. (P-P) The teacher, another student or a helper reads it for Golf
  ii. (P-T-P) An electronic tactile display, with camera tracking of the teacher’s pointing and a haptic glove (further development is required before this can be a feasible and affordable solution)
  iii. (P-T-P) An electronic file with alt-text tags containing a detailed explanation, read using an audible screen reader

Changes required:

  i. Teacher/other students: behaviour, or an additional helper
  ii. Technology pre-prepared by a helper
  iii. Technology

Table 5 shows some of the suggested technologies that could be used to address these issues; a tick or cross indicates whether each could address the requirements identified. Only the first 11 columns are shown due to space restrictions. Some of the technology suggestions are still at the prototype stage, so further development would be required before they could be considered practical and feasible.

Table 5. Technology suggestion table
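Table 5’s tick/cross layout amounts to a boolean matrix over requirements and technologies. The sketch below shows how a developer tool might filter it; the matrix values and requirement IDs (R1–R3) are hypothetical placeholders, not Table 5’s actual contents, and only technologies named in this section are used:

```python
# Hypothetical tick/cross matrix: True = the technology addresses
# that requirement. Requirement IDs (R1..R3) are illustrative only.
SUGGESTIONS = {
    "OrCam MyEye":                   {"R1": True,  "R2": True,  "R3": False},
    "Assisted Vision Smart Glasses": {"R1": True,  "R2": False, "R3": True},
    "FingerReader":                  {"R1": False, "R2": True,  "R3": False},
    "Electronic tactile display":    {"R1": False, "R2": False, "R3": True},
}

def suggest(required):
    """Return the technologies that tick every requirement in `required`."""
    return [tech for tech, ticks in SUGGESTIONS.items()
            if all(ticks.get(r, False) for r in required)]

print(suggest(["R1"]))        # ['OrCam MyEye', 'Assisted Vision Smart Glasses']
print(suggest(["R1", "R3"]))  # ['Assisted Vision Smart Glasses']
```

Requiring every listed requirement to be ticked mirrors how a developer would read the table: a technology is only a candidate solution if it covers all the requirements identified by the multiple-choice answers.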

5 Conclusion and Future Work

Interviews with experts and Thai visually impaired individuals permitted the extension of the TEIF Method, allowing developers to create technological solutions that facilitate visually impaired individuals’ interactions with people, technologies and objects. Planned future research will evaluate its use with developers and visually impaired students at Suratthani Rajabhat University, Surat Thani, Thailand.