
1 Introduction

This paper discusses the usability evaluation of a mobile navigation application for people who are blind. The mobile Audio-Based Environments Simulator (mAbES) was evaluated with the participation of HCI specialists, experts in video gaming and blind end-users of mAbES. The evaluation examined whether mAbES allows the user to recognize the building of the Science and Technology Museum of the Pontifical Catholic University of Rio Grande do Sul (MCT/PUCRS) and, consequently, whether they can move about in this environment based on the virtual interaction experience provided by mAbES. The main research question was whether mAbES could assist a blind user in understanding the real space it represents, without replacing the visit to the museum itself.

Following the functionality of AbES [1], mAbES is a mobile audio and haptic environment simulator designed to assist in refining orientation and mobility (O&M) skills in people who are blind. During navigation, the end-user can develop O&M skills that are validated through the construction of mental maps [2]. Navigation capability refers to a person's ability to move safely from a point of origin to a destination.

Vision-based navigation is more of a perceptual process, whereas blind navigation is more cognitively demanding and often requires conscious moment-to-moment problem solving [3]. A person with visual impairment must be competent with orientation and mobility in order to achieve a solid level of navigation, including moving about safely, efficiently and agilely, as well as independently in both familiar and unfamiliar environments [4]. The learning of O&M skills includes a set of predefined techniques that blind or visually impaired children, young people and adults must practice stage by stage. However, learning such skills also involves other aspects such as training and refining perception systems and developing both conceptual and motor skills [2, 4].

Support on a perceptual and conceptual level is important for the development of orientation skills and the construction of cognitive maps [1, 2, 5]. The notion of a map refers to an internalized representation of space, a mixture of objective knowledge and subjective perception.

If real-life surroundings are represented through virtual environments, it is possible to create several training applications that allow a blind user to interact with the elements in the simulated environment during navigation [6, 7].

Different technological resources have been developed to aid navigation and thus allow blind users to better understand the world around them: vibrating canes, tactile models, GPS-based applications, indoor environment simulators, RFID (radio-frequency identification) tags, and cell-phone streaming of camera images to a central server to process unknown environmental features. Other navigational technologies available to blind users focus on large-scale blind navigation, unfamiliar environments and well-known spaces, and are also potentially useful to users with low vision [3]. mAbES employs a multimodal interface that integrates audio and haptic feedback. It uses the gaming interaction model proposed by [8, 9] to analyze the barriers that a blind user faces when using a game. The application relies on click-based interaction with a Braille matrix represented on a smartphone screen.

There is a research effort towards building interactive systems that can be used autonomously by people who are blind [1–3, 5, 7–11] and that are simple to use. The terms ‘usability’ and ‘accessibility’ are related and should be borne in mind throughout the stages of design, development and evaluation of computer applications [10]. The mAbES usability evaluation therefore also covered related accessibility issues.

There are different categories of usability and accessibility evaluation methods [12]:

  • Automated verification of compliance with guidelines and standards

  • Evaluations conducted by experts

  • Evaluations using models and simulations

  • Evaluations with users or potential users

  • Evaluations of data collected during eSystem usage

mAbES's evaluation was conducted with Human-Computer Interaction and video gaming experts (Group 1) and with users or potential users (Group 2). The goal of the evaluation was to verify mAbES's compliance with usability criteria (Group 1) and to check whether mAbES can help visually impaired people understand a real space that is represented in a virtual environment (Group 2).

The instruments used by Group 1 were the audio feedback questionnaire and the usability evaluation questionnaire. The former identifies how well the audio cues are understood, as well as their conformity with the Brazilian specification for describing images in the design of accessible digital material [13]. The latter includes questions that aim to gather information about mAbES's conformity with the usability heuristics [14]. It also includes questions based on video games and gaming mechanics [1, 3, 8–11, 15].

For Group 2, the audio evaluation and the O&M questionnaires were applied. The first aimed to verify whether the user could orient themselves in space and perceive the objects around them from the audio information. The second questionnaire contained questions related to O&M, tactile feedback, ease of use and user satisfaction. In addition to the questionnaires, semi-structured interviews were also carried out.

Tasks were given to the participants of the two groups. Group 1's tasks referred to the use of mAbES without the users being physically present in the Museum of Science and Technology of the Pontifical Catholic University of Rio Grande do Sul. Group 2 was organized into two subgroups, because some users used mAbES both before and during the visit to the Museum, whereas others used it only during the visit. While the tasks were being performed, the use of mAbES was videotaped. In the case of Group 2, the video recordings served to verify the effect of the use of mAbES on users' perception of the environment, along with their behavior when interacting with the technology. After performing the tasks, all participants reproduced their mental map of the museum using concrete material.

The users' perception and interaction behavior identified in the study, together with the data analysis, made it possible to refine the methodology used for evaluating mAbES's usability, to propose improvements to the use of the application and to make recommendations for developing navigation-oriented video games for people who are blind. These issues, together with a description and analysis of the results of the usability evaluation, are presented and discussed in the next sections.

2 AbES

mAbES - mobile Audio-Based Environments Simulator - is a videogame based on AbES [1, 15]. AbES replicates a real, familiar or unfamiliar environment to be navigated by a person who is blind (Fig. 1). The virtual environment is made up of different elements and objects (walls, stairways, escalators, doors, toilets or elevators) through which the user can discover and become familiar with their location. AbES includes three modes of interaction: free navigation, path navigation and game mode.

Fig. 1. A screenshot of AbES (Source: Sánchez et al., 2009)

The user receives audio feedback through the left, center and right channels, and all actions are carried out through a conventional keyboard, where each set of keys has different associated actions. Every action in the virtual environment has a particular sound associated with it.
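As a minimal illustration of this channel-based feedback (the function and the gain values are assumptions made here for illustration, not taken from the AbES source code), the following Kotlin sketch shows one way a cue could be weighted towards the left, center or right channel:

  // Illustrative only: maps the side on which an object lies, relative to the
  // avatar, to left/right channel gains so the cue is heard from that side.
  enum class Side { LEFT, CENTER, RIGHT }

  fun channelGains(side: Side): Pair<Float, Float> = when (side) {
      Side.LEFT   -> 1.0f to 0.2f   // mostly left channel
      Side.CENTER -> 1.0f to 1.0f   // both channels equally
      Side.RIGHT  -> 0.2f to 1.0f   // mostly right channel
  }

  fun main() {
      val (left, right) = channelGains(Side.LEFT)
      println("Play the cue with gains left=$left, right=$right")
  }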

3 mAbES

In the development of mAbES, members of the museum team took part in meetings in order to define the scope and to prioritize the functionalities of the software. Then 3 experiments were chosen: Nuclear Power Station (Fig. 2), Energy Train, and Cool House, located on the third floor of the museum.

Fig. 2. (a) Picture of the Nuclear Power Station; (b)(c) screenshots of the Nuclear Power Station

Interface. mAbES is a multisensory (auditory, haptic, graphic) virtual environment simulating real-life space.

The audio interface is responsible for conveying museum information as well as information resulting from the user's interaction with mAbES. There are four types of audio feedback: (i) when the user collides with an object, a sound is triggered identifying that object, for example “This is an escalator”; mAbES also provides an audio description of the object's physical appearance, how it operates and how the user should use it (in this case, the escalator) in the real context of the museum; (ii) during navigation, there is a set of sounds associated with actions and objects; for example, when the user walks, a footstep sound cue is heard; (iii) the instructive component provides an audio description of the selected MCT/PUCRS experiments; (iv) while the user interacts with the experiments, they are presented with quizzes/challenges that must be answered.
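To make the four feedback categories concrete, the following Kotlin sketch models them as event types routed to spoken or recorded output; the type names and message formats are hypothetical and are not taken from the mAbES code base.

  // Hypothetical model of the four audio feedback types described above.
  sealed class AudioEvent {
      data class Collision(val objectName: String, val description: String) : AudioEvent()      // type (i)
      data class NavigationCue(val cueFile: String) : AudioEvent()                              // type (ii)
      data class ExperimentDescription(val experiment: String, val text: String) : AudioEvent() // type (iii)
      data class Quiz(val question: String, val options: List<String>) : AudioEvent()           // type (iv)
  }

  fun toSpeechOrSound(event: AudioEvent): String = when (event) {
      is AudioEvent.Collision -> "This is ${event.objectName}. ${event.description}"
      is AudioEvent.NavigationCue -> "play sound file: ${event.cueFile}"
      is AudioEvent.ExperimentDescription -> "${event.experiment}: ${event.text}"
      is AudioEvent.Quiz -> event.question + " Options: " + event.options.joinToString("; ")
  }

  fun main() {
      println(toSpeechOrSound(AudioEvent.Collision("an escalator", "It connects two floors of the museum.")))
  }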

The haptic interface consists of vibration feedback provided by the smartphone. Every time the user bumps into an object, they feel an intermittent vibration on their hand.
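A sketch of how such an intermittent vibration could be triggered on Android is shown below; it assumes the platform Vibrator service, and the pulse durations are illustrative, since the paper only states that collisions produce an intermittent vibration felt in the hand.

  // Illustrative sketch, assuming an Android device; timing values are made up.
  import android.content.Context
  import android.os.Build
  import android.os.VibrationEffect
  import android.os.Vibrator

  fun vibrateOnCollision(context: Context) {
      val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator
      // Alternating off/on durations in milliseconds: three short pulses.
      val pattern = longArrayOf(0, 120, 80, 120, 80, 120)
      if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
          vibrator.vibrate(VibrationEffect.createWaveform(pattern, -1)) // -1 = play once
      } else {
          @Suppress("DEPRECATION")
          vibrator.vibrate(pattern, -1)
      }
  }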

The graphic interface represents the characteristics of the museum: the size, shape and position of the spaces, the selected experiments (Nuclear Power Station, Energy Train, and Cool House) and the objects (escalators, walls, chairs, tables, shelves, etc.). This graphical representation allows mAbES to be used by people with low vision and sighted people alike.

Interaction. The user communicates with the software by interacting with the smartphone screen, which displays an array of points based on the Braille system (Fig. 3). The user moves through the museum by using the forward button, each press representing an individual step. The right and left buttons are used when the user turns in either direction (Fig. 4).
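A minimal Kotlin sketch of the navigation state implied by this interaction is given below; the grid-based model is an assumption, while the turn granularity of 30° per click (three clicks for a 90° turn) follows the clock-based rotation described in Section 5.

  // Sketch only: a grid position plus heading; the step size and coordinate
  // system are assumptions, the 30-degree turn per click comes from Section 5.
  data class Avatar(val x: Int, val y: Int, val headingDegrees: Int)

  fun turn(a: Avatar, clicksToTheRight: Int): Avatar =
      a.copy(headingDegrees = Math.floorMod(a.headingDegrees + 30 * clicksToTheRight, 360))

  fun stepForward(a: Avatar): Avatar = when (a.headingDegrees) {
      0   -> a.copy(y = a.y + 1)
      90  -> a.copy(x = a.x + 1)
      180 -> a.copy(y = a.y - 1)
      270 -> a.copy(x = a.x - 1)
      else -> a  // intermediate headings take no grid step in this toy model
  }

  fun main() {
      var avatar = Avatar(0, 0, 0)
      avatar = turn(avatar, 3)      // three clicks on the right button = 90 degrees
      avatar = stepForward(avatar)  // one press of the forward button = one step
      println(avatar)               // Avatar(x=1, y=0, headingDegrees=90)
  }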

Fig. 3. (a) Braille matrix; (b)(c) screenshots of the menu

Fig. 4. (a)(b)(c) Screenshots of transitions between floors in MCT/PUCRS

When the user arrives at the third floor, mAbES presents the experiments that are mapped so that the user can choose which one they want to interact with: 1 – Nuclear Power Station, 2 – Energy Train, 3 – Cool House, 4 – Explore the space freely, 5 – More information, 6 – Exit, according to the Braille matrix. When the user comes upon any experiment (Nuclear Power Station, Energy Train, Cool House), mAbES informs the user by naming the object or experiment, and the options or quizzes/challenges that are available to the user. Information on the museum or the experiments is available to the user in audio format. The options for hearing the audio cues are: 1 – Play, 2 – Pause, 3 – Increase speed, 4 – Go back, 5 – Go forward, 6 – Help.
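The following Kotlin sketch illustrates how the six cells of the Braille matrix could be mapped to the third-floor menu options listed above; the option numbers follow the text, while the enum and lookup function are assumptions for illustration.

  // Hypothetical mapping of Braille-matrix cells 1-6 to the third-floor menu.
  enum class ThirdFloorOption(val cell: Int, val label: String) {
      NUCLEAR_POWER_STATION(1, "Nuclear Power Station"),
      ENERGY_TRAIN(2, "Energy Train"),
      COOL_HOUSE(3, "Cool House"),
      FREE_EXPLORATION(4, "Explore the space freely"),
      MORE_INFORMATION(5, "More information"),
      EXIT(6, "Exit");

      companion object {
          fun fromCell(cell: Int): ThirdFloorOption? = values().firstOrNull { it.cell == cell }
      }
  }

  fun main() {
      println(ThirdFloorOption.fromCell(3)?.label)  // Cool House
  }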

4 Method

4.1 Sample

  • Group 1: an intentional sample was selected, made up of 3 HCI specialists and 2 experts in video gaming for blind users.

  • Group 2: the sample selected for the use of mAbES was made up of 6 learners (3 female, 3 male), of whom 2 were between 10 and 14 years old, 2 were between 20 and 36 years old, and the 2 remaining users were 44 years old. Across the entire sample, 5 learners were totally blind and 1 person had low vision; all of them were legally blind. This sample was divided into 2 subgroups of 3 users. Subgroup 1 visited MCT without having used mAbES beforehand. Subgroup 2 used mAbES before visiting the museum. The requirement for participation was not being previously acquainted with the Museum of Science and Technology of the Pontifical Catholic University of Rio Grande do Sul.

4.2 Instruments

Sound Evaluation Instrument. Assessment of sound information occurred in two stages: (i) by the authors of this work and (ii) with the participants of Group 1 and Group 2.

The authors of this study evaluated mAbES's sounds against the Technical Note [13], assessing whether the sounds used in mAbES met its specifications. Whenever a sound did not comply, the unfulfilled requirement was noted and an alternative sound transcription was suggested (Table 1).

Table 1. Example of the comparison between mAbES’s sounds

Of the 156 sounds in mAbES, 30 did not meet what is specified in the Technical Note. Afterwards, a selection was made to discard sounds that referred to the same unmet requirement. Finally, 16 sounds were selected to compose the audio test instrument with the participants in Group 1 and Group 2.

In the second stage, the participants in groups 1 and 2 evaluated the sounds. The instrument contained the identification of the sound cue, the number of times it was run by the user (so as to indicate their preference for the original or the suggested version) and an optional field to include comments on each sound.

Usability Evaluation Tool. Usability evaluation instruments for groups 1 and 2 were based on [1, 11].

  • Group 1: 35 questions, of which 28 were based on ten Usability Heuristics for User Interface Design [14], 4 were related to the haptic interface and 3 referred to the sound interface.

  • Group 2: 22 closed questions, of which 1 was related to menus, 3 were related to the sound interface, 4 were related to the haptic interface, 2 were related to the graphical interface, 7 were related to ease of use, 1 was related to the Braille matrix and 4 were related to satisfaction. It also had 5 open questions associated with ease of use and user satisfaction.

Evaluation Tool for Orientation and Mobility (O&M). After using mAbES, participants in both groups had to draw the museum's environment. In Group 2, the sheet of paper was placed on synthetic foam so that the drawing could be traced and recognized by touch.

4.3 Procedure

Evaluation of Sound Interface. In the audio evaluation, Group 1 heard the original audio and the suggested audio for each selected sound in mAbES. While the sounds were played, they answered the sound assessment questionnaire.

For respondents in Group 2, the procedure differed only in how the questionnaire was completed, which was done with the aid of the authors of this work. The Technical Note [13] was made available in Braille for use as a reference.

Usability Evaluation - Orientation and Mobility Evaluation. Participants in Group 1 and Group 2 were given an explanation of what mAbES is and the context in which it was developed. The application could be used without time restriction.

  • Group 1: after using mAbES freely, users drew the museum’s environment and gave answers to the usability evaluation tool.

  • Group 2: they received the following task: You are on the ground floor of the museum, near the entrance. You should go to the third floor and explore the Nuclear Power Station experiment. Afterwards, you should explore the Cool House experiment. In the Nuclear Power Station, you should listen to the information and respond to the challenges. In the Cool House, you should go in and find out what is in the room.

    • Subgroup 1 used mAbES directly at the museum, without having previously used the software. Then they answered the usability evaluation tool and made the graphic representation of the environment they had visited.

    • Subgroup 2 used mAbES, answered the usability evaluation tool and graphically represented the environment of the museum. Only then did they visit the Museum, after which they confirmed their answers and, once again, drew their graphic representation. Users in Group 2 (Subgroups 1 and 2) were allowed to use headphones during the test, in which case they were asked to report out loud what happened while moving about with mAbES (Fig. 5).

      Fig. 5. A user who is blind interacting with mAbES

5 Results

Evaluation of Sound Interface.

  • Group 1: of the 3 experts in HCI, only one participant chose a set of 5 original sounds instead of the suggested versions; the other experts preferred the suggested versions. All experts in video games preferred the suggested sounds. According to their accounts, the suggested versions complied with [13], in which objects are thoroughly described.

  • Group 2: despite a strong preference for the suggested sounds (71 %) over the original sounds (29 %), participants reported that compliance with the Technical Note [13] gave some sounds an excessive amount of information.

Usability Evaluation. For data analysis, the categories ‘Strongly agree’ and ‘Partly agree’ were grouped into ‘Agree’ and the categories ‘Strongly disagree’ and ‘Partly disagree’ were grouped into ‘Disagree’.
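A short Kotlin sketch of this grouping step is shown below; the answer labels follow the text, while the counting code is purely illustrative of how the collapsed categories can be tallied.

  // Sketch of the category grouping described above.
  fun collapse(answer: String): String = when (answer) {
      "Strongly agree", "Partly agree" -> "Agree"
      "Strongly disagree", "Partly disagree" -> "Disagree"
      else -> "Neutral"
  }

  fun main() {
      val answers = listOf("Strongly agree", "Partly agree", "Neutral", "Partly disagree")
      println(answers.groupingBy { collapse(it) }.eachCount())  // {Agree=2, Neutral=1, Disagree=1}
  }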

  • Group 1: the instrument was organized considering the ten Usability Heuristics for User Interface Design [14], the sound interface and the haptic interface. The result can be seen in Table 2.

    Table 2. Usability evaluation – group 1
  • Group 2: out of the total of 57 questions, Subgroup 1 agreed with 88.2 % and disagreed with 8.5 %. They were neutral about 3.3 % of the questions. With regard to Subgroup 2, there were differences between pre-test and post-test:

    • Pre-test: 80.3 % agree, 10.8 % disagree and 8.9 % neutral.

    • Post-test: 82.5 % agree, 10.5 % disagree and 7 % neutral.

The answers to the open questions, together with observation of mAbES use and of the visit to the museum, gave rise to a few remarks:

  • Rotation: turns are based on the hours of a clock face, so a 90° rotation requires 3 clicks on the right button (30° per click). This analogy should be emphasized more strongly to users.

  • Graphical interface: the information should be adjustable and customizable to enable the user to obtain more information on the Museum.

  • Haptic feedback: it should be adjustable to the user's preferences. The study showed that this feature was most useful during the first interactions; afterwards, users drew more on the sound information.

  • Help system: it should be contextual to help the user depending on their current virtual location in the museum as well as on the challenges that require responses.

  • Routes: when the user leaves an expected route, the application should provide support so that they can recognize the environment and return to the desired point.

  • User position: mAbES needs a resource to indicate the user's location in the virtual museum space and should provide information so that they can recognize the space around them.

Orientation and Mobility Evaluation.

  • Group 1: users were able to understand the Museum space from the use of mAbES and reproduce it in different ways (Fig. 6).

    Fig. 6. (a) Representation of the museum space by the authors; (b)(c) Group 1: examples of museum representations

  • Group 2: the majority of participants had no experience in drawing (Fig. 7, Subgroup 1, and Fig. 8, Subgroup 2). Participants of Subgroup 2 were able to refine their understanding of the space by using mAbES before visiting the museum (Fig. 8).

    Fig. 7. (a)(b)(c) Group 2 – Subgroup 1: examples of museum representations

    Fig. 8. (a)(b) Group 2 – Subgroup 2: drawings made during the pre- and post-test stages

Considerations for the Development of Applications to Support Navigation for Users who are Blind. This work made it possible to offer a few suggestions that may be considered when designing applications for blind users:

  • Use of audio cues to describe the images and spaces that are represented in the application.

  • Enable the user to control and monitor the sound and haptic information.

  • Add a contextual help system that can help the user recognize the space they are in as well as the activities that must be carried out.

  • Prevent errors that can be triggered when the user performs an action not expected by the software.

  • Prioritize sound information over detailing the graphical interface.

  • Maintain consistency across the actions the user is expected to perform when interacting with the software.

6 Conclusion and Future Work

This paper has dealt with the usability evaluation of a mobile navigation application for users who are blind: mAbES. The evaluation of mAbES was carried out with experts in Human-Computer Interaction and in video gaming (Group 1) and with users or potential users (Group 2). Group 1 indicated that mAbES conforms to most of the usability criteria defined by Nielsen [14].

Group 2 included participants who had never drawn before and who were still able to establish spatial relationships between the experiments and the space they occupy in the museum. Using mAbES before the visit allowed users to explore the museum more autonomously and safely.

Groups 1 and 2 also indicated that most of the original mAbES sounds evaluated did not meet what is specified in the Technical Note [13], and they approved the suggested sounds.

The results obtained regarding O&M skills in blind users who interacted with mAbES demonstrated the positive impact of the software on such skills. Users who are blind were able to understand the space of the museum and interact with the environment based on their use of the software.

This research also contributes to the design and development of similar applications, offering suggestions and precautions that should be considered so that users who are blind can make better use of a system based on sound, haptic and graphic interfaces.