
1 Introduction

Recent advances in Ambient Assisted Living technologies [1] hold great potential for realizing effective and efficient social inclusion of people with disabilities (PWD). Instead of being a heavy burden on a nation's economy and society, PWD can become productive if an adequately smart workplace environment is designed, full of intelligent devices and interfaces, so that they can smartly access, control, and interact with these objects and carry out the office tasks of a normal person. This can be achieved by smartly integrating advanced human computer interaction (HCI) technologies within all objects of the workplace environment.

In fact, one important justification for this article and the authors' related previous research efforts is that, in spite of its significance for a nation's economy and its social dimensions, the issue of inclusion of PWDs in workplace environments has not been tackled in previous research concerned with developing smart environments for disabled persons. Most work has focused on home or outdoor assistive technologies, with an obvious neglect of the significance of social inclusion of PWDs at the workplace. Over and above this, most developed smart solutions and assistive technologies are not comprehensive, as they address specific or combined cases of disability, such as smart solutions for blind or low vision persons, physically disabled persons (PDP), or deaf or mute persons, and mostly in home or building settings.

This paper is organized as follows. In the next section, we cover the literature review of relevant technological solutions for PWD. Section 3 presents and describes the SMARTUNIVERS interaction and control, linked to its interface module windows. Then, in Sect. 4, we describe the major guidelines and standards of the interface design process upon which the design of SMARTUNIVERS is based and their realization, and finally a conclusion is given in Sect. 5.

2 Literature Review

In this section, we review the literature relevant to smart solutions for PWD, with particular focus on the data and information models developed for smart applications in the workplace environment. Related to the utilization of assistive technology to aid persons with neuromuscular disabilities, Chang et al. [2] used a location-based task prompting system to assess the possibility of training two individuals with cognitive impairments in a supported employment program. They concluded that the data showed that the two participants significantly increased their target response, thus improving vocational job performance during the intervention phases. Hakobyan et al. [3] conducted research on making mobile phones and other handheld devices accessible via touch and audio sensory channels for visually impaired persons. Kbar & Aly [4] proposed smart assistive technologies at the workplace consisting of an integrated and connected set of smart software and hardware technologies to empower physically disabled persons with the capability to access and effectively utilize ICTs in order to execute knowledge-rich working tasks with minimum effort and a sufficient comfort level. Their proposed technology solution for PWD includes smart help and smart editors using voice recognition, which enable them to edit and document their work smartly by animating the mouse cursor movement to track the editing without the need to use their hands. It also enables PWD to get help from the network using a smart help engine based on voice recognition.

Relevant to the utilization of smart software to empower the PWD in the workplace, Chang et al. [5] assessed the possibility of training three people with cognitive impairments using a computer-based interactive game. They designed a game to provide task prompts in recycling scenarios, identify incorrect task steps on the fly, and help users learn to make corrections. The results showed that the three participants considerably increased their target response, which improved their vocational job skills during the intervention phases and enabled them to maintain the acquired job skills after the intervention. Angkananon et al. [6] focused on designing accessible mobile learning interactions involving disabled people using a newly developed Technology Enhanced Interaction Framework. Their framework was developed to help design technological support for communication and interactions between people, technology, and objects, particularly when disabled people are involved.

Lancioni et al. [7] built a computer-aided telephone to help a person with motor and visual disabilities make phone calls to his work colleagues or his family. The system communicates with the user through voice commands to select the person he wants to call, and the user activates a micro-switch with his hand to place a call or to hang up. The system helps visually impaired individuals perform calls without needing to press a numeric keypad, which could cause mistyping of the phone number. However, the system was not portable and required the use of hands to control the micro-switch. Halawani and Zaitun [8] built a software system that helps deaf persons. The system captures speech through a speech recognition system, digitizes the speech, and then converts it to Arabic sign language. The Arabic sign language output consists of pre-saved images of an equivalent avatar for each letter and word in the recognized language. Addressing the needs of motor disability, Peixto et al. [9] designed and implemented a voice control system for wheelchair movement: voice commands such as "go" start the chair's movement, the frequency of humming changes the chair's speed, and for controlling chair rotation the user can say the "turn left" command and then use humming to control the rotation angle. Yang et al. [10] designed a system to help people with motor disabilities use blinking to control a virtual keyboard. The scanning keyboard was designed to be controlled by blinking the left eye and right eye simultaneously. The pseudo electromyography (EMG) signal generated from a user's blink was acquired by a Bluetooth headset and transmitted to a PC through wireless transmission. Hawley et al. [11] built a prototype of a voice-input voice-output system that helps individuals with speech disabilities improve conversation with other individuals. The system receives keywords such as 'want', 'water', 'drink' from the user; the user then presses a button that allows the system to generate a speech output such as "Can I have a drink of water please", and the final phrase is spoken by a speech synthesizer. Considering visually impaired individuals, research has also targeted indoor navigation systems: Jain [12] built a wearable system consisting of wall modules deployed in a building and a user end comprising a waist-worn device coupled with a mobile phone. The network of infrared-based wall units is retrofitted at specific locations in the building, and the sensors transmit unique IR tags corresponding to their location perpendicular to the direction of motion of the user. All the information is conveyed to the user via the Text-to-Speech (TTS) engine of the mobile application, and is also displayed in a large font size to provide for someone with partial vision. Vibration alerts provide continuous feedback for being on the right track. Addressing the needs of individuals with speech impairment, Padmanabhan & Sornalatha [13] presented an artificial speaking system that depends on a wearable flex sensor and an accelerometer measuring the finger angles and the tilting angle of the hand while making gestures (i.e., English alphabet gestures). The system recognizes the gestures and translates them to speech through a speaker output. In 2014, Jamil et al. [14] developed an eye tracking system to control a powered wheelchair, supporting impaired people who cannot drive a wheelchair manually or are unable to move joysticks because of a lack of physical ability. The user's eye movements were translated to screen positions through a camera; once the user moves his eyeball, the wheelchair follows the direction of the eye movement. Also relevant to this research is the work done in [15] concerning adaptive interactive solutions, which proposed adaptive interaction support to adjust the level of interaction based on the quality of context in an ambient-aware environment. Another related work is the development of an RFID-based multimedia system [16], which involves the design and experimentation of an RFID-based magic stick for children to use in interacting with the environment for learning and game playing.

The above literature reveals that very little, if any, work has considered empowering the PWD with smart universal assistive technologies at the workplace. Most research focuses on a specific or single impairment condition, such as smart solutions for blind or low vision persons, physically disabled persons (PDP), or deaf or mute persons, and mostly in home or building settings.

In this paper, we briefly explain the high-level design of the universal interface, called SMARTUNIVERS, which covers several kinds and combinations of disabilities. The user interface, interaction, and control of the developed SMARTUNIVERS are described in the following sections.

3 Interactions and Control of SMARTUNIVERS

In this section, we introduce the interaction and control of the SMARTUNIVERS. The SMARTUNIVERS is currently being developed within the activities of the SMARTDISABLE research project implemented at Riyadh Techno Valley, King Saud University, Riyadh, KSA. It includes two smart interface modules: Smart Help (SMARTHELP) and Smart Editor (SMARTEDIT). The SMARTHELP module provides personalized smart help and communication services for the PWD at work. The smart help services mainly enable the PWD to get help information about locations, directions, building information, etc. The smart communications enable the PWD to make calls using Voice over IP (VoIP) to colleagues and other persons in the building to ask for help or intervention. In addition, the SMARTHELP also supports an Auto Emergency Response to assist the PWD in getting immediate help through the Auto Emergency server, as well as getting personal assistance from a caregiver.

The SMARTEDIT module is a multimodal editor interface that provides editing capabilities for a wide spectrum of PWD groups. The SMARTUNIVERS and its two component modules will make use of voice and speech recognition engines, text to speech, virtual mouse/keyboard, and a Braille keyboard to cover the requirements of the defined PWD groups with various combinations of impairments, including physically disabled persons (PDP), partially blind or low vision, deaf, and mute persons, and combinations of these kinds of disabilities, in a customized adaptive way. PWD user profile setup is a part common to SMARTHELP and SMARTEDIT; it supports initial and customized profile setup and adapts the interface display parameters (color, fonts, etc.) and environment (speaker volume, mic volume, virtual mouse, and virtual keyboard) to the requirements of the PWD based on the predefined and stored profile.
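To make this adaptation mechanism concrete, the following minimal Python sketch (all names are illustrative, not the actual SMARTUNIVERS code) shows how a stored PWD profile could drive the display and environment parameters:

```python
from dataclasses import dataclass

@dataclass
class PWDProfile:
    """Stored profile that drives interface adaptation (illustrative)."""
    user_id: str
    group_id: int               # one of the 11 impairment groups (Table 1)
    font_size: int = 12
    font_color: str = "black"
    background: str = "white"
    speaker_volume: int = 50    # 0-100
    mic_volume: int = 50        # 0-100
    use_virtual_keyboard: bool = False
    use_braille_keyboard: bool = False

def apply_profile(profile: PWDProfile, ui) -> None:
    """Push the stored parameters into the interface; ui is an assumed
    wrapper object around the actual window toolkit."""
    ui.set_font(size=profile.font_size, color=profile.font_color)
    ui.set_background(profile.background)
    ui.set_audio(speaker=profile.speaker_volume, mic=profile.mic_volume)
    if profile.use_virtual_keyboard:
        ui.enable_virtual_keyboard()
    if profile.use_braille_keyboard:
        ui.enable_braille_keyboard()
```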

Developing a unified interface that is relevant for many users, especially people with disabilities, is a challenging task, as it requires knowing the needs of different groups and adjusting the interface accordingly. We have identified 11 groups that are relevant for PWDs, supporting combinations of different impairment conditions including visual, hearing, speaking, and motor impairments. Different interface parameters will be modified to satisfy these conditions, including mic and speaker volume level; font type, color, and size; window background and foreground color; and window size for commands and displayed results. Table 1 presents the 11 groups according to different impairment conditions; some combinations have been eliminated because the user must see or partially see, as we do not target the blind group in this project. For each group, following the PWD group standards guidelines, a set of defined interface design parameters, including text type, size and color, volume level, and window color, is adjusted dynamically.

Table 1. The identified 11 groups of various impairments
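One plausible way to realize Table 1 in software is a lookup table from group id to default interface settings. The sketch below uses hypothetical values for three of the eleven groups; the real values would follow the PWD group standards guidelines mentioned above:

```python
# Hypothetical defaults for three of the eleven groups; the real values
# would be taken from the PWD group standards guidelines.
GROUP_DEFAULTS = {
    1: {  # e.g. low vision: large high-contrast text, audio enabled
        "font_size": 24, "font_color": "yellow", "background": "black",
        "speaker_volume": 60, "mic_volume": 50,
    },
    2: {  # e.g. deaf: visual channel only, audio muted
        "font_size": 12, "font_color": "black", "background": "white",
        "speaker_volume": 0, "mic_volume": 0,
    },
    3: {  # e.g. motor impairment: voice is the primary input channel
        "font_size": 12, "font_color": "black", "background": "white",
        "speaker_volume": 50, "mic_volume": 70,
    },
}

def defaults_for(group_id: int) -> dict:
    """Return a copy of the default environment setup for a group."""
    return dict(GROUP_DEFAULTS[group_id])
```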

3.1 SMARTUNIVERS' User Interaction and Flow of Control

Figure 1 illustrates the use case diagram of the SMARTUNIVERS. As indicated, there are five use cases that constitute the SMARTUNIVERS modules: Enter user profile, Customize environmental setup, Start SMARTHELP, Start SMARTEDIT, and Get user help. The user triggers all five use cases (red lines), and the application Admin interacts with three of them to provide help and support to the user and, in some cases, to assist in user profiling and pre-established environmental setup. The SMARTUNIVERS interface communicates with an integrated speech recognition engine. It also interacts with the RFID reader, which adds automatic identification of the user profile. The SMARTUNIVERS database contains all entities related to these use cases.

Fig. 1. Use case of PWD user's interaction with SMARTUNIVERS functions

3.2 The Flow of Control of the Unified Interface

The flow chart in Fig. 2 describes how users set up their profile and environment. Users are supposed to have an account in order to use the system; on first use, the system requests them to enter their details, which are then used to build the user profile, as shown in Figs. 3, 4, 5 and 6. Once the user has entered his/her details, he/she can log in to the system and use it with the default environment setup for the group he/she belongs to. In addition to the dynamic interface setup, security has been considered in the interface so that only authenticated users can use the program and connect to the network, as shown in Figs. 7 and 8. A user can be authenticated through three different methods: RFID with an optional security keyword, normal login with a password, or voice recognition (Figs. 4 and 5).
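The three authentication methods can be pictured as a simple dispatcher. The following Python sketch is illustrative only, with placeholder verification helpers rather than the project's real code:

```python
# Placeholder verification helpers; the real system would query the
# SMARTUNIVERS database and the speech/RFID engines.
def verify_rfid(tag, keyword=None) -> bool:
    return tag is not None                    # placeholder check

def verify_password(user, password) -> bool:
    return bool(user and password)            # placeholder check

def verify_voiceprint(sample) -> bool:
    return sample is not None                 # placeholder check

def authenticate(method: str, **cred) -> bool:
    """Dispatch to one of the three supported authentication methods."""
    if method == "rfid":        # RFID tag plus optional security keyword
        return verify_rfid(cred.get("tag"), cred.get("keyword"))
    if method == "password":    # normal login with a password
        return verify_password(cred.get("user"), cred.get("password"))
    if method == "voice":       # speaker verification on a voice sample
        return verify_voiceprint(cred.get("sample"))
    raise ValueError(f"unknown authentication method: {method}")
```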

Fig. 2. The control flow chart of user interaction

Fig. 3. Defining the user's basic group window

Fig. 4. Adjusting font size options

Fig. 5. Adjusting speaker level

Fig. 6. User login via RFID detection and keyword

Fig. 7. The unified universal interface main control window

Fig. 8. User profiler module window

3.3 Main Unified Interface

The main user interface is shown in Fig. 7, where the user can customize his/her user profile as well as the environment setup, as shown in Figs. 8 and 9. Through window 8 (Fig. 8), the common interface serves the two applications (Smart Help and Smart Editor); it identifies the user, the user's group, and the settings needed by that group to be able to use the applications.

Fig. 9. Environmental setup module window

Common Interface Components:

User Profiler.

Contains user information (name, phone, age, etc.), the PWD group the user belongs to, and whether the user will use the default settings of that group. Through this module window 8 (Fig. 8), we can perform the following options (a persistence sketch follows the list):

  • Add a new user profile

  • Edit current user profile
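As an illustration of how these two options could be persisted, the following hypothetical sketch assumes a SQLite table named profiles with the columns shown; it is not the actual SMARTUNIVERS database schema:

```python
import sqlite3

# Assumed table: profiles(user_id, name, phone, age, group_id, use_defaults)
def add_profile(db: sqlite3.Connection, user_id: str, name: str, phone: str,
                age: int, group_id: int, use_group_defaults: bool = True) -> None:
    """Add a new user profile row."""
    db.execute(
        "INSERT INTO profiles (user_id, name, phone, age, group_id, use_defaults)"
        " VALUES (?, ?, ?, ?, ?, ?)",
        (user_id, name, phone, age, group_id, int(use_group_defaults)),
    )
    db.commit()

def edit_profile(db: sqlite3.Connection, user_id: str, **changes) -> None:
    """Edit selected fields of the current user profile."""
    allowed = {"name", "phone", "age", "group_id", "use_defaults"}
    fields = {k: v for k, v in changes.items() if k in allowed}
    if not fields:
        return
    assignments = ", ".join(f"{k} = ?" for k in fields)
    db.execute(f"UPDATE profiles SET {assignments} WHERE user_id = ?",
               (*fields.values(), user_id))
    db.commit()
```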

Environment Setup.

Contains the settings that help the PWD group use the application (input volume level, output volume level, enable input through the mic, enable output through the speakers, font size, and font color), as shown in window 9 (Fig. 9). For each group there is a default setting, and the user can customize one for himself/herself; at any time, if the user needs to return to the default settings, this can be done through the Restore Defaults button in the Environment Setup window, as modeled below.
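A minimal model of this behavior, including the Restore Defaults button's handler, might look as follows (illustrative names only):

```python
class EnvironmentSetup:
    """Model behind the Environment Setup window (illustrative names)."""

    def __init__(self, group_defaults: dict):
        self._defaults = dict(group_defaults)  # per-group default settings
        self.settings = dict(group_defaults)   # current, customizable copy

    def customize(self, **changes) -> None:
        """Apply the user's personal overrides (volume, font, colors...)."""
        self.settings.update(changes)

    def restore_defaults(self) -> None:
        """Handler for the 'Restore Defaults' button."""
        self.settings = dict(self._defaults)

# Example usage with hypothetical values:
env = EnvironmentSetup({"font_size": 12, "speaker_volume": 50})
env.customize(font_size=18, speaker_volume=80)
env.restore_defaults()   # back to the group's default setting
```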

Smart Editor.

Used to open the Smart Editor application in another window while the common window keeps running; it sends the user id to the Smart Editor application, which retrieves all information and settings about that user from the database.
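One simple way to realize this launch-and-hand-off behavior is to spawn the application as a separate process and pass the user id on the command line. The sketch below is an assumption about the mechanism (the script name is hypothetical), and the same pattern would apply to Smart Help, described next:

```python
import subprocess
import sys

def launch_app(script: str, user_id: str) -> subprocess.Popen:
    """Open Smart Editor (or Smart Help) in its own window/process while
    the common window keeps running; the child process loads the user's
    settings from the database using the passed user id."""
    return subprocess.Popen([sys.executable, script, "--user-id", user_id])

# e.g. launch_app("smart_editor.py", "u042")   # hypothetical script name
```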

Smart Help.

Used to open the Smart Help application in another window while the common window keeps running; it sends the user id to the Smart Help application, which retrieves all information and settings about that user from the database. This has been covered by Kbar & Aly in [15].

Help.

Used to open a document describing how to use the common window and its functions. The user can read this document or listen to it according to his/her needs.

Close.

Used to close the common window itself.

4 Considerations for the Designed SMARTUNIVERS

In order to optimize our universal interface design, several considerations and standards for the design of user interfaces have been taken into account [16]. Among the most significant of these are the following usability considerations:

  • Usability testing with real PWD users. Giving typical users some tasks to perform and recording what they do and what they think of the resource.

  • Usability evaluations of SMARTUNIVERS by experts. They might make use of formal guidelines, checklists or questions (e.g. 'usability inspections' or 'heuristic evaluation').

  • Gathering PWD user feedback. These approaches involve seeking feedback from users after they've used the resource.

  • Usage logging. A lot of useful information is recorded automatically by the server or software used to deliver your resource.

Pertaining to the above usability requirements, our design is based on a flexible dynamic interface that adapts to different impairment conditions to satisfy the needs of PWD users. In addition, usability has received extra consideration through review by an expert in disability. We also use logging to track the performance of the interface and optimize the design to satisfy PWD requirement conditions: different logging statements will be recorded to track the performance of the system, as well as to collect automatic feedback from users, so that we can further analyze them and improve the interface.
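As an illustration, usage logging of this kind could be implemented with standard logging facilities; the event names and fields below are hypothetical:

```python
import logging

logging.basicConfig(filename="smartunivers_usage.log",
                    format="%(asctime)s %(levelname)s %(message)s",
                    level=logging.INFO)
log = logging.getLogger("smartunivers")

def log_interaction(user_id: str, group_id: int, event: str,
                    duration_ms: int) -> None:
    """Record one interaction event for later usability analysis."""
    log.info("user=%s group=%d event=%s duration_ms=%d",
             user_id, group_id, event, duration_ms)

# Example: completing the profile window took 1.8 s for this user.
log_interaction("u042", 3, "profile_setup_completed", 1800)
```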

On the other hand, we also considered ISO 13407, which focuses on the processes involved in developing a high-quality and usable interface. It advocates four main steps:

  • Specify the context of use. Understand who will be using the resource and how they will be using it.

  • Specify user, organizational requirements, and the tasks that must be supported.

  • Produce design solutions to meet the requirements identified in Step 2.

  • Evaluate designs against user requirements. Check that the development does in fact meet the requirements and targets you have identified in earlier steps.

Responding to the above requirements, we have followed the four guidelines for a high-quality user interface as specified by ISO 13407: the whole interface has been designed according to the needs of PWDs to satisfy their requirement conditions at the workplace, as described in the previous section. We are planning to involve PWD users in evaluating the interface and giving us their feedback, in addition to involving them in testing the interface once a prototype has been built. This will allow us to improve the design to satisfy PWD needs. Note that our design supports 11 groups with different impairment conditions through an adaptable unified interface that adjusts the interface environment setup according to the user's group conditions as well as the working environment conditions associated with floor location, weather, and day conditions.
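A hypothetical sketch of such context-driven adjustment is given below; the specific rules (contrast scheme by light conditions, volume by floor) are invented for illustration only:

```python
def adjust_for_context(settings: dict, floor: int, weather: str,
                       is_daytime: bool) -> dict:
    """Tune a group's settings to the workplace context (invented rules)."""
    adjusted = dict(settings)
    if weather == "sunny" and is_daytime:      # high ambient light
        adjusted["background"] = "white"
        adjusted["font_color"] = "black"
    else:                                      # low-light contrast scheme
        adjusted["background"] = "black"
        adjusted["font_color"] = "yellow"
    if floor > 1:                              # e.g. noisier upper floors
        adjusted["speaker_volume"] = min(
            100, adjusted.get("speaker_volume", 50) + 10)
    return adjusted
```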

A further significant issue that we considered is ensuring the accessibility of the developed SMARTUNIVERS. The Disability Discrimination Act gave certain rights to people with disabilities in the areas of employment, housing, and access to goods or services, but it excluded education.

Regarding the accessibility requirement, our design considers accessibility for PWD at the work environment by supporting both desktops and smartphones. The interface will be set up dynamically according to the user and group profile. In addition, PWD users can customize the environment setup according to their preferences. Two main programs will be supported to maximize the accessibility of users at the work environment: Smart Help, which allows users to search for relevant information at the workplace such as locations, building information, and employee information, and through which PWD users will also be able to communicate with other users and caregivers as well as set up personal notes and reminders about future activities; and Smart Editor. Both Smart Help and Smart Editor can be driven by users through voice control, mouse, and keyboard.

5 Conclusion

The proposed universal interface paves the road toward the inclusion of various groups of disabled persons with different combinations of impairments. We have presented SMARTUNIVERS, a smart universal interface that suits eleven groups of persons with various kinds of disabilities. It provides two main smart solutions: the smart help (SMARTHELP) and the smart editor (SMARTEDIT). In this paper, we have focused only on the interaction and control of the SMARTUNIVERS. The SMARTUNIVERS will make use of voice and speech recognition engines, text to speech, virtual mouse/keyboard, and a Braille keyboard to cover the requirements of a wide spectrum of defined PWD groups. The proposed solution will smartly identify the PWD user profile and adapt the interface display parameters (color, fonts, etc.) and environment (speaker volume, mic volume, virtual mouse, and virtual keyboard) to the requirements of the PWD based on the predefined and stored profile. Finally, the SMARTUNIVERS satisfies the usability and accessibility requirements of the defined PWD groups.