
1 Introduction

It is often said that we are living in the Information and Communication Technology (ICT) age, where complex information must be quickly and easily accessible. This is evident in the emergence of smartphones, where new and sophisticated electronics make the technologies of daily use smaller and more powerful. These new technologies have the potential to be used by anyone, irrespective of age, gender, location, nationality, disability or time considerations [1].

This paper discusses the Sensory Evaluation method, which may be used to measure one aspect of a happy life, in line with the session theme: “New Well-Being Measures in HCI.”

2 Well-Being and User Experience

Recent discussions in the field of ergonomics have shifted from usability to User Experience (UX) [2]. The session topic of well-being may be close in meaning to this concept. UX can generally be considered the broader category under which usability of the user interface (UI) falls. UX comprises human perceptions and responses that result from the use or anticipated use of a system, product or service, before, during and after use. It includes user emotions, preferences, perceptions, physical and psychological responses, behaviours and accomplishments. It is affected by prior experiences, attitudes, skills, personality and the context of use [3]. In this sense, well-being measures may be used as criteria to assess aspects of user experience.

Figure 1 illustrates the potential relationship between well-being and user experience. The ISO defines usability as the extent to which a system, product or service can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use [3]. Accessibility is the usability of a product, service, environment or facility by people with the widest range of capabilities [4]. Universal design, as well as inclusive design, encompasses both usability and accessibility. User experience is a person’s perceptions and responses that result from the use and/or anticipated use of a product, system or service [5]. In summary, accessibility covers physical factors, such as the reliability and functionality of use; usability covers cognitive and perceptual factors, such as the usefulness and effectiveness of use; and UX adds emotional factors [6], such as the attractiveness and comfort of use. Hence, well-being involves all of these qualities of use: functional, reliable, effective, useful, comfortable and attractive. Sensory Evaluation, the method discussed in this paper, focuses on the attractiveness, comfort and usefulness aspects of human emotion in the evaluation of user experience.

Fig. 1. Potential relationship between well-being and UX

3 Human-Centred Design

Product design in industry must start from an initial perception of user needs. At present, however, design resources tend to be derived from proprietary technologies. For example, the smartphone is an all-purpose device with many features and functions; however, only a limited number of users are able to use these functions thoroughly. This is because the experimental and manufacturing development stages tend to be based on predetermined target specifications created and measured solely by experts. The products are then refined by Value Engineering (VE) for cost [7] and shipped to the market, which may be the first opportunity actual users have to examine the device and determine its usability. Therefore, needed feature requests and vital feedback from end users become available to the designers only after the device has been introduced into the real market.

Human-centred design (HCD) is especially relevant to the usability, accessibility and user experience of products. HCD became an international standard as ISO 9241-210:2010, Ergonomics of human-system interaction, Part 210: Human-centred design for interactive systems (formerly ISO 13407) [5]. Nowadays many manufacturers apply it to their development processes, bringing innovative concepts into production plans and designs and gathering up-to-date feedback from end users earlier in the design process. HCD is based on the concept of context of use, which is standardized in ISO 9241-11:1998, Guidance on usability. The context of use is the combination of specified users, tasks, resources, environment and the goal as the intended outcome [3].

4 Sensory Evaluation Method

In this study, the Sensory Evaluation (SE) method was applied to examine the emotional and perceptual attributes of products [6]. The research was started in order to measure the context of universal communication through local sign languages by applying correspondence analysis (CA), a multivariate analysis (MVA) technique, in SPSS [8]. Sign languages were originally designed for use by hearing-impaired people, and they include semantic expressions within their scope.

In this project, an original method for creating pictograms based on seven multiplex local sign languages, Japanese (JSL), American (ASL), British (BSL), French (FSL), Spanish (ESL), Korean (KSL) and Chinese (CSL), is discussed; it applies the HCD concept of context of use to dialogue together with MVA [9, 10].

In this paper, “thank you” in several national sign languages is presented and discussed as an application example. The overall research initially focused on the creation of pictograms or icons to support dialogue, since the fundamentals of sign language are hand shape, location and motion. Reference is made to a collection of animation figures, extracted from the seven local sign languages, that were provided by a deaf architect. This architect enthusiastically supported this research by supplying these references and permitting them to be added to the database [11].

To evaluate this approach, the SE method was applied in the following three steps. The first step was to measure the similarity of a selected word, “thank you”, among seven different local sign languages using MVA (Fig. 2). In the experiments, the participants were first shown an expression from the collection of animation figures extracted from the seven local sign languages. The participants were then informed of the meaning of the sign and asked to distribute 19 tokens among the seven local sign-language expressions (samples) to indicate which best coincided with the original image. They were asked to use all 19 tokens but were permitted to assign zero tokens to some samples (Figs. 3 and 4). The participants in the first experiment were 13 students in their twenties; some had experience living overseas or with sign language interpreting.
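
As a concrete illustration of how such token votes can be organized for analysis, the sketch below builds a participants-by-languages contingency table; the participant labels, the token counts and the pandas-based layout are assumptions for illustration and are not taken from the study data.

```python
import pandas as pd

# Hypothetical token-vote table: one row per participant, one column per
# local sign language, each cell holding the number of tokens (out of 19)
# that the participant assigned to that sample.
languages = ["JSL", "ASL", "BSL", "FSL", "ESL", "KSL", "CSL"]

votes = pd.DataFrame(
    [
        [2, 3, 5, 4, 3, 1, 1],   # participant P01 (illustrative values)
        [1, 4, 6, 4, 2, 1, 1],   # participant P02
        [3, 2, 4, 5, 3, 1, 1],   # participant P03
        # ... remaining participants
    ],
    columns=languages,
    index=["P01", "P02", "P03"],
)

# Every participant spends exactly 19 tokens; zero votes on a sample are allowed.
assert (votes.sum(axis=1) == 19).all()
assert (votes >= 0).all().all()
```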

Fig. 2. Sign figures for “thank you.”

Fig. 3. Inquiry sheet example for “thank you”

Fig. 4. A view of the experiment and a participant

For the analysis, correspondence analysis (CA), an MVA technique, was applied in IBM SPSS Statistics Ver. 18 [8]. The outcome was plotted on a plane such that similar local sign languages appear close together (Fig. 5). The eigenvalue axes produced by CA carry no inherent dimension; only the relative positions of the points are meaningful. Because of the characteristics of CA, participants who hold general and standard ideas are positioned near the centre, whereas those who hold extreme or specialized ideas are positioned away from the centre. The crossing point (0, 0) of the first and second eigenvalue axes is also called the “centre of gravity” or “average”. In this way, CA provides a graphical examination of the relationships between local sign languages and participants [12].
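
As a rough sketch of this analysis step, the following code performs a basic correspondence analysis of such a vote table with NumPy rather than SPSS; the function name, the use of the hypothetical `votes` table from the earlier sketch, and the plotting remarks are assumptions for illustration only.

```python
import numpy as np

def correspondence_analysis(counts):
    """Basic correspondence analysis of a participants x languages count table.

    Returns the principal coordinates of the rows (participants) and the
    columns (sign languages), together with the singular values.
    """
    N = np.asarray(counts, dtype=float)
    P = N / N.sum()                                       # correspondence matrix
    r = P.sum(axis=1)                                     # row masses
    c = P.sum(axis=0)                                     # column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))    # standardized residuals
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    row_coords = (U * sv) / np.sqrt(r)[:, None]           # row principal coordinates
    col_coords = (Vt.T * sv) / np.sqrt(c)[:, None]        # column principal coordinates
    return row_coords, col_coords, sv

# Hypothetical usage with the `votes` table from the previous sketch:
# row_xy, col_xy, sv = correspondence_analysis(votes.values)
# Plotting the first two columns of col_xy and row_xy yields a map like Fig. 5:
# similar sign languages and like-minded participants lie close together, and
# the origin (0, 0) is the "centre of gravity".
```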

Fig. 5. A plot for the “thank you” results with seven sign languages

Figure 5 is a plot diagram of the “thank you” results and represents the relations between the seven sign languages (samples) and the participants. Notably, the samples split into two groups. One group consisted of BSL, ESL, FSL and ASL, the western sign languages; the other consisted of JSL, KSL and CSL, the Asian sign languages.

The second step was to help a pictogram designer create a new common pictogram by exploiting and summarizing the expressions identified in the first-step analysis. Figure 6 shows the newly created pictogram, which refers only to the BSL and FSL sign languages.

Fig. 6. A newly created pictogram: “thank you” that combines BSL and FSL

5 Results and Consideration

The final step was to validate the newly created pictogram with MVA. The outcome, including the newly designed pictogram, was plotted together with the seven local sign languages in order to measure whether the new pictogram was representative of the dominant cluster. The participants in the second experiment were 20 engineering students in their twenties, including two female students; all except three had not taken part in the first experiment. After voting, with 23 tokens this time, all the participants were again asked to rate their confidence level using the Semantic Differential (SD) method [9, 10].

Figure 7 is an example of an outcome chart in which the new and original “thank you” signs are plotted together. The newly designed symbol, marked with a green flag, was plotted close to FSL and BSL, indicating that it represented those two sign languages. As the figure shows, ASL and ESL were positioned closer to the green flag, whereas JSL, KSL and CSL were plotted further down. As in the first step, two split sample groups emerged.

Fig. 7. A supplementary treatment plot of the new “thank you” showing 7 + 1 sign languages
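
The supplementary treatment behind Fig. 7 can be approximated as follows: the votes for the new pictogram are projected into the existing CA space as a supplementary column, so the original map is not re-fitted. The sketch below reuses the quantities from the correspondence_analysis sketch above and simplifies the actual procedure (the second experiment used a new participant group and 23 tokens), so it is illustrative only.

```python
import numpy as np

def project_supplementary_column(new_counts, row_coords, singular_values):
    """Project a supplementary column (votes for the new pictogram) into an
    existing CA solution via the transition formula
    g_k = (1 / sigma_k) * sum_i(profile_i * f_ik)."""
    profile = np.asarray(new_counts, dtype=float)
    profile = profile / profile.sum()                 # column profile
    return (profile @ row_coords) / singular_values   # supplementary coordinates

# Hypothetical usage: `new_votes` holds each participant's tokens for the newly
# created pictogram; row_xy and sv come from the earlier CA sketch.
# new_xy = project_supplementary_column(new_votes, row_xy, sv)[:2]
# Plotting new_xy on the existing map marks the point shown as a green flag in Fig. 7.
```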

6 Application for Smartphones

The resulting pictograms/icons were implemented as an application on smartphones with touch screens (Fig. 8). The final system was evaluated by hearing-impaired participants and by foreigners, comparing qualitative and quantitative measures of effectiveness, efficiency and satisfaction based on the context of use (Fig. 9). This evaluation focused particularly on efficiency by comparing two task groups. Initial results suggest that tapping pictograms/icons on the smartphone is about five times quicker than composing a text message by e-mail.
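
As a simple illustration of how the efficiency comparison can be expressed, task completion times for the two input methods can be reduced to a speed-up ratio; the times below are placeholders, not measured values from the evaluation.

```python
# Hypothetical completion times (seconds) for the same set of messages.
pictogram_times = [12.0, 9.5, 14.2, 11.1]    # tapping pictograms/icons
email_times = [61.0, 48.3, 70.4, 55.9]       # typing an e-mail text message

mean_pictogram = sum(pictogram_times) / len(pictogram_times)
mean_email = sum(email_times) / len(email_times)

# Relative efficiency: how many times quicker the pictogram input is.
speedup = mean_email / mean_pictogram
print(f"Pictogram input is about {speedup:.1f}x quicker than e-mail text")
```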

Fig. 8. Pictograms on a touchscreen-based smartphone

Fig. 9. The evaluation experimental setup

7 Conclusions and Future Work

This work shows how the Sensory Evaluation (SE) method can easily make relative comparisons among the seven expressions of local sign languages. It is more effective than the Ordering Method or the Pair Comparison Method [12]: because of the characteristics of CA of MVA in SPSS, participants who have common and/or standard ideas are positioned near the centre, whereas those who have extreme or specialized ideas are positioned away from the centre. In this way, CA establishes a method to graphically examine the relationship between local sign languages and the preferences of the participants.

Through the SE method, the relationships between the selected words and the local sign languages were first explained by a sensory evaluation of the participants. Although this paper discussed the SE method as used for the analysis of multiplex local sign languages, it can be applied extensively to other systems, products or services as well, which future work will explore. For instance, the Customer Satisfaction Index (CSI) [13] may be another approach to measuring some aspect of well-being. Future research could explore how to create complementary representations using SE and CSI together.