1 Introduction

As more and more smart products enter our daily lives, users are expected to build closer connections with them. One such product is the smart bike. Smart bikes can collect data that helps organize biking more efficiently. However, users call for a more connective interaction that creates empathy with the smart bike, rather than just scanning numbers on a screen. This requirement creates an opportunity for adaptive interaction that responds to the user’s habits and environment.

Drawing on research into how humans develop emotional intelligence and form their personality, we propose a new interaction framework named machine personality. In this framework, the smart product adjusts its interaction mode to adapt to the user’s environment and habits, collecting data from the user and their surroundings to gradually form its own personality. Moreover, the product reacts not only to the user but also to other smart products with machine personality, creating a more connective interaction.

We apply this framework to UMA-P, an interaction product mounted on a smart bike. UMA-P collects data on the user’s biking habits with an accelerometer and on environmental noise with a sound sensor. Based on this data it forms its own machine personality with three different actions, and each time performs one action with LEDs as a facial expression. The user can give feedback on actions under a certain machine personality via a touch sensor, and UMA-P tries to perform those actions that users have previously scored highly. Furthermore, it searches for other UMA-Ps in the surrounding area and responds to them with different LED animations.

A pilot test with users suggests that the product gives users a more interesting and connective biking experience that encourages them to use the smart bike more often.

2 Background

2.1 Robot Expressions for Empathy

One workable way for social robots to create empathy is to partly mimic human behaviors in an exaggerated way that evokes a playful experience. In this way users can empathize with social robots without encountering the uncanny valley phenomenon [6]. Keepon [3] and Leonardo [9] were early prototypes of this mindset, with abstract motions and facial expressions that had inner connections to human behaviors. NAO [7] and Jibo [2] became successful commercial products by responding meaningfully to human commands in a playful and cute way. These examples show that by abstracting human behavior patterns and performing them as metaphors, social robots can achieve connective interaction with humans.

2.2 Emotional Intelligence

Emotional intelligence is, by definition, a response to emotional information. Emotion can be considered a reaction to the external environment, such as threat, cooperation or play; different reaction signals carry different emotional information. The characteristic way in which a human responds is known as personality [5]. Neurologists have found that emotional information processing is based on reinforcement learning [4]. These findings inspired us to reconstruct emotion on the bike.

2.3 Creating Playful Adaptive Interaction

Several related works illustrate adaptation to user habits or the environment. Situated Apparel shows how to visualize information with LEDs in order to enhance affective communication; interestingly, it could show lights in line with the environment the user had been in [8]. Another work, by Kim Baraka and Manuela Veloso, exemplifies modeling different users according to their interaction habits with a robot: by analyzing dynamic user preferences, the robot could divide users into three groups and present different interactions [1]. These works also integrate the concept of affective computing, which was inspirational to our work.

3 Machine Personality

3.1 Framework

As mentioned above, social robots are well developed in providing metaphors of human behavior to create connective interactions. Smart products, however, bring different contexts that might require new methodologies. The use of some products, for example smart bikes, can be affected by the outdoor environment and by the user’s habits. Furthermore, social interaction between smart products themselves is not thoroughly considered in traditional frameworks.

To fill these gaps, we developed a new framework called machine personality, based on knowledge of emotional intelligence and the forming of human personality. We define machine personality as follows:

The smart product forms its own interaction mode, its machine personality, based on the user’s habits and environmental variables during use. This mode is applied to human-machine and machine-machine interaction through a few actions that meet the user’s preferences, and feedback is collected to improve the user’s experience.

Here are some explanations of these concepts within the framework:

Action.

A concrete interaction under a certain machine personality. One machine personality can have several corresponding actions with a reasonable inner connection. For example, if the machine personality is ‘irritated’, all the actions show red LEDs, each in a slightly different shade of red (see Fig. 3).

Environmental Variables.

Natural variables in the environment such as light, humidity, temperature and noise. By relating environmental variables to the interaction system, the user better understands the context of the machine personality. In our prototype we used a sound sensor to judge the noise of the biking environment.

User Habit.

The user’s preferences in using the functional part of the system. This functional part can be related to the interaction but can also be unrelated. For example, braking frequency is one user habit on the smart bike.

Companion.

Other smart products that have machine personality. More concretely, interaction products with machine personality check whether other machine-personality products are nearby to determine their interaction. We used Wi-Fi modules with specially coded SSIDs for this purpose (Fig. 4).
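As an illustration, companion detection by SSID can be as simple as counting coded network names in a scan result. This is only a sketch: the `UMA-P-` prefix and the sample scan below are assumptions, not the actual encoding used in the prototype.

```python
# Hypothetical sketch: count nearby companions by a coded SSID prefix.
# The prefix "UMA-P-" and the sample scan are illustrative assumptions,
# not the real SSID encoding.

COMPANION_PREFIX = "UMA-P-"

def count_companions(visible_ssids):
    """Count devices whose SSID carries the companion code."""
    return sum(1 for ssid in visible_ssids if ssid.startswith(COMPANION_PREFIX))

# Two companions hidden among ordinary networks.
scan = ["HomeWiFi", "UMA-P-3F2A", "eduroam", "UMA-P-91BC"]
print(count_companions(scan))  # prints 2
```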

3.2 Workflow

The interaction system that integrates machine personality has three parts: the input system, the processor and the output system. The system works as follows (Fig. 5):

  1. As the system starts, it collects data on environmental variables and user habits, which shape the forming of the machine personality.

  2. The system picks a machine personality to perform by judging the historical averages of the environmental variables and user habits.

  3. Next, it seeks nearby companions with the companion sensor.

  4. An action under the selected machine personality is then chosen and performed according to historical user feedback and the number of companions.

  5. Finally, user feedback on this action is collected and the next loop starts.
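The selection steps of this loop can be sketched in code. This is a minimal illustration, not the authors' implementation: the thresholds, sample values, and two of the four personality names are assumptions (the paper itself names only 'irritated' and 'calm').

```python
import random

# Illustrative sketch of one pass of the machine-personality workflow.
# Thresholds and sample averages are made up; 'alert' and 'excited' are
# assumed names for the two personalities the paper does not name.

PERSONALITIES = {
    # (noise is high, motion is high) -> one of four personalities
    (True, True): "irritated",
    (True, False): "alert",     # assumed name
    (False, True): "excited",   # assumed name
    (False, False): "calm",
}

def pick_personality(avg_noise, avg_motion, noise_thr=60.0, motion_thr=1.5):
    """Judge the historical averages against preset thresholds."""
    return PERSONALITIES[(avg_noise > noise_thr, avg_motion > motion_thr)]

def pick_action(actions, weights):
    """Weighted random pick: higher historical feedback, higher chance."""
    return random.choices(actions, weights=weights, k=1)[0]

# Made-up historical averages and feedback weights.
personality = pick_personality(avg_noise=72.0, avg_motion=0.8)
action = pick_action(["fast_pulse", "slow_fade", "blink"], [0.5, 0.3, 0.2])
print(personality, action)
```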

3.3 Innovations

The innovations of this framework are as follows:

  1. The framework encompasses both human-machine and machine-machine interaction.

  2. It takes environmental variables into consideration.

  3. It addresses different contexts than the traditional interaction framework for social robots.

  4. It does not rely on an internet connection for social interaction.

4 Prototype

The concept of machine personality was integrated into our prototype, UMA-P. Based on the framework of machine personality, the prototype works as follows:

  1. When the user starts biking, the system collects environmental variables with a sound sensor and user habits with an accelerometer. It adds this data to the historical data to compute a new average.

  2. The system compares the new average with preset thresholds and selects one of the four machine personalities.

  3. The companion sensor, a Wi-Fi module, scans for companions in the area: it reads the SSIDs of nearby devices and counts those with the specially enciphered SSID.

  4. If there is at least one companion around, the system starts the interaction between bikes, with red LEDs and a heart-shaped matrix representing shyness. If there is no companion around, it starts actions under the selected machine personality.

  5. Each action under a machine personality has a weight based on historical feedback. The weight determines the probability that the action is picked and performed. UMA-P’s actions comprise LED animations, color switches and matrix eye expressions. For example, actions under ‘irritated’ have red light and frowning eyes, while ‘calm’ has white light and plain-looking eyes (Figs. 1 and 2).

    Fig. 1. (Color figure online)

    Fig. 2. (Color figure online)

    Fig. 3. Each machine personality can have a few actions, which are concrete light animations. (Color figure online)

    Fig. 4. Relationship between environment, biking pattern and interaction style (‘personality’) selection in UMA-P.

    Fig. 5. The workflow of machine personality on UMA-P.

  6. Until the user gives feedback, the system continuously performs the same action. The user can touch the sensor on the handle to express appreciation of the action: the longer the touch, the higher the action scores. This score is added to the action’s average score and contributes to its new weight. After the user gives feedback, the current action is replaced by another action, picked by weighted random selection under the current machine personality, a process analogous to Q-learning. Then another loop starts.
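A minimal sketch of this feedback update follows. The paper does not give the exact update rule, so the duration cap and the learning rate are illustrative assumptions:

```python
# Sketch of the touch-feedback update: longer touches score higher, and
# the score nudges the action's weight toward it, Q-learning style. The
# 5-second cap and learning rate alpha are illustrative assumptions.

def score_from_touch(seconds, max_seconds=5.0):
    """Map touch duration to a score in [0, 1]."""
    return min(seconds, max_seconds) / max_seconds

def update_weight(old_weight, score, alpha=0.3):
    """Move the action's weight part of the way toward the latest score."""
    return old_weight + alpha * (score - old_weight)

w = 0.5                                      # current weight of the action
w = update_weight(w, score_from_touch(4.0))  # a long, approving touch
print(round(w, 3))                           # prints 0.59
```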

With this prototype the framework of machine personality is clearly demonstrated, contributing to a new biking experience (Fig. 6).

Fig. 6. UMA-P in use.

5 User Test

To evaluate the usability and user experience of UMA-P, we designed a user test conducted at the university. Users are first introduced to how UMA-P works according to their biking conditions. They then set up UMA-P on their smart bike and cycle around the university for half an hour. Afterwards, we interview them and give them a questionnaire to quantify UMA-P’s performance. Users rate their answers to the questions from 1 (negative feeling) to 7 (positive feeling). The questions are divided into the following categories:

  1. Industrial design: we ask users 6 questions about the ease of setting up UMA-P (Q1–Q2) and whether the design fits their understanding of a smart biking experience (Q3–Q6).

  2. Interaction design: we ask users 16 questions about the light animation. Some concern whether they understand the light interaction in the eye (Q10–Q12), in the ring (Q13–Q16), in the color (Q17–Q19) and in the companion interaction (Q20–Q22). Others concern their acceptance of the light animation on the bike (Q8–Q9) and whether it distracts them (Q17–Q19).

  3. Biking experience: we ask users 5 questions about their acceptance of seeing the smart bike as a pet (Q23–Q26) and whether it would encourage more biking (Q27).

In our pilot test, four smart bike users (2 male and 2 female, aged 20, 22, 21 and 23) were invited to use UMA-P and give their feedback. The quantitative results are shown in Fig. 7. Overall, users rated UMA-P 5.1, higher than the neutral point (4), which means our prototype was attractive to them to a certain extent. Their opinions on the light animations, however, differed widely. The companion interaction (Q20–Q22), rated 5.5, was the most appealing: users highly valued the fascinating experience of connective interaction with other users. The LED ring interaction (Q13–Q16) was rated lowest (4.375), suggesting the ring interaction was not intuitive in the biking context. For encouraging more biking (Q27), users gave a high score (5.75), meaning UMA-P did motivate them to bike more with the smart bike.

In the qualitative data we noticed that users were quite amazed by the eye interaction. They thought the eyes made UMA-P more vivid and easier to empathize with. Here are some comments from the interviews:

‘It’s an appealing interaction, especially the eye expression on the bike! Additionally the appearance and colors are corresponding to the smart bike. I can see it bring an intriguing biking to me in the near future.’ (Female User Lu)

‘The light, as fashionable as the bike, is attractive for young people because it’s changing during biking. What surprised me most is UMA-P integrated machine learning and has its own personality, which is surely a new concept. Also it can help me interact with other companions. Feels good! I will definitely use it when it’s on the market!’ (Male User Xu)

This feedback provided an interesting perspective on what users like most about UMA-P. They appreciated the fashionable and dynamic light interaction on the bike, and they made use of the information the light animations provided. The pilot user test also gave us the design insight that we need a more humanoid interface for users to feel playful during their biking. Minimalist design elements, such as LED rings, contribute little to the playful aspect. We suspect this is because a simple interface does not lead users to form empathy, just as it is hard to imagine a round stone speaking, since it has no mouth. Creating a ‘face’ metaphor on the smart bike will be more effective (Fig. 7).

Fig. 7. Quantitative results of the questionnaires. A-type questions cover the industrial design aspect, B-type questions the interaction design aspect and C-type questions the biking experience aspect.

6 Conclusion and Future Work

In this paper we introduced UMA-P, an interactive smart bike light for a playful biking experience. Through the metaphor of forming a personality, UMA-P presents useful biking information about the environment and user habits. UMA-P can also automatically respond to companions nearby, which meets users’ need for social interaction to a certain extent. The pilot user test showed that users were willing to bike more with UMA-P, and suggested that the next design should be more humanoid.

One limitation of the design is that integration with the smart bike is not finished. The eye animations are also not fully developed; we still need a co-creation session with users to design better eye animations. Moreover, the necessity of a tangible user interface instead of a phone screen should be discussed, since users did not appreciate the LED rings much.

The next step is to involve more users in a more quantified and persuasive user test. We want to apply persuasive technology in this product to motivate more use of the smart bike, which calls for more effort in experiment design and user tests. Furthermore, we are going to explore how to integrate machine personality into other forms of transportation.