Abstract
We propose a tactile expression mechanism that can make physical contact and provide direction indications. We previously proposed a wearable robot that can provide physical contact for elderly support in outdoor situations. In our current scheme, wearable message robots, mounted on the user’s upper arm, deliver messages to users, such as navigational information. Using physical contact can improve relationships between users and robots. However, our previous prototypes have a problem: the types of tactile expressions that the robots can make are limited. Thus, we propose a tactile expression mechanism using a pneumatic actuator array for wearable robots. Our proposed system consists of four pneumatic actuators and creates haptic stimuli such as direction indications as well as stroking of the user’s arm. Our wearable robots were originally designed to provide support and communication through two types of physical contact: notification and affection. Our proposed mechanism for physical contact and direction indications naturally extends not only the notification but also the affection abilities of the robot. Our robots and our proposed mechanism are expected to support the mobility of senior citizens by reducing their anxiety on outings.
1 Introduction
The world’s aging societies continue to face a variety of serious problems related to lifestyle changes, such as an increase in nuclear families. In these societies, the elderly and the disabled who are living alone sometimes need the support of caregivers to overcome their anxiety even when they just want to leave their homes for a walk or go shopping. However, due to a shortage of caregivers and volunteers and the emotional burden of seeking help, such people often withdraw from society. Such withdrawal from society causes even greater problems, including advanced dementia.
During outings, the elderly and the disabled face two main types of problems: physical problems, especially for seniors, caused by impaired body functions, and cognitive problems, such as memory loss and attention deficits. Both types of problems can result in serious accidents. The use of power suits [14] solves some of the physical problems by compensating for partial loss of strength. Cognitive problems, however, remain: people with dementia may forget the instructions for, or even the purpose of, their outings, which can be risky [11].
We focus on cognitive problems and solve them with robots. We are researching wearable message robots that can care for and support such people during their outings [18, 21]. Since our robots are wearable, they can provide support to users anytime and anywhere. In addition, since physical contact is intuitive and critical for communication, our robots have a mechanism for making two types of physical contact: notification and affection. Notifications are robot behaviors made when the robot wants to tell the user something, such as patting the user’s arm before saying something. Affections are behaviors that show the robot’s internal states, such as embracing the user’s arm. We believe that both behaviors are important for achieving natural communication and good relationships between users and robots [23].
Considering support in outdoor situations, indicating directions is also essential, for example, when the robot wants to convey the idea of “Please look over there.” Although pointing with the robot’s arm is a natural way to indicate directions, the degrees of freedom (DoFs) of our wearable robot arms are insufficient for such behaviors, and increasing the number of DoFs also increases the arm’s weight. Spoken messages are another natural way for robots to indicate directions. However, vocal information is sometimes difficult to hear and understand outside, especially in crowded situations. In addition, although a louder voice simplifies notification, personal messages might be embarrassing: “It’s time for a toilet break.” Thus, in this paper, we propose wearable robots that can not only make physical contact but also give direction indications.
2 Related Research
A variety of research exists on haptic stimuli as displays in mobile situations. Vibration stimuli have been investigated as feedback on the touchscreens of mobile devices [5]. Directional indication has also been studied using vibro-tactile devices [4, 6], gyro moments [1, 20], and a combination of skin stretch and vibro-tactile stimuli [2, 8]. These studies physically notify users of information.
Considerable research has also been conducted on anthropomorphic behaviors, such as the affection and attention of robots and agents [9], and on wearable haptic interfaces [3, 16]. The effectiveness of anthropomorphic expressions using pointing, facing, and gazing by robots and agents has been confirmed in various experiments [9, 24, 25]. Their multimodal behaviors are effective; however, these behaviors have been studied without physical contact by the robots.
Communication robots or agents as media have also been developed based on the premise of ongoing communication between people [12, 13]. Other schemes feature a wearable avatar robot on the shoulder [7]. A mobile-phone type robot was also proposed [10].
We have been researching support for elderly outings, especially toilet problems, through a toilet map acquisition system [19] and a toilet timing suggestion system [15]. To build an elderly support system on this research, appropriate mechanisms for transmitting the support information are also important. For this purpose, we proposed a wearable message robot that combines haptic stimuli with a robot’s anthropomorphic behaviors so that users feel actual physical contact from the robot [21, 23], as well as a simplified system [18].
In this paper, we propose a mechanism that can make not only physical contact but also provide direction indications for effective elderly support in outdoor situations.
3 Previous Prototypes of Our Wearable Robots with Physical Contact
First, we introduce the previous prototypes of our wearable robots [18, 21].
3.1 First Prototype
Figure 1 shows the first prototype of our wearable message robot [21]. As described above, both notification and affection are important for human-robot communication, and this prototype system performs both behaviors.
Figure 2 shows the system configuration of the first prototype, which consists of a stuffed-toy robot that includes sensors and actuators, and a fixing textile. The robot has two degrees of freedom (DoFs) in its head and one in its left hand. A 3D accelerometer with a 3D compass detects the activities of both the user and the robot, and a speaker is mounted inside the robot. A vibration motor is attached to the fixing textile for haptic stimuli. We also placed a capacitance-sensing antenna (as used in a theremin) on the lower part of the fixing strap to measure the thickness of the user’s clothing and adjust the strength of the haptic actuations.
The system’s fixed parts weigh about 350 g, including the stuffed-toy robot, the actuators, and the battery. However, since this prototype also requires a small PC (400 g), its total weight is about 800 g.
By simultaneously combining the motions of the robot and the haptic stimuli, our proposed system provides users with a feeling of physical contact from the robot. To express a notification, the robot repeatedly pats the user’s arm, while a short-term vibration simultaneously creates haptic stimuli to express the physical contact of the robot’s touch. This behavior is seen during a caregiver’s initial contact with a patient. To express affection, the robot turns its face toward the user, and a simultaneous pressure stimulus relays the physical contact of the robot’s hugging behavior.
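The pairing described above — a robot gesture synchronized with a haptic stimulus — can be sketched as follows. This is a minimal illustration, not the authors' implementation; the `Robot` and `Haptics` classes and their method names are hypothetical placeholders for the motor and vibration-motor interfaces.

```python
class Robot:
    """Hypothetical stand-in for the stuffed-toy robot's motor interface."""
    def pat_arm(self):
        return "pat"
    def face_user(self):
        return "face_user"

class Haptics:
    """Hypothetical stand-in for the haptic actuator in the fixing textile."""
    def vibrate(self, duration_s):
        return f"vibrate:{duration_s}"
    def press(self, duration_s):
        return f"press:{duration_s}"

def notify(robot, haptics, repeats=3):
    """Notification: repeated pats, each paired with a short vibration
    that expresses the physical contact of the robot's touch."""
    events = []
    for _ in range(repeats):
        events.append(robot.pat_arm())
        events.append(haptics.vibrate(0.2))  # duration is an assumed value
    return events

def show_affection(robot, haptics):
    """Affection: the robot turns its face toward the user while a
    sustained pressure stimulus conveys a hugging behavior."""
    return [robot.face_user(), haptics.press(1.5)]  # duration assumed

events = notify(Robot(), Haptics())
```

In a real system, the two event streams would be dispatched to the motors and the vibration motor concurrently; the sketch only captures the ordering and pairing of the behaviors.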
3.2 Second Prototype
Next, we introduce the second prototype of our wearable message robot [18], a simplified version that solves the weight and robustness problems of the first. The first prototype included several sensors and actuators and could realize various behaviors and tactile expressions. However, considering actual use by seniors or patients with dementia, such an elaborate system lacks robustness and is too heavy to wear every day. We therefore designed a simplified configuration of our message robot to achieve greater robustness and lighter weight.
Figures 3 and 4 show the appearance and configuration of the second prototype. In the simplified system, we make proactive use of a smartphone. Since recent smartphones are generally equipped with a triaxial accelerometer and a compass, we employ these sensors to estimate the user’s situation and activities. Since smartphones are also equipped with a global positioning system (GPS) receiver, we are investigating whether the location and velocity information obtained from the GPS can be exploited to estimate the user’s context. The robot includes a vibration motor for tactile presentation and a speaker for auditory presentation. These actuators are controlled by a small board PC (Raspberry Pi), which is connected to the smartphone through Wi-Fi or Bluetooth. A pocket on the fixation strap stores the smartphone. The robot weighs about 250 g (including a battery) and is about 18 cm tall. Thus, the entire system, including the robot and the smartphone, weighs about 350–400 g (most smartphones weigh less than 150 g).
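One simple way the smartphone's accelerometer could feed activity estimation is to classify the user as moving or stationary from the spread of acceleration-magnitude samples. The sketch below is an illustrative assumption, not the paper's actual estimator, and the threshold value is invented for the example.

```python
import statistics

def estimate_activity(accel_magnitudes, threshold=0.5):
    """Classify the user as 'moving' or 'stationary' from the sample
    standard deviation of accelerometer magnitudes (in g).

    The threshold (0.5 g) is an assumed illustrative value; a real
    system would calibrate it per user and sensor.
    """
    if len(accel_magnitudes) < 2:
        return "unknown"  # not enough samples to compute a spread
    spread = statistics.stdev(accel_magnitudes)
    return "moving" if spread > threshold else "stationary"
```

A richer estimator would combine this with compass heading and GPS velocity, as the paper suggests, but the same windowed-statistics structure applies.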
3.3 Problems of Previous Prototypes
As described in this section, we proposed two prototypes of wearable robots that can make physical contact. However, our previous prototypes have a problem because the types of physical contact that the robots can make are limited. Table 1 shows the tactile expressions of the previous prototypes. The first can make two expressions (one in notification and another in affection), and the second can only make one expression. Thus, in this paper, we propose a tactile expression mechanism that can make various physical contact expressions for wearable robots.
4 Tactile Expression Mechanism Using Pneumatic Actuator Array
Next, we describe our proposed tactile expression mechanism using a pneumatic actuator array for wearable robots.
Figure 5 shows the robot’s appearance with the proposed mechanism, and Fig. 6 shows the mechanism’s configuration for tactile expressions. The proposed mechanism serves as the fixing part of the robot. In the following description, we use the actuator numbers shown in Fig. 6.
Four actuators are arranged around the user’s arm. By shortening a subset of the actuators, various directions can be indicated. We employ pneumatic actuators (SQUSE PM-10RF) that shorten as their internal pressure increases. The overall system configuration for direction indications is shown in Fig. 7. The pneumatic actuators require a compressor (SQUSE ACP-100) and a pressure control unit (PCM-200).
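Since the actuators shorten as their internal pressure rises, indicating a direction amounts to pressurizing the selected channels and releasing the rest. The following sketch mocks this control step; the channel API and pressure values are assumptions for illustration, not the PCM-200's actual interface.

```python
class MockPressureControlUnit:
    """Stand-in for the pressure control unit; the real device's API differs."""
    def __init__(self, channels=4):
        # Channel pressures in kPa; 0 = fully released (actuator at rest length).
        self.pressure = {ch: 0.0 for ch in range(1, channels + 1)}

    def set_pressure(self, channel, kpa):
        self.pressure[channel] = kpa

def indicate(pcu, channels_to_shorten, kpa=100.0):
    """Shorten the given actuators by pressurizing them; release all others.

    The 100 kPa figure is an illustrative assumption, not a spec value.
    """
    for ch in pcu.pressure:
        pcu.set_pressure(ch, kpa if ch in channels_to_shorten else 0.0)

pcu = MockPressureControlUnit()
indicate(pcu, {2, 3})  # e.g. pull toward the robot (left) side, per Fig. 8(a)
```

Releasing the unused channels on every command keeps the array in a known state, so successive direction cues do not accumulate pressure.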
Our proposed mechanism is designed for both making notifications and showing affection. We designed two types of notification and three types of affection expressions, as shown in Table 2.
Notification Expressions: For the notification expressions, we designed expressions for indicating directions and for drawing the user’s attention. Although indicating a direction is not itself physical contact, direction indications are essential in outdoor support situations such as navigation; realizing them is thus the main purpose of the proposed mechanism. The basic idea for indicating directions was proposed in our earlier work [17].
Figure 8 shows examples of motion designs for indicating four directions. In these designs, two adjacent actuators are shortened simultaneously, generating a pulling sensation in the direction of the shortened actuators. For example, in Fig. 8(a), the system indicates the left direction (toward the robot) by shortening actuators 2 and 3. In addition, diagonal directions can be indicated by activating just one actuator.
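The direction-to-actuator mapping implied by these motion designs can be written as a small lookup table. Only the left case (actuators 2 and 3, from Fig. 8(a)) is stated in the text; the other cardinal pairs and the single-actuator diagonals below are illustrative assumptions for a circular arrangement 1-2-3-4 around the arm.

```python
# Per the paper, "left" (toward the robot) shortens actuators 2 and 3;
# the remaining assignments are assumed for illustration, placing
# actuators 2 and 3 on the robot side and 1 and 4 opposite them.
DIRECTION_TO_ACTUATORS = {
    "left":  (2, 3),   # stated in the paper (Fig. 8(a))
    "right": (4, 1),   # assumed opposite adjacent pair
    "up":    (1, 2),   # assumed adjacent pair
    "down":  (3, 4),   # assumed adjacent pair
    # Diagonals use a single actuator shared by two cardinal pairs.
    "up-left":    (2,),
    "down-left":  (3,),
    "up-right":   (1,),
    "down-right": (4,),
}

def actuators_for(direction):
    """Return the actuator numbers to shorten for a given direction."""
    try:
        return DIRECTION_TO_ACTUATORS[direction]
    except KeyError:
        raise ValueError(f"unknown direction: {direction}")
```

Note the internal consistency of the assumed layout: each diagonal's single actuator is exactly the one shared by its two neighboring cardinal pairs (e.g., actuator 2 belongs to both "left" and "up", so it alone signals "up-left").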
Unfortunately, we have not yet experimentally evaluated the effectiveness of these motion designs. However, several people have tried the mechanism and given us comments and feedback. Some recognized the direction indications, but others could not, which suggests the need for further improvement of actuation parameters such as strength and timing. Nevertheless, although the current implementation remains relatively unsophisticated, the fact that some users did recognize the indications suggests that the basic mechanism can achieve directional indication.
In addition, as physical contact for notification, we designed expressions for drawing the user’s attention. In these behaviors, the robot grabs the user’s arm with its left or right hand, implemented by shortening the actuator (2 or 3) located at that hand. We aim to realize expressions similar to the patting behaviors of the first prototype.
Affection Expressions: For the affection expressions, we designed embracing, stroking, and clinging expressions, as shown in Table 2. The details of these expressions are described in the literature [22].
5 Conclusion
In this paper, we first introduced two prototypes of our wearable message robots, which snuggle up to the user’s upper arm and transmit messages to users through physical contact. We expect our robot to reduce the anxiety of the elderly during outings and support their social participation.
We also proposed a tactile expression mechanism that can make physical contact and provide direction indications for our wearable robots. The proposed system consists of four pneumatic actuators and provides not only physical contact but also direction indications.
In future work, we will experimentally evaluate our system’s effectiveness and investigate detailed motion designs, such as different combinations of shortened actuators and actuation timing and strength. We will also investigate appropriate behaviors that combine direction indications with robot behaviors, as well as an integrated support system for seniors that consists of our method, a toilet map acquisition system [19], and a toilet timing suggestion system [15].
References
Amemiya, T., Sugiyama, H.: Haptic handheld wayfinder with pseudo-attraction force for pedestrians with visual impairments. In: ASSETS 2009, pp. 107–114 (2009)
Bark, K., Wheeler, J., Premakumar, S., Cutkosky, M.: Comparison of skin stretch and vibrotactile stimulation for feedback of proprioceptive information. In: Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, pp. 71–78 (2008)
Bonanni, L., Vaucelle, C., Lieberman, J., Zuckerman, O.: TapTap: a haptic wearable for asynchronous distributed touch therapy. In: CHI 2006 Extended Abstracts, pp. 580–585 (2006)
Cassinelli, A., Reynolds, C., Ishikawa, M.: Augmenting spatial awareness with haptic radar. In: International Symposium on Wearable Computers (ISWC 2006), pp. 61–64 (2006)
Fukumoto, M., Sugimura, T.: Active click: tactile feedback for touch panels. In: CHI 2001 Extended Abstracts, pp. 121–122 (2001)
Kajimoto, H.: Electrotactile display with real-time impedance feedback using pulse width modulation. IEEE Trans. Haptics 5(2), 184–188 (2012)
Kashiwabara, T., Osawa, H., Shinozawa, K., Imai, M.: TEROOS: a wearable avatar to enhance joint activities. In: CHI 2012, pp. 2001–2004 (2012)
Kojima, Y., Hashimoto, Y., Fukushima, S., Kajimoto, H.: Pull-navi: a novel tactile navigation interface by pulling the ears. In: ACM SIGGRAPH 2009 Emerging Technologies (2009)
Kozima, H.: Infanoid: a babybot that explores the social environment. In: Socially Intelligent Agents: Creating Relationships with Computers and Robots, pp. 157–164 (2002)
Minato, T., Sumioka, H., Nishio, S., Ishiguro, H.: Studying the influence of handheld robotic media on social communications. In: Social Robotic Telepresence in ROMAN 2012 Workshop, pp. 15–16 (2012)
Rowe, M.A., Feinglass, N.G., Wiss, M.E.: Persons with dementia who become lost in the community: a case study, current research, and recommendations. Mayo Clin. Proc. 79(11), 1417–1422 (2004)
Saadatian, E., Samani, H., Toudeshki, A., Nakatsu, R.: Technologically mediated intimate communication: an overview and future directions. In: Anacleto, J.C., Clua, E.W.G., Silva, F.S.C., Fels, S., Yang, H.S. (eds.) ICEC 2013. LNCS, vol. 8215, pp. 93–104. Springer, Heidelberg (2013). doi:10.1007/978-3-642-41106-9_11
Sekiguchi, D., Inami, M., Tachi, S.: RobotPHONE: RUI for interpersonal communication. In: CHI 2001 Extended Abstracts, pp. 277–278 (2001)
Tanaka, T., Satoh, Y., Kaneko, S., Suzuki, Y., Sakamoto, N., Seki, S.: Smart suit: soft power suit with semi-active assist mechanism-prototype for supporting waist and knee joint. In: ICCAS 2008, pp. 2002–2005 (2008)
Tsuji, A., Yonezawa, T., Yamazoe, H., Abe, S., Kuwahara, N., Morimoto, K.: Proposal and evaluation of the toilet timing suggestion method for the elderly. Int. J. Adv. Comput. Sci. Appl. 5(10), 140–145 (2014)
Wang, R., Quek, F., Tatar, D., Teh, J., Cheok, A.: Keep in touch: channel, expectation and experience. In: CHI 2012, pp. 139–148 (2012)
Yamazoe, H., Yonezawa, T.: Direction indication mechanism by tugging on user’s clothing for a wearable message robot. In: ICAT-EGVE 2015 (2015)
Yamazoe, H., Yonezawa, T.: Simplification of wearable message robot with physical contact for elderly’s outing support. In: Proceedings of the 2nd International Conference on Human-Agent Interaction (HAI 2014), pp. 35–38 (2014)
Yamazoe, H., Yonezawa, T., Abe, S.: Automatic acquisition of a toilet map using a wearable camera. In: Joint 7th International Conference on Soft Computing and Intelligent Systems and 15th International Symposium on Advanced Intelligent Systems (2014)
Yano, H., Yoshie, M., Iwata, H.: Development of a non-grounded haptic interface using the gyro effect. In: HAPTICS 2003, pp. 32–39 (2003)
Yonezawa, T., Yamazoe, H.: Wearable partner agent with anthropomorphic physical contact with awareness of clothing and posture. In: The 18th International Symposium on Wearable Computers (ISWC 2013), pp. 77–80 (2013)
Yonezawa, T., Yamazoe, H.: Haptic interaction design for physical contact between a wearable robot and the user. In: HCII 2017 (2017 to appear)
Yonezawa, T., Yamazoe, H., Abe, S.: Physical contact using haptic and gestural expressions for ubiquitous partner robot. In: IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2013), pp. 5680–5685 (2013)
Yonezawa, T., Yamazoe, H., Utsumi, A., Abe, S.: Gaze-communicative behavior of stuffed-toy robot with joint attention and eye contact based on ambient gaze-tracking. In: Proceedings of the ICMI 2007, pp. 140–145 (2007)
Yoshikawa, Y., Shinozawa, K., Ishiguro, H., Hagita, N., Miyamoto, T.: The effects of responsive eye movement and blinking behavior in a communication robot. In: Proceedings of the IROS 2006, pp. 4564–4569 (2006)
Acknowledgements
This research was supported in part by JSPS KAKENHI 15H01698 and 25730114.
© 2017 Springer International Publishing AG
Yamazoe, H., Yonezawa, T. (2017). A Tactile Expression Mechanism Using Pneumatic Actuator Array for Notification from Wearable Robots. In: Duffy, V. (eds) Digital Human Modeling. Applications in Health, Safety, Ergonomics, and Risk Management: Ergonomics and Design. DHM 2017. Lecture Notes in Computer Science(), vol 10286. Springer, Cham. https://doi.org/10.1007/978-3-319-58463-8_39
Print ISBN: 978-3-319-58462-1
Online ISBN: 978-3-319-58463-8