1 Introduction

Mobile technologies such as handsets and wearables with cloud-connected apps connect people on an unprecedented scale. Chat environments, social networks, and real-time location tracking let people communicate an ever-increasing volume of status updates and messages. But for all their benefits, current paradigms of digital communication lack the intuitive, natural feel of in-person interaction: many of the social cues we use to communicate face to face, including vocal tone, environmental context, body language, and touch, are missing. This paper presents a prototype, Remote Touch, that incorporates advanced vibrotactile feedback into a common social interaction, with the goal of enabling more natural interpersonal communication by addressing some of these shortcomings. Remote Touch creates the illusion of being touched and “pulled” by another person in your network. Haptic feedback, visual feedback, and gesture input are combined in a multimodal design to create a compelling illusion of remote embodied presence through a mobile device.

The purpose of this prototype is to contribute to knowledge of the design capabilities of haptic interfaces in software for social experiences. Location-based technology is rapidly emerging and widely accessible. As of 2014, 317 million people had wireless data subscriptions (CIA 2014), which can serve as a proxy for access to location sensing. The availability of data from these devices and improving ease of use for developers have allowed for the creation of playful social interactions that are more akin to a natural experience, such as a “poke”, emoji, Bitmoji-style avatars, animated stickers, and GIFs. These are creative, valuable ways of replacing missing information that would otherwise be present in a face-to-face interaction. However, as technology progresses, these “workarounds” will become unnecessary, as the authenticity and information-rich qualities of interpersonal interaction will finally be transmissible through digital networks. The line between digital and physical interactions in the real world is already blurring, and as this trend continues, people will have a more social, life-like experience when they communicate with others using digital tools. This project aims to contribute an example of playful social technology, with the hope that it inspires others to create new designs utilizing multimodal design and simple mechanics to help people connect with one another.

2 Background

Haptics are not new, but are often overlooked when interactive systems are designed. Even if the designers have not thought of haptics at all in the design process, if a system is interactive, it’s haptic – the question is only how much so, and whether the haptic experience is a good one. For an extreme example, take voice-driven interfaces. One could argue that haptics are not necessary in such an interface. We would argue instead that haptics have been intentionally excluded from the design, and that this decision creates both design opportunities (the ability to control it regardless of the body state of the user) and limitations (the lack of ability to feel the system’s responses when tactile sensations would otherwise be the appropriate result of a query).

Several factors contribute to haptics being the “forgotten modality”. Haptic design tools and rendering engines are less mature than those for visual and audio feedback. Audio and video streams can very closely simulate the real-world sensations of their content. The sensation of seeing a picture of an apple is very similar to seeing an apple in front of you – at least, much more similar than the tactile sensation of a vibration motor imitating sandpaper and the feel of real sandpaper.

For this reason, haptic designers are sometimes asked, “when can haptics do more than vibration?” The answer is, “when you combine haptic vibration intelligently with other modalities.” Even the relatively crude vibration displays found in most of today’s mobile devices are severely underutilized. The potential to create useful and elegant tactile experiences is already here – it only requires an understanding of how haptics can be combined with visuals, audio, and gesture to tap into people’s preexisting understanding of embodied communication. In other words, it requires good haptic design.

This is already well known in games. While game development usually prioritizes visual rendering above other forms of stimulation (MacLean 2008), rumble feedback is an expected feature of the console experience. Many AAA titles incorporate haptic technology in their controllers, and while indie developers are following suit, the haptic design in many of these titles remains unpolished.

Haptics have also been successfully integrated into widely used technology beyond games and entertainment, from smartphone keyboards to virtual reality devices. Most phone keyboards provide tactile feedback when a key is successfully touched, since the physical “click” of a key isn’t feasible on a glass screen; many people don’t notice it because, given how much we use these devices, it simply feels natural. Virtual reality devices are adding Immersion’s TouchSense Force to provide haptics in their controllers, and this platform offers much more than gaming. VR devices are widely used by companies to provide immersive training to employees, from welding (Porter et al. 2006) to operating rooms (Seymour et al. 2002). Adding haptics to these devices lets users feel more like they are typing, welding, or operating, simulating a more life-like experience. Despite the adoption of haptics in these popular technologies, social media still lags in integrating it into interactions and interfaces to provide more immersive social encounters.

Most emerging technologies are eventually applied to social interaction – one such example is the transition from Web 1.0 to Web 2.0 (Weinschenk 2009). Sociological research indicates there are various reasons humans socialize, including improving cognitive function, producing feelings of happiness, and reducing stress (Billings and Moos 1981). Apps like Tinder, Facebook, LinkedIn, and multiplayer games are popular because they help people interact with particular social groups. However, if subtle body language cues, gestures, and tactile interactions between people could be included in these experiences, they would likely become even more intuitive. The act of feeling something allows for an interaction that elicits emotions and mental states that are sometimes hard to define otherwise or through other senses (El Saddik 2007).

Today’s most common use case for haptics on phones, smartwatches, and game controllers is notifying a user that new information is available on the device. However, the vibration itself almost never conveys meaningful information (MacLean 2008). When haptics do convey more meaning, it’s often in the form of patterns that people must memorize in order to understand, such as Google’s vibration patterns for turn-by-turn walking directions in Google Maps (Kobayashi and Nakamura 2016), or patterns that differentiate a text notification from a Facebook notification by the intensity or repetition of the vibration. We propose a new paradigm in which haptic design on mobile devices follows in the footsteps of haptics for games, where haptics is used to make an interaction more convincing and realistic – but instead of conveying action as it does in games, in Remote Touch it conveys the gesture of another person.

The core gesture of Remote Touch is the common “beckoning” gesture, where a finger is curled in toward the hand, indicating the direction of desired movement (McNeill 1992). While the gesture for “come here” varies significantly between cultures, the North American version is particularly amenable to touchscreen interaction, since it can be approximated with a single finger “flicking” a short distance across the touchscreen. When in proximity to each other, people might use a beckoning gesture to get someone’s attention and request they come closer; depending on the nuance of the gesture, it can also communicate that the other party is accepted or wanted on an emotional level, or that the request is urgent, reluctant, and so on. The haptic gesture in Remote Touch can also take on other meanings, such as a simple, friendly touch, akin to squeezing someone’s hand or poking them. The speed of the gesture, as well as the social context, contributes rich social information about the intention of the sender.
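A single-finger flick of this kind can be recognized by thresholding the displacement and speed of a touch stroke. The following sketch is our own illustration of that idea, not code from the prototype; the function name and threshold values are assumptions chosen for clarity:

```python
# Illustrative flick detector: classifies a touch stroke as a "beckon"
# flick when it is short in time and fast in screen space. The
# thresholds are hypothetical, not taken from the Remote Touch code.
import math

def is_flick(points, min_speed=800.0, max_duration=0.3):
    """points: list of (x_px, y_px, t_seconds) touch samples in order.

    Returns True when the stroke finishes within max_duration seconds
    and its average speed meets min_speed pixels per second.
    """
    if len(points) < 2:
        return False
    (x0, y0, t0) = points[0]
    (x1, y1, t1) = points[-1]
    duration = t1 - t0
    if duration <= 0 or duration > max_duration:
        return False
    distance = math.hypot(x1 - x0, y1 - y0)
    return distance / duration >= min_speed
```

A real implementation would also check that the stroke is roughly straight, so that slow drags of the ring interface are not misread as flicks.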

Combining a mobile device, haptic effect design using Immersion’s TouchSense SDK, location information, and interface animations, our prototype creates a simple experience that connects people through non-verbal information to deliver a meaningful gesture and playful interaction. Instead of sending a text or emoji, this application allows for interactions as casual but socially rich as a wave or a high-five between users far away from each other. These types of applications will become more important as technology progresses and more families, coworkers, customers, clients, and loved ones are remote from one another and desire the feeling of true social connection.

Needing a line of communication to someone who is remote, whether a stranger or a close friend, is all too common. Consider two scenarios where a better social connection would benefit the interaction: searching a busy street corner for an ordered taxi cab, or interpreting vague instructions in a remote team member’s email. Despite their differences, both situations require more information to reach an end goal efficiently, and in-person feedback such as gestures or demonstrations, in addition to speech, would help both users complete the interaction. In both scenarios a phone call could be placed to clarify location or instructions, but this mode affords only verbal feedback. Seeing the other person and their body language or location, whether through technology or in person, would enable a more effective experience.

3 Social Technology

Socializing is a common use of location-based experiences and mobile devices, as shown by the wide array of games and applications available to consumers such as StreetPass, Facebook, Tinder, and Yik Yak. These applications all combine location with social interactions ranging from virtual gestures to text-based speech. Each allows users to interact when they become co-located, affording interactions that are inherently interpersonal but strip away the human elements. They all include some form of social connection in the form of virtual gestures or a metaphor for verbal or non-verbal communication. All have been developed to connect you with the people around you and to inspire and incentivize communication in some form or another, but all lack one of the most important forms of feedback in human interaction – touch.

Multi-user games and applications are inherently social, but lack the synchronous interaction that in-person socializing allows. Many social applications are used in tandem rather than in parallel with other users (Consalvo 2011). This is prevalent in “casual” and “social” games, where the social aspect is abstracted to the point where the interaction is no longer social. Many games incorporate leaderboards or invitations into their gameplay: the player passively sees that other people are playing, competes with other players’ scores, or invites friends on social media to challenge their high score. At no point do players interact with each other in gameplay, only in the game’s interfaces.

Non-game applications work the same way: the mechanics in many of them amount to passive multiplayer communication. StreetPass lets you “collect” passers-by in your area; Facebook lets you comment on your friends’ posts and send virtual “pokes”, “likes”, or “waves” which they will read later; Tinder lets you collect matches you may interact with later; and Yik Yak allows conversations between people in your vicinity which you can later comment on. The in-person equivalent would be reading a message chalked on a sidewalk after its writer had left. Although this passive multiplayer mechanic is convenient for mobile users who may need to interact at a later time, both the real-time social interaction and the wider context are lost in the technology itself. Designing games and applications this way erects a barrier to communication and dehumanizes our social interactions with technology. Real-time conversations are nominally turn-based, but during a conversation we pick up on facial feedback and body language to determine how the conversation will unfold. Much of this information is lost in translation to words on a device, and many intentions would be more easily communicated if text carried more social context. Our goal is to encourage less asynchronous spectating in applications and more real-time gestural interaction.

4 Remote Touch: A Prototype Combining Gesture, Haptics, and Location to Create the Illusion of Social Touch at a Distance

Remote Touch is a networked mobile application that lets two people interact through gesture, haptics, and location. The interaction serves as a metaphor for a common social interaction in which one user “beckons” the other in their direction. The application alludes to two users being tied together physically by a cord which can be tugged to grab the other user’s attention. It is meant to be used on a wearable device or smartphone, and within a larger framework as a supplement to that framework’s social interactions.

4.1 Purpose

In developing an application that incorporates haptics and gesture, we can prototype the translation of a social interaction that relies heavily on social cues – touch, direction, and gesture.

The purpose of the application is to provide a non-verbal social interaction that uses simple mechanics to emulate a meaningful gesture of directed attention. Many social applications today do not provide effective forms of communication beyond superficial notifications and text (Chan et al. 2008); by using abstract haptic effects, we hope to produce meaningful communication extending beyond these modalities. Figure 1 illustrates how the application would be used with a smart watch.

Fig. 1. Smart watch interaction in Remote Touch.

4.2 Design

Remote Touch is a remote experience between two users on mobile devices. It uses each user’s location (latitude and longitude) to provide a compass pointing toward the other user. The core mechanic of the application is a tugging gesture on the user’s interface that notifies the recipient of a gesture meant to get their attention, let them know they’re being thought of, or serve another social purpose. The interface, a ring seemingly attached by a cord to something off-screen, can be pulled away from the off-screen item and dragged around. This establishes the metaphor that a physical substrate connects the two devices, and that what happens on one device can be felt on the other because both are part of the same physical structure. Figure 2 illustrates the flow of a user’s experience in Remote Touch.

Fig. 2. Experience flow between remote users.

Once two users connect to the application, each user’s GPS coordinates (latitude and longitude) are sent to the other. This allows the interface to point in the direction of the remote user by calculating the difference between the two sets of coordinates and then the angle of that difference. As User A tugs on their interface, the tugging action is reciprocated on User B’s device in the form of an inverse tugging action toward User A’s position. Along with the interface notification, haptic feedback makes the user aware that a remote user is “beckoning” them in their general direction. Together, these two modes of feedback provide the illusion that a remote user is “tugging” you toward them, and the directionality of the compass gives context to the action, conveying where it is coming from.
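The compass direction described above can be computed with the standard forward-azimuth formula for an initial bearing between two coordinates. The sketch below is our own illustration of that calculation; the function and variable names are assumptions, not identifiers from the prototype:

```python
# Illustrative compass bearing from one user's coordinates to the
# other's, using the standard initial-bearing (forward azimuth)
# formula. Names are our own; this is not the prototype's code.
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing in degrees (0 = north, 90 = east) from
    point 1 (lat1, lon1) to point 2 (lat2, lon2), both in degrees."""
    phi1 = math.radians(lat1)
    phi2 = math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360.0
```

On User B’s device, this bearing (relative to the device’s own compass heading) would set the on-screen direction of the tug toward User A.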

Fig. 3. Screenshot of the interface of Remote Touch.

The current application has been developed for the Android operating system, utilizing the Unity3D game engine and TouchSense SDK for Unity.

5 Remote Touch: Future Development

Remote Touch is effective as a prototype providing a playful interaction. In the future, such an interaction could be used to enhance social networks, chat apps, and location-based services. Users would be able to select a contact from their device and interact with that remote person to grab their attention. Adding a photo removes the feeling of anonymity and puts a face on a remote, machine-mediated interaction. Simply showing a photo of the person on the other end lends the machine a user’s presence, further humanizing the experience (Consalvo 2011). Figure 4 illustrates future directions for including contacts or social media connections within the application.

Fig. 4. Future contacts functionality for notifying users in Remote Touch.

Currently the application uses GPS to determine the locations of the two remote parties and display the direction in which the remote party is located. This works best when users are far apart, with well-separated latitudes and longitudes; short-range scenarios are less precise, since GPS on mobile devices provides accuracy of only 5–8 m (Zandbergen and Barbeau 2011). To connect people inside buildings and at short range, our goal is to integrate Bluetooth Low Energy (BLE) iBeacon positioning. iBeacons allow the application to triangulate a user’s position indoors or outdoors with positional accuracy of 1 to 4 m (Estimote 2015).
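As a sketch of how beacon-based positioning could work, a device’s 2D position can be estimated from distance readings to three beacons at known locations by trilateration. The beacon layout, names, and exact method below are illustrative assumptions, not details of the planned implementation:

```python
# Illustrative 2D trilateration from three beacon distance estimates.
# Subtracting the circle equations pairwise gives a 2x2 linear system
# in (x, y). Beacon positions and names are hypothetical.
def trilaterate(b1, b2, b3):
    """Each argument is ((x_m, y_m), distance_m) for one beacon.
    Returns the estimated (x, y) position in meters."""
    (x1, y1), r1 = b1
    (x2, y2), r2 = b2
    (x3, y3), r3 = b3
    # Linearized system: subtract circle 2 from 1, and circle 3 from 2.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x2), 2 * (y3 - y2)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a11 * a22 - a12 * a21  # zero if the beacons are collinear
    x = (c1 * a22 - c2 * a12) / det
    y = (a11 * c2 - a21 * c1) / det
    return x, y
```

In practice, BLE distance estimates derived from signal strength are noisy, so a deployed system would combine more than three beacons and smooth the result over time.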

6 Conclusion

Many conversations are not conversations at all, but a grouping of gestures and visual feedback. Snapchat is a popular application that has recognized this: it lets users send photos of their face, short quips, and emojis, almost eliminating the need for text when messaging a friend. Such a message carries a lot of context, such as mood, facial emotion, and even location. But haptics are also integral to social interaction, and Remote Touch contributes something that many popular social media applications lack, while providing a playful interaction for a common gesture. The application is well suited as a supplement to a larger social network, whether as simple as a contact list on a user’s device or a full social media network. For an even more compelling experience, further social features such as mood, avatars, or emojis are needed to enhance a user’s presence.

7 Discussion

Social media is part of many adults’ daily lives: 65% of American adults use at least one social networking site (Pew Research Center 2015). Many day-to-day tasks are being replaced by applications with some social aspect, such as reviews, forums, or location. Ordering a taxi has been replaced by Uber, deciding where to eat is supplemented by Yelp, and many people have replaced shopping trips with Amazon purchases based on other people’s reviews. Of these, Uber would benefit most from a short-range, location-based interaction so a rider can more easily find their driver. But all of these applications would benefit from more context in a social interaction beyond the text on the screen, and would additionally become more playful and fun to use. People’s whole lives are on their devices, from banking to dating, providing further incentive to create playful, lifelike interactions that mimic real life.

There is a variety of interactions that could be designed into our social networks and computer interactions, from pokes and shoves to pulls, tugs, brushes, taps, and rubs. The translation of these from human-to-human to human-computer interaction should be at the forefront of experience design, instead of simply designing a text notification and layering a haptic effect on top of it. Mobile devices afford designers a lot of information about a user – location, avatars, likes, mood, or current activities. Meaningful social design incorporates this information in tandem with playful tactile feedback, animations, sounds, or mechanics, and can bring a person to life through the screen.