
1 Introduction

Research on VR environments has been carried out over the past decade in many different areas and scenarios. Some solutions already combine bicycles with Head-Mounted Displays (HMDs), in which virtual scenarios are visualized as a backdrop for athlete training or for cardiac rehabilitation systems [1]. Regarding the perception of the user's own body, however, few solutions are under development, and most focus only on the perception of the user's hands [2, 3]. Assessments of the interactivity and appropriation of the user's real body together with the bicycle remain rare or nonexistent.

In this paper we present a haptic interface based on a real bicycle, using an HMD as a Mixed Reality (MR) display and an Arduino controller board to capture the user's interaction with the bicycle. The bicycle can be regarded as a "proxy object" [4, 5]: the rotation of the real handlebars is reproduced within the virtual environment, bringing the simulation closer to reality and improving the correspondence between the user's hands and the virtual controls. We also introduce the display of the user's real hands, forearms and legs, captured by the HMD camera and shown directly to the user.

To measure and evaluate the MR interface, two experiments were conducted: one simulating a virtual reality (VR) mode and another corresponding to an MR mode, in which the user's real body and the front part of the real bicycle are visible inside the virtual environment. Seven users, all gamers, participated; some of them do not usually ride a bicycle. Nevertheless, six of the seven participants reported that immersion and the feeling of presence were greater in the MR mode, describing a better experience due to the improved correspondence of movements (speed, pedaling and handlebar rotation).

Section 2 of this paper discusses the related work that influenced and led to the development of this experiment. Section 3 presents the methodology used to develop the interactive bicycle and the virtual environment. Section 4 describes the design technique chosen to guide the experiment and the results obtained. Finally, Sect. 5 presents the conclusions drawn from these results.

2 Related Works

2.1 Presence in Mixed and Virtual Reality

The ability to move between virtuality and the real world is one of the main characteristics of Mixed Reality [6]. Current uses of MR are highly varied, ranging from tools for the development of collaborative projects [5, 7] that allow teams to interact through telepresence, to educational projects [8,9,10], medical applications [11], games [12] and "pervasive games" [13, 14].

A new MR paradigm that has emerged recently and is still not well understood by industry or academia is "pervasive virtuality" [4, 15,16,17]: an MR environment constructed and enriched with real-world information sources, producing an extremely intense and immersive experience. To make the experience immersive, the environment is generated through sensorial elements (usually visual, auditory and tactile stimuli) and continuous tracking of the surroundings; it is crucial to track and switch between the real and virtual worlds while maintaining their correct alignment during interaction. [5, 18] present several variables that influence the feeling of presence, such as the user's head rotation, path curvature, scaling of translational movements, and scaling of objects (and/or of the entire environment). Some works address the situations experienced by the user and the positioning of objects in connection with the context, and interaction through the perception or manipulation of the user's own hands [4, 19,20,21], but efforts to measure presence in virtual environments are still rare. [22] defines presence as a normal awareness phenomenon that requires directed attention and is based on the interaction between sensory stimulation, environmental factors that encourage involvement and enable immersion, and internal tendencies to become involved. Few works, however, have carried these questions into an evaluation conducted inside the virtual environment itself [23] in order to mitigate the effects of interruption when measuring presence; most continue to measure presence in the conventional way, through presence questionnaires or interviews after the virtual experience [24].

2.2 Procedural City Generation

Procedural city generation can be achieved by multiple means; the basic idea is to apply procedural content generation (PCG) algorithms [25] to generate a whole city at the push of a button [26]. For example, L-systems have been used successfully to generate realistic cities with complex street networks and a very high number of buildings [27]. Furthermore, shape grammars have been used to generate realistic buildings [28], as in our procedural city scenario. A newer development in PCG is Maxim Gumin's WaveFunctionCollapse (WFC) algorithm, first published in 2016 [29, 30], which quickly became popular online, with many developers running experiments and reimplementing the source code for other environments such as the Unity game engine. It has already been applied in the wild to scenarios similar to ours [31]. Infinitown [32] is a procedurally generated city related to our idea of an endless city, but our version does not rely on a finite grid of random city blocks; instead it uses WFC to achieve variation when generating city blocks. The present work uses methods of procedural city generation to create endless environments.

3 Methodology

3.1 Implementation

The bicycle apparatus consists of a standard commercial bicycle, a stand to hold the bicycle stable, an Arduino board connected to the sensors, a Hall sensor with magnets attached to the rear rim to measure the RPM of the rear wheel, and a 10 kΩ potentiometer connected to the handlebars through 3D-printed gears to capture the steering angle (see Fig. 1).

Fig. 1. Bicycle system with the sensors. A: Hall sensor and magnets. B: Arduino board attached to a case. C: Potentiometer and 3D-printed gears.
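To make the mapping from raw sensor readings to riding parameters concrete, the following minimal sketch converts the Hall-sensor pulse interval to wheel RPM and linear speed, and the potentiometer reading to a handlebar angle. The constants (number of magnets, wheel circumference, ADC range, gear ratio) are illustrative assumptions rather than the exact values used in the apparatus; the code is Unity C#, matching the application side described later in this section.

```csharp
// Illustrative conversion of raw sensor readings to riding parameters.
// All constants are assumptions made for the sake of this example.
public static class BikeSensorMath
{
    const int MagnetsPerWheel = 4;          // magnets attached to the rear rim
    const float WheelCircumferenceM = 2.1f; // meters per wheel revolution
    const float AdcMax = 1023f;             // 10-bit Arduino ADC
    const float PotRangeDeg = 270f;         // electrical travel of the potentiometer
    const float GearRatio = 2f;             // potentiometer degrees per handlebar degree

    // Pulse interval (seconds between two magnet passes) -> wheel RPM.
    public static float RpmFromPulseInterval(float secondsBetweenPulses)
    {
        float revolutionsPerSecond = 1f / (secondsBetweenPulses * MagnetsPerWheel);
        return revolutionsPerSecond * 60f;
    }

    // Wheel RPM -> linear speed in km/h.
    public static float SpeedKmh(float rpm)
    {
        return rpm * WheelCircumferenceM * 60f / 1000f;
    }

    // Raw ADC value -> handlebar angle in degrees, centered at 0.
    public static float SteeringAngleDeg(int adcValue)
    {
        float potDeg = (adcValue / AdcMax - 0.5f) * PotRangeDeg;
        return potDeg / GearRatio;
    }
}
```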

The MR build is composed of an HTC Vive HMD, the HTC Vive front camera, a Leap Motion sensor mounted on the front of the HMD using a 3D-printed frame, and a chroma key background. The project comprises two experiments: one uses the Leap Motion to track the position of the user's hands, which interact with a virtual bicycle while the real hands touch the real bicycle serving as a haptic interface; the other uses the HMD camera and chroma keying to create the MR environment, in which the user sees their own body and the real bicycle.

The chroma key effect was achieved using a custom chroma key shader working together with a stencil shader. This combination allowed the chroma key background to be placed only in front of the bicycle in the room: when the participant looks toward the handlebars to see their hands or the bicycle, the chroma key shader takes effect and removes the green background; when the participant looks in other directions (e.g., up or to the sides), the stencil shader erases the camera image, allowing the real bicycle and hands to be overlaid on the virtual environment without an extensive chroma key background. A scheme with the main controllers can be seen in Fig. 2. The software was developed in Unity3D (2017.1.0f3) using the Leap Motion Orion (3.2.1) and SteamVR (1.2.3) plugins, with the Arduino sensor data used as input to the application. The experiment ran on a Windows PC with an Intel i7 at 3.10 GHz, 8 GB of RAM and an NVIDIA GTX 1050.

Fig. 2. Scheme showing the controllers responsible for each key element of the experiment.
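On the application side, the Arduino data can be received over a serial connection. The sketch below is a minimal example and not the project's actual code: it assumes the board sends one text line per update in the form "rpm,angle", and the port name and baud rate are placeholders.

```csharp
using System;
using System.IO.Ports;   // requires the full .NET API compatibility level in Unity
using UnityEngine;

// Reads comma-separated "rpm,angle" lines sent by the Arduino and
// exposes the values to the rest of the application.
public class ArduinoInput : MonoBehaviour
{
    public string portName = "COM3";   // assumed port
    public int baudRate = 9600;        // assumed baud rate

    public float WheelRpm { get; private set; }
    public float SteeringAngle { get; private set; }

    SerialPort port;

    void Start()
    {
        port = new SerialPort(portName, baudRate);
        port.ReadTimeout = 50;         // avoid blocking the main thread for long
        port.Open();
    }

    void Update()
    {
        try
        {
            string line = port.ReadLine();               // e.g. "123.4,-15.0"
            string[] parts = line.Split(',');
            if (parts.Length == 2)
            {
                WheelRpm = float.Parse(parts[0], System.Globalization.CultureInfo.InvariantCulture);
                SteeringAngle = float.Parse(parts[1], System.Globalization.CultureInfo.InvariantCulture);
            }
        }
        catch (TimeoutException) { /* no new data this frame */ }
    }

    void OnDestroy()
    {
        if (port != null && port.IsOpen) port.Close();
    }
}
```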

3.2 Scenario

As a background scenario, we use a procedural virtual city [26], which we call "Endless City". We use Maxim Gumin's WaveFunctionCollapse (WFC) algorithm [30], an example-driven image generation algorithm (applying it to 3D Unity game objects rather than colored pixels, through Joseph Parker's WFC version [34]), to bootstrap the procedural generation and achieve variation by generating city blocks and streets on a 20 by 20 grid, which we call a city superblock [33]. Each cell of the grid is occupied by either a building or a street and has a size of 30 Unity units (meters). Each building has a sidewalk around it to make the city more plausible. The city is procedurally generated on the fly and continues quasi-endlessly in all directions, including the ground.
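A minimal sketch of how such a superblock could be assembled in Unity is shown below. It assumes a WFC solver has already labeled each of the 20 by 20 cells as street or building; the solver delegate and the prefab fields are placeholders, not the API of the WFC implementation actually used [34].

```csharp
using UnityEngine;

public enum CellType { Street, Building }

public class SuperblockBuilder : MonoBehaviour
{
    public const int GridSize = 20;     // cells per superblock side
    public const float CellSize = 30f;  // Unity units (meters) per cell

    public GameObject streetPrefab;     // placeholder prefabs
    public GameObject buildingPrefab;

    // Instantiates one superblock whose south-west corner is at 'origin'.
    // 'solve' stands in for the WFC solver that labels each cell.
    public void Build(Vector3 origin, System.Func<int, int, CellType> solve)
    {
        for (int x = 0; x < GridSize; x++)
        {
            for (int z = 0; z < GridSize; z++)
            {
                Vector3 pos = origin + new Vector3(x * CellSize, 0f, z * CellSize);
                GameObject prefab = solve(x, z) == CellType.Street ? streetPrefab : buildingPrefab;
                Instantiate(prefab, pos, Quaternion.identity, transform);
            }
        }
    }
}
```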

The seeds of the random number generator are stored for already visited areas (city superblocks) and restored if the user visits the area again. Only a three by three grid of city superblocks is procedurally generated and held in memory at any time. The bicycle is always located in the center superblock; if, for example, it moves into the superblock to the north, three new superblocks are generated (to the north of the existing ones) and the redundant three superblocks (to the south) are removed from memory. The same applies to the other compass directions. A street is always present in the space between superblocks so that they can be connected easily (Fig. 3).

Fig. 3. Endless City seen from above, consisting of a three by three grid of superblocks. The bicycle, represented by the red dot, is always located in the center superblock, and the city extends quasi-infinitely in all directions. (Color figure online)
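The streaming and seed-caching behaviour described above can be sketched as follows. The superblock coordinates, the GenerateSuperblock placeholder and the use of Unity's random number generator are assumptions of this example, not the project's actual implementation.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Keeps a three by three window of superblocks loaded around the bicycle
// and remembers the random seed of every visited superblock, so that a
// revisited area is regenerated identically.
public class SuperblockStreamer : MonoBehaviour
{
    public Transform bicycle;                      // rider position in world space
    const float SuperblockSize = 20 * 30f;         // 20 cells of 30 Unity units

    readonly Dictionary<Vector2Int, int> seeds = new Dictionary<Vector2Int, int>();
    readonly Dictionary<Vector2Int, GameObject> loaded = new Dictionary<Vector2Int, GameObject>();
    Vector2Int center;

    void Update()
    {
        Vector2Int current = new Vector2Int(
            Mathf.FloorToInt(bicycle.position.x / SuperblockSize),
            Mathf.FloorToInt(bicycle.position.z / SuperblockSize));
        if (current != center || loaded.Count == 0)
        {
            center = current;
            RefreshWindow();
        }
    }

    void RefreshWindow()
    {
        // Superblocks that should currently be in memory.
        HashSet<Vector2Int> wanted = new HashSet<Vector2Int>();
        for (int dx = -1; dx <= 1; dx++)
            for (int dz = -1; dz <= 1; dz++)
                wanted.Add(center + new Vector2Int(dx, dz));

        // Unload superblocks that fell out of the three by three window.
        List<Vector2Int> obsolete = new List<Vector2Int>();
        foreach (KeyValuePair<Vector2Int, GameObject> entry in loaded)
            if (!wanted.Contains(entry.Key)) obsolete.Add(entry.Key);
        foreach (Vector2Int key in obsolete) { Destroy(loaded[key]); loaded.Remove(key); }

        // Generate missing superblocks, reusing stored seeds for revisited areas.
        foreach (Vector2Int key in wanted)
        {
            if (loaded.ContainsKey(key)) continue;
            int seed;
            if (!seeds.TryGetValue(key, out seed))
            {
                seed = Random.Range(int.MinValue, int.MaxValue);
                seeds[key] = seed;
            }
            loaded[key] = GenerateSuperblock(key, seed);   // placeholder generation call
        }
    }

    GameObject GenerateSuperblock(Vector2Int coord, int seed)
    {
        // Placeholder: the project would seed and run WFC here to fill the superblock.
        GameObject block = new GameObject("Superblock " + coord);
        block.transform.position = new Vector3(coord.x * SuperblockSize, 0f, coord.y * SuperblockSize);
        return block;
    }
}
```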

Each building texture is generated using a small software library (TinyCgaShape), written by one of the authors of this paper in Unity C# and based on CGA shape, a grammar for the procedural modeling of computer graphics architecture [28, 35]. Although the buildings are not yet photorealistic, we plan to add more realistic building textures as future work, either using Unity assets or procedural methods (e.g., Perlin noise [36] to create a tiling brick texture). We aimed to create small Unity assets, which PCG clearly makes possible: TinyCgaShape is just 5 kilobytes in size, the code for generating a building including sidewalks is just 17 kilobytes, and the WFC generation code is just 24 kilobytes. The buildings are made of cubes that vary in height to resemble skyscrapers, and the building textures are fully parametric (number of floors, floor height, window width and height, and building color).
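As an illustration of what "fully parametric" means here, the sketch below generates a simple facade texture from a handful of parameters (floors, window size, colors). It is a toy stand-in written for this description, not an excerpt of TinyCgaShape.

```csharp
using UnityEngine;

// Generates a simple parametric facade texture: a colored wall with a
// regular grid of darker windows. The parameters mirror the ones named
// in the text (floors, window size, color).
public static class FacadeTexture
{
    public static Texture2D Create(int floors, int windowsPerFloor,
                                   int floorHeightPx, int windowWidthPx, int windowHeightPx,
                                   Color wallColor, Color windowColor)
    {
        int width = windowsPerFloor * windowWidthPx * 2;   // one window plus spacing per column
        int height = floors * floorHeightPx;
        Texture2D tex = new Texture2D(width, height, TextureFormat.RGBA32, false);

        for (int y = 0; y < height; y++)
        {
            for (int x = 0; x < width; x++)
            {
                // Windows occupy the lower part of each floor and every other column slot.
                bool inWindowRow = (y % floorHeightPx) < windowHeightPx;
                bool inWindowCol = (x % (windowWidthPx * 2)) < windowWidthPx;
                tex.SetPixel(x, y, inWindowRow && inWindowCol ? windowColor : wallColor);
            }
        }
        tex.Apply();
        return tex;
    }
}
```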

3.3 Participants, Stimuli, Tasks and Measures

Seven gamers were invited to participate in the experiments. The sample was drawn from students of our university, all male, between 22 and 38 years old; all of them performed both experiments. After signing the consent, image and voice release forms, each participant was asked to mount the real bicycle and received an explanation of the devices and the procedure of the study. After putting on the HMD and becoming familiar with the virtual handlebars, the participant started the first experiment. In the first task, the participant was asked to ride through a circuit of cones inside the "Endless City". In the second task, the participant only needed to pedal freely to any part of the city, like a sightseeing tour. In the second experiment, the participant was asked to repeat both tasks (see Fig. 4).

Fig. 4. User's body, handlebars and scenario in the two experiments. A: 3D avatar mode; B: real user's body in the virtual environment.

Using the think-aloud method, the users were encouraged to describe their feelings about immersion and the controls of the system, answering questions about focus, immersion and interaction that could later be analysed to measure the degree of presence (see Table 1). All experiments were recorded on video and in photographs inside the university laboratory.

Table 1. Questions asked by the facilitators to the participants.

4 Results and Data Analyses

The first step was to watch the video recordings of all feedback and take notes. Even in a still exploratory way, as a "simulation stage" [4], the physical environment (city) equipped with this infrastructure allowed the creation of a mixed reality environment, and with it it was possible to analyse the degree of presence through the factors of control, focus, immersion and involvement (see Table 2).

Table 2. Participants' answers grouped by experiment and presence factor.

All users were feeling well before performing the tests and were able to complete the tasks without any immediate physical or dexterity difficulties, even though five of the seven participants did not ride a bicycle frequently. It was noticed, however, that the two participants who were used to riding bicycles had balance problems in the VR experiment, while they reported no such problems with the proposed MR solution.

All seven participants felt immersed in the first experiment (VR), but one mentioned that the virtual hands did not look like his real hands. Three of the seven participants also perceived a difference between the actual handlebar angle and the virtual handlebar angle. However, none of the participants had difficulty with the sensors or with the changes made to the real bicycle for the experiments.

It is known by now that rendering the user's body in VR increases presence and, consequently, immersion and involvement, enabling the illusion of "being there" in various situations, whether in games, sports training, medical rehabilitation, or in simulations and exploration of environments. What this study demonstrates is that not only a rendered body, but real images of the user's body together with the "proxy object", can carry the feeling of immersion into a better experience: six participants felt more comfortable and immersed in the MR experiment. One factor that may have influenced this is that no direct tracking was applied to the user's legs. In VR the leg position was estimated from the speed captured at the wheel, whereas in MR the users could see their real legs without any programming or tracking, which increased the feeling of body presence in MR.

Regarding realism, all participants found the first experiment (VR) convincing, although two mentioned changes that could improve immersion in the procedural scenario, such as applying a texture to the ground and reducing the size of buildings and sidewalks. In any case, in both experiments all participants felt so immersed that they lost track of time within VR and MR during the whole test, which took about 10 to 15 minutes in total.

5 Conclusions

Even with only seven participants, this study demonstrates that not only rendering a body, but seeing one's own body and the "proxy object" in a mixed reality haptic interface, can carry the feeling of immersion into a better experience. Displaying the user's real body in MR supported the illusion of "being there". All participants found the first experiment (VR) convincing, but it was clear from the users' answers and reactions that the MR experiment produced a greater degree of presence. The users' comparison of the experiments showed that being able to see their real body in the MR experiment gave them better handling and balance of the haptic interface under test, and some users explicitly linked that improvement to seeing their real body.

With the think-aloud method it was possible to analyse the degree of presence through the factors of control, focus, immersion and involvement, although an integrated VR/MR questionnaire inside the virtual environment could also have been developed to obtain quantitative results; this is one of the improvements the researchers aim to pursue in future work. This points to a field still open to exploration, not only for measuring presence and evaluating immersion and involvement in mixed reality systems, but also for different applications, for instance changing scenarios and adding tasks, enabling future work in this new pervasive virtuality MR paradigm with a real, inexpensive and commonplace object such as a bicycle.