
1 Introduction

In a Virtual Reality (VR) system, users are immersed in a synthetic environment and surrounded by virtual objects. Traditional human-computer interfaces such as the mouse and keyboard are no longer adequate, and researchers have been actively investigating new ways for humans and machines to interact and communicate. Tangible User Interfaces, which allow users to interact with the virtual world through physical objects and thereby bridge the gap between the real and virtual worlds, show great potential in this area.

This paper presents the design of a novel and natural spatial human-system interface, MagicPad HD, intended for use in a fully immersive virtual environment. To realize the overall design and study the concept of the spatial human-system interface, the immersive Virtual Reality system imseCAVE [1] serves as the test bed for the MagicPad user interface. MagicPad HD is designed and implemented on top of two previous MagicPad interfaces: MagicPad Light [2] and MagicPad AR [3]. MagicPad Light creates the magic of projecting images onto a piece of paper and was the first attempt to realize the “pen and paper” metaphor as a 3D user interface tool; its light weight and the interaction with the infrared pen are its major advantages, but its limited resolution constrains its applications. MagicPad AR, a handheld window to the virtual world, has successfully demonstrated its intuitiveness and capability to thousands of users in different exhibitions [4]. Combining the two, MagicPad HD uses a high-resolution tablet and a capacitive pen to bring together the best of both designs. The high-resolution tablet compensates for the limited resolution of the projected images found in most projector-based VR systems, and the capacitive pen provides a flexible tool for writing on the tablet and interacting with 3D virtual objects simultaneously. A user study indicates that the MagicPad HD interface is highly effective and intuitive. It also provides strong evidence for the effectiveness of using familiar objects as tangible devices for the 3D user interface of an immersive virtual reality system.

2 Background

2.1 Virtual Reality System

The Cave Automatic Virtual Environment (CAVE) is one of the most famous VR systems and was originally developed by the Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago. It produces a three-dimensional stereo effect by displaying, in alternating succession, the left- and right-eye views of the scene as rendered from the viewer’s perspective [5]. These views are seen through a pair of LCD shutter glasses whose lenses open and close at high frequency in synchronization with the left- and right-eye views projected by cathode-ray projectors onto translucent walls and a floor that act as screens. In addition to 3D stereoscopic vision, motion parallax, the apparent displacement of objects as the viewer moves (objects closer to the eyes appear to move faster than objects far away), is another very important cue for perceiving objects in 3D space. In an immersive virtual reality system it is at least as important as stereoscopic vision, especially for 3D objects close to the user. For example, if a user looks at a piece of virtual furniture in a CAVE, the user should be able to move around and examine the furniture from different angles, and the system should produce the image from the correct perspective corresponding to the relative position of the user and the virtual furniture. Because of this motion parallax cue, the user has the illusion that the furniture exists in the same 3D space as the user. To achieve this cue, the system has to know the position and orientation of the user’s head in real time, and this information has to be provided by a motion tracking system. As a result, the user can be immersed in the virtual environment and interact with virtual entities realistically.
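To make the role of head tracking concrete, the short sketch below (Python with NumPy, not part of the imseCAVE software) computes how far the viewing direction to a near and a far object shifts when the tracked head moves 20 cm sideways; the numbers illustrate why motion parallax matters most for nearby objects. All positions and distances are illustrative assumptions.

```python
import numpy as np

def bearing(obj, head):
    """Horizontal viewing angle (radians) from the tracked head to an object."""
    d = obj - head
    return np.arctan2(d[0], -d[2])   # angle around the vertical axis

head_a = np.array([0.0, 1.6, 2.0])                 # initial head position (m)
head_b = head_a + np.array([0.2, 0.0, 0.0])        # head moved 20 cm to the right

near_obj = np.array([0.0, 1.2, 1.0])               # about 1 m in front of the head
far_obj = np.array([0.0, 1.2, -8.0])               # about 10 m in front of the head

for name, obj in [("near", near_obj), ("far", far_obj)]:
    shift = abs(bearing(obj, head_b) - bearing(obj, head_a))
    print(f"{name} object apparent shift: {np.degrees(shift):.2f} deg")
# The near object shifts by a much larger visual angle (~11 deg) than the far
# one (~1 deg), which is the motion-parallax cue that head tracking enables.
```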

Since the CAVE system provides a highly versatile platform for visualizing complex concepts and systems in 3D, it has been deployed to explore new statistical graphics applications [6], simulate complex molecular dynamics and interactions between atomic particles [7], explore and analyze archaeological sites virtually [8], perform assembly planning [9], and support collaborative product design and development [10].

2.2 MagicPad User Interface

Virtual Reality opens a new door for us to explore the world of imagination. Unlike the real world, however, we cannot touch virtual objects with our hands; even the latest haptic technology is nothing compared to, for instance, the tactile feeling of holding an apple. In order to interact with the virtual world, we need to develop devices and methods that serve as user interfaces to VR systems. Such an interface is called a 3D or spatial user interface.

Inspired by pen and paper, tools that we use every day, the MagicPad spatial user interface has been designed, implemented and evaluated in this research. The original aim of the MagicPad is to provide an experience similar to using pen and paper. With these familiar tools, we aim to allow a layman user, even without prior experience of VR interfaces, to use the MagicPad right away without much training. Two earlier generations of the MagicPad user interface have been developed, each with its own unique design and characteristics: MagicPad Light [2, 11, 12, 13] and MagicPad AR [3].

2.2.1 MagicPad Light

The MagicPad Light interface mainly consists of one or more flat surfaces, such as a white cardboard sheet or the pages of a sketch book. The white surface (MagicPad) acts as a display that receives the image from a ceiling-mounted projector (Fig. 1). The spatial location (position and orientation) of the MagicPad is tracked by an optical tracking system. When the MagicPad moves, the image projected onto it is updated to match its location and motion, which creates the illusion that the image is glued onto the MagicPad’s surface. In addition, the user can use an infrared pen, whose position is also tracked, to interact with the MagicPad. Since the MagicPad and the infrared pen are very light-weight devices, the user can easily hold them and perform a series of 3D interactions with the virtual environment.

Fig. 1. The design concept and hardware configuration of the MagicPad Light interface

As the image projected on the MagicPad is generated in real time, the use of the MagicPad can be very diverse. For example, it can be used as a 3D painting tool to paint freely in 3D space, or as a tangible tool that exists in both the virtual and real worlds for interacting with or examining the 3D virtual environment. However, the image quality of the MagicPad Light is limited by the resolution and brightness of the projector. In addition, the workable area is constrained by the coverage of the projected image.
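As an illustration of how such a projection could be kept aligned with a moving pad, the following sketch (using OpenCV, and not the authors' implementation) projects the pad's four corners into a calibrated projector image from the tracked pad pose and warps the content onto that quadrilateral. The intrinsics, pad size and function names are assumptions made for the example; the pad pose must be expressed in the projector's coordinate frame.

```python
import cv2
import numpy as np

PROJ_W, PROJ_H = 1024, 768            # assumed projector resolution
PAD_W, PAD_H = 0.297, 0.21            # assumed pad size in metres (A4 landscape)

K = np.array([[1400.0, 0.0, PROJ_W / 2],
              [0.0, 1400.0, PROJ_H / 2],
              [0.0, 0.0, 1.0]])        # assumed projector intrinsics
dist = np.zeros(5)                     # assume negligible lens distortion

def render_on_pad(content, pad_rvec, pad_tvec):
    """Warp `content` (H x W x 3 image) into the projector framebuffer so it
    lands on the pad surface; pad_rvec/pad_tvec is the tracked pad pose
    expressed in the projector's coordinate frame."""
    # Pad corners in the pad's own coordinate frame (metres).
    corners = np.array([[0, 0, 0], [PAD_W, 0, 0],
                        [PAD_W, PAD_H, 0], [0, PAD_H, 0]], dtype=np.float32)
    # Where those corners fall in the projector image for the current pose.
    px, _ = cv2.projectPoints(corners, pad_rvec, pad_tvec, K, dist)
    px = px.reshape(4, 2).astype(np.float32)
    # Map the content image onto that quadrilateral.
    h, w = content.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    H = cv2.getPerspectiveTransform(src, px)
    return cv2.warpPerspective(content, H, (PROJ_W, PROJ_H))
```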

2.2.2 MagicPad AR

Given the limitations of MagicPad Light, the design of MagicPad AR focuses on image quality and coverage. As with MagicPad Light, the MagicPad is tracked by an optical tracking system with a number of infrared cameras. Unlike MagicPad Light, the infrared pen is removed and the paper pad is replaced by a tablet device. By using a tablet instead of a projector, the image quality, in terms of resolution, brightness and contrast, is significantly improved. Figure 2 shows a typical setting of MagicPad AR: the tablet acts as a window to the virtual world, and the user is free to walk around within the tracked area and explore the virtual world through the tablet.
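One simple way to drive such a “window” rendering is to turn the tablet's tracked pose directly into the renderer's view matrix each frame. The sketch below shows this idea; it is a generic illustration rather than the MagicPad AR code, and the pose values are placeholders.

```python
import numpy as np

def view_matrix(tablet_pos, tablet_rot):
    """Build a 4x4 view matrix from the tablet's tracked position (3-vector)
    and orientation (3x3 rotation whose columns are the camera's right, up
    and backward axes in world coordinates)."""
    view = np.eye(4)
    view[:3, :3] = tablet_rot.T               # inverse of a rotation is its transpose
    view[:3, 3] = -tablet_rot.T @ tablet_pos  # move the world into camera space
    return view

# Each frame: read the latest pose from the optical tracker and re-render.
tablet_pos = np.array([0.5, 1.4, 1.0])        # example tracked position (m)
tablet_rot = np.eye(3)                        # example tracked orientation
V = view_matrix(tablet_pos, tablet_rot)       # pass V to the scene renderer
```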

Fig. 2. The design concept of the MagicPad AR

3 MagicPad HD

Following the successful implementation of and positive response to MagicPad AR, a further enhanced version has been proposed. The overall concept of MagicPad is to use familiar tools (pen and paper) as the tangible user interface in order to shorten the learning curve of new users, while providing an effective medium for interacting with virtual objects in a VR system.

3.1 Design Concept

Although MagicPad AR has proved to be an intuitive user interface, the lack of a pen-like device limits its interaction capability compared with MagicPad Light. The design of the MagicPad HD interface therefore combines the best of MagicPad Light and MagicPad AR: it uses a high-resolution tablet as the MagicPad and a custom-made pen-like device as the MagicPen. The MagicPen can write both on the tablet and in 3D space. According to the experiments with MagicPad Light, a bigger MagicPad provides a larger usable area and a bigger window to the virtual world; however, a bigger tablet also weighs more, so a balance between size and weight is needed.

3.2 System Implementation

MagicPad HD is built on top of the imseCAVE and therefore shares the benefits of its effective tracking system and immersive virtual environment. Figure 3 shows the design of MagicPad HD. A SONY Xperia™ Tablet Z was chosen as the MagicPad. The tablet weighs only 495 g, making it one of the lightest tablets available on the market, and its 10.1-inch screen has HD resolution (1920 × 1200). Three retro-reflective markers are attached at the corners for motion tracking. As the user can only use one hand to hold the MagicPad, a metal ring is installed at the back of the tablet to help the user grasp it firmly, as shown in Fig. 4. The MagicPen has a capacitive tip so that it can write directly on the tablet’s capacitive touchscreen; a thin strip of aluminium foil on the pen tube conducts the electric current from the user’s finger to the tip. A custom-made tracker with four retro-reflective markers extends from the pen tube and is used to track the MagicPen’s position and orientation. The MagicPen is connected to the system wirelessly via a USB receiver plugged into the main computer, so the user can trigger different functions by pressing the two middle red buttons.
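Because the capacitive tip sits at a fixed offset from the tracked marker assembly, the tip position (and a pointing ray for selecting virtual objects) can be derived from the rigid-body pose reported by the tracker. The sketch below illustrates this; the offset value and function names are assumptions, not the system's actual calibration.

```python
import numpy as np

# Assumed calibration: tip is 15 cm from the tracked rigid body, along the
# pen's local -Z axis.
TIP_OFFSET = np.array([0.0, 0.0, -0.15])

def pen_tip_world(R, t):
    """Position of the MagicPen tip in world coordinates, given the tracker's
    rigid-body rotation matrix R and translation t."""
    return R @ TIP_OFFSET + t

def pen_ray_world(R, t):
    """Origin and direction of the pen's pointing ray, e.g. for picking a
    floating virtual paper."""
    direction = R @ np.array([0.0, 0.0, -1.0])   # pen's local -Z axis in world space
    return pen_tip_world(R, t), direction / np.linalg.norm(direction)
```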

Fig. 3. MagicPad and MagicPen in the MagicPad HD interface (Color figure online)

Fig. 4. Holding the MagicPad with one hand with the aid of the ring

As with MagicPad AR, the tablet’s processing capability is not powerful enough to render the virtual scene efficiently. The images displayed on the MagicPad are therefore generated by an additional workstation and streamed to the tablet wirelessly.
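The paper does not detail the streaming mechanism, but a minimal sketch of one workable approach is shown below: the workstation JPEG-encodes each rendered frame and pushes it to the tablet over TCP with a length prefix. The host address, port and frame source are illustrative assumptions.

```python
import socket
import struct
import cv2

def stream_frames(render_frame, host="192.168.1.50", port=9000, quality=80):
    """Send frames produced by `render_frame()` (H x W x 3 uint8 images) to a
    receiver running on the tablet, as length-prefixed JPEGs over TCP."""
    sock = socket.create_connection((host, port))
    try:
        while True:
            frame = render_frame()
            ok, buf = cv2.imencode(".jpg", frame,
                                   [cv2.IMWRITE_JPEG_QUALITY, quality])
            if not ok:
                continue
            data = buf.tobytes()
            sock.sendall(struct.pack(">I", len(data)) + data)  # 4-byte length prefix
    finally:
        sock.close()
```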

In actual operation, the MagicPad HD has two modes: Paint mode and Camera mode (Fig. 5). In Paint mode, the user can use the MagicPen to write on the MagicPad, as shown in Fig. 6. After writing, the user can swipe the lower-right “Detach” button and the system creates a virtual paper carrying the drawing; the screen of the MagicPad is then cleared for another drawing. As the virtual paper floats in the air, the user can use the MagicPen to grab and relocate it by pressing Button 1 (Fig. 3) while intersecting the virtual paper with the MagicPen. In addition, the MagicPen can be used to paint freely in 3D space by pressing the second red button, as shown in Fig. 7. In Camera mode, the MagicPad acts as a window to the virtual world: it displays the same content as the virtual environment from the user’s perspective, but at a much higher resolution than the image from the projectors of the VR system, as shown in Fig. 8. This is a very useful compensation for the limited resolution of the projected image, especially in applications that require close examination of virtual objects. Similar to detaching a drawing, the user can swipe the lower-right “Capture” button to create a virtual photo.
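The grab-and-move behaviour in Paint mode can be summarized as a small piece of per-frame logic: while Button 1 is held and the pen tip lies inside a floating virtual paper, the paper follows the pen. The sketch below illustrates this with simplified axis-aligned bounds; class and field names are illustrative and not taken from the authors' code.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class VirtualPaper:
    position: np.ndarray            # centre of the paper in world space (m)
    half_size: float = 0.15         # half the paper's edge length (m)
    grab_offset: np.ndarray = None  # set while the paper is being grabbed

    def contains(self, point):
        """Simplified axis-aligned test for whether the pen tip is inside."""
        return bool(np.all(np.abs(point - self.position) <= self.half_size))

def update_grab(papers, pen_tip, button1_down):
    """Per-frame logic: papers under the pen tip follow the pen while Button 1
    is held, and are released where they are when the button is let go."""
    for paper in papers:
        if button1_down:
            if paper.grab_offset is None and paper.contains(pen_tip):
                paper.grab_offset = paper.position - pen_tip   # start grabbing
            if paper.grab_offset is not None:
                paper.position = pen_tip + paper.grab_offset   # follow the pen
        else:
            paper.grab_offset = None                           # release
```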

Fig. 5. Paint mode (left) and Camera mode (right) of the MagicPad HD

Fig. 6. Paint mode: the user writes on the MagicPad and detaches a virtual paper

Fig. 7. Paint mode: the user draws freely in 3D space (Color figure online)

Fig. 8. Camera mode: higher resolution image displayed on the MagicPad

3.3 User Study

In order to test the usability of the MagicPad HD, an experiment has been designed and a corresponding user study has been carried out.

3.3.1 Participants

Eighty-two participants were recruited via referral by colleagues and friends. Most of them were university students, and the rest were people from outside the university across various age groups. All of them had experience of using computers.

3.3.2 Experimental Procedures

There are two experiments: the Real and Virtual Task Comparison Test and the Virtual Task Test. The Real and Virtual Task Comparison Test consists of performing similar tasks in both the real and the virtual environment. The Virtual Task Test consists of tasks performed only in the imseCAVE environment. The environment settings for the real and virtual tasks are similar, as illustrated in Fig. 9.

Fig. 9. Environment setting for real (left) and virtual (right) tasks

In the Real and Virtual Task Comparison Test (Fig. 10), the participant performs a task in both the real and the virtual environment. In the real environment, the participant writes a letter on a memo pad and sticks it at a specific position on the doors of a cabinet. The participant is then requested to perform a similar task in the virtual environment with the MagicPad HD interface. To become accustomed to the virtual environment, the participant spends a couple of minutes walking around it before performing the virtual task. The time taken for both the real and the virtual task is recorded.

Fig. 10. Performing the letter writing task in the real (top images) and virtual (bottom images) environment

In the Virtual Task Test, the participant has to perform three different virtual tasks: Camera Tool, 3D Paint and Physics Game. In the Camera Tool task, the participant is first briefed on the usage of the Camera Tool and is then asked to read and follow the instructions (Fig. 11) in the virtual notepad on the table. As the print on the virtual notepad is relatively small, the participant needs to take advantage of the higher resolution of the MagicPad in order to read the instructions. Following the instructions, the participant takes a photo (Fig. 8) and moves the virtual photo to the photo frame.

Fig. 11. Instructions for the Camera Tool task and a participant placing the virtual photo on the photo frame

In the 3D Paint task, the experimenter briefs the participant on the usage of the MagicPad’s 3D Paint mode. After practicing, the participant is asked to draw five specific letters (A to E) onto five semi-transparent boxes, as shown in Fig. 12. The time taken by the participant to complete the task is recorded.

Fig. 12. Participant performing the 3D Paint task

Finally, in the Physics Game task (Fig. 13), the experimenter triggers a series of virtual soccer balls to drop from the ceiling. The participant is asked to create virtual papers and arrange them into a track that guides the falling soccer balls into a virtual wooden bin. The soccer balls are generated automatically by the computer, and the experimenter records the time taken, from the start of the task, for the participant to successfully guide two consecutive soccer balls into the wooden bin.

Fig. 13. Physics Game task

4 Discussion and Conclusion

4.1 Evaluation

The results are summarized in Tables 1 and 2 and in Fig. 14.

Table 1. Average completion time for the different tasks (82 participants)
Table 2. Questionnaire results showing the average scores of all participants
Fig. 14. Line chart of task completion time vs. participants for the different tasks

In Table 1, the average completion times of the Real Drawing and the Virtual 2D Drawing tasks are almost the same (38.78 s vs. 38.83 s), and the difference is not statistically significant according to an ANOVA (F(1, 81) = 0.002, n.s.), which indicates that Virtual 2D Drawing is as effective as Real Drawing. In fact, participants took even less time to complete the 3D Drawing task: its mean completion time is 19.6 s, and the difference from Real Drawing is statistically significant (F(1, 81) = 754.263, p < .05). This is mainly because drawing directly in 3D space is more effective than 2D drawing, as it eliminates the need to reposition the paper. It is also part of the reason why participants consider the MagicPad interface to provide additional capabilities for performing tasks beyond what is possible in the real-world environment (Question 2f in Table 2). In contrast, the results of the Physics Game are more diverse, with an average completion time of 57.5 s and a much larger standard deviation of 37.8 s. Participants also report that the Physics Game task is relatively harder to complete than the other tasks (Question 2c in Table 2). This is because the task is more complex and requires different skills, such as problem solving, visual-spatial ability and hand-eye coordination; these factors vary greatly between participants and cause the large differences in performance.
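For readers who wish to reproduce such a comparison from the raw per-participant times, note that with one within-subject factor and two conditions the repeated-measures F(1, 81) equals the square of a paired t statistic. The sketch below shows the computation with SciPy on placeholder data; it is not the authors' analysis script.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)                      # placeholder data only
real_times = rng.normal(38.78, 8.0, size=82)        # real drawing times (s)
virtual_times = rng.normal(38.83, 8.0, size=82)     # virtual 2D drawing times (s)

t, p = stats.ttest_rel(real_times, virtual_times)   # paired (within-subject) test
F = t ** 2                                          # F(1, n-1) with n = 82 participants
print(f"F(1,81) = {F:.3f}, p = {p:.3f}")
```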

In Table 2, the lowest score (3.5, Question 1a) in the questionnaire concerns the similarity of the real and virtual tasks in the Real and Virtual Task Comparison Test. Although the procedures of the two tasks are similar, the participants do not consider the experiences to be very alike. At this stage, virtual reality technology clearly cannot replicate the real experience completely, especially where the sense of touch is involved, such as the feeling of writing on paper or sticking a piece of paper onto the cabinet. For the Camera Tool, participants agree that the high resolution of the tablet (MagicPad) compensates for the limited resolution of the projected images (Question 2a). This is particularly important if the virtual scene contains a lot of detailed information. For example, a training system may require users to examine the detailed structure of a machine; with limited resolution, the only way to make close examination possible is to enlarge the object, which is neither intuitive nor a reflection of the real-world situation. With the high resolution of the MagicPad, the user has a very effective tool for exploration and investigation in a virtual environment. Moreover, the Camera Tool task was rated the easiest virtual task (Question 2a), which is in line with the comments from users of the MagicPad AR; the Camera Tool is essentially the same as the MagicPad AR discussed in the previous section. Overall, participants had a lot of fun performing the tasks in the virtual environment (Question 3a); they felt great and excited about being able to interact with virtual objects in a synthetic environment, particularly in the Physics Game task. Virtual reality is an effective tool for arousing the interest of users. In addition, the participants agree that using familiar objects (in this case, a pen and a writing pad) as user interface tools is an effective way to learn new means of interaction (Question 3b). This finding supports the original hypothesis behind the MagicPad user interface. Question 3c asked whether the participants felt comfortable performing tasks in the virtual environment. The result (3.8, between normal (3) and comfortable) is actually better than expected; the major complaints concern the dazzling visual effect and the weight of the tablet.

4.2 Conclusion

One idea, three implementations: the motivation of creating an intuitive spatial user interface for immersive virtual environments has resulted in three generations of the MagicPad user interface: MagicPad Light, MagicPad AR and the latest MagicPad HD. MagicPad Light creates the magic of projecting images onto a piece of paper and was the first attempt to realize the “pen and paper” metaphor as a 3D user interface tool; its light weight and the interaction with the infrared pen are its major advantages, but its limited resolution constrains its applications. MagicPad AR, a handheld window to the virtual world, has successfully demonstrated its intuitiveness and capability to thousands of users in different exhibitions. Combining the two, MagicPad HD uses a high-resolution tablet and a capacitive pen to bring together the best of both designs. The high-resolution tablet compensates for the limited resolution of projected images found in most VR systems, and the capacitive pen provides a flexible tool for writing on the tablet and interacting with 3D virtual objects simultaneously. The user study shows that this 3D user interface is highly effective and intuitive.

This paper has documented the journey of developing the MagicPad spatial user interface. In the future, this research can proceed in two directions. The first is to further study the performance of the MagicPad interface in different application domains and in different VR systems, such as head-mounted display systems. The second is to further improve the MagicPad hardware, for example by incorporating new flexible display materials to make a lighter, bigger or even foldable screen. Moreover, realistic haptic feedback for the MagicPen would be incredibly useful, especially when it is used to interact with virtual objects; however, high-fidelity haptic systems often involve bulky mechanical or maglev hardware, and incorporating useful haptics at pen scale remains a research challenge.