1 Introduction

Web technologies have recently become widely used because they reduce costs and support multiple users without constraints of space or time. Interior design systems built on web technologies have emerged: Floorplan [1] is a web-based indoor interior design system that provides a 2D floor plan and 3D modeling results based on that floor plan. Valkov et al. [2] implemented an immersive virtual reality (VR) space using a projector and screen, and de Guimarães et al. [3] proposed a method that leverages web technology on mobile devices to arrange furniture. Recently, smartphone-based head-mounted display (HMD) applications for VR have become popular, and WebVR [4] has been introduced to implement VR applications on the web. However, WebVR is still in its early stages, and its interaction methods are limited compared with those of traditional VR development environments. We propose an interactive VR system for interior design based on web technologies; the system supports 3D and VR views with various interaction methods. Interaction devices are webized and connected to the system so that interaction events are handled through a web browser.

2 Our Method

We have designed the system architecture shown in Fig. 1. The proposed system provides a 3D view for determining an interior layout and a stereoscopic VR view for looking around the interior immersively. The content manager serves web content such as HTML documents, multimedia content, and 3D model data to the webized content renderer running in the web browser. The system handles users’ interaction events via the webized interaction method for human interface devices and events [5]. This method provides a device-independent interface between VR applications and physical peripherals, and it uses an event negotiation mechanism to deliver user interaction events at different levels of abstraction. When a user triggers an event with an interaction device, the webized device event server generates an event message and sends it to the webized device event library, which is written in JavaScript. The communication handler sends the user’s events to other users and receives their events in return. The webized content renderer then applies both the local user’s events and other users’ events to the content and visualizes the final rendering result according to the rendering mode.
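The event flow above can be sketched in JavaScript. This is a minimal illustration, not the paper's actual API: the message fields (`device`, `gesture`, `position`) and the `dispatch`/`relay` callbacks are assumptions standing in for the webized content renderer and the communication handler.

```javascript
// Parse a raw JSON message from the webized device event server into an
// abstract interaction event (field names are illustrative assumptions).
function parseDeviceMessage(json) {
  const msg = JSON.parse(json);
  return {
    device: msg.device,              // e.g. "leap-motion", "balance-board"
    gesture: msg.gesture,            // e.g. "pinch", "clench", "lean"
    position: msg.position || null,  // optional 3D coordinates
    timestamp: msg.timestamp || Date.now(),
  };
}

// Library-side handling: apply the event locally, then relay it to other
// users through the communication handler.
function handleDeviceMessage(json, { dispatch, relay }) {
  const event = parseDeviceMessage(json);
  dispatch(event); // webized content renderer applies the event
  relay(event);    // communication handler forwards it to other users
  return event;
}
```

In a real deployment the JSON messages would arrive over a persistent connection (e.g. a WebSocket) from the webized device event server, and `relay` would push them to the other participants' browsers.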

Fig. 1.

Overview of the architecture

We define the gesture-based user interaction events shown in Table 1 for interacting with the content in an HMD VR environment. In the VR view, a user employs hand gestures to select menu items and place 3D furniture models, and foot gestures to walk around the designed space. The user can clench and open his/her hand to open a menu, and touching a menu item with any finger selects it. Pinch-and-drag interactions allow the user to move furniture around. The user navigates the designed space by shifting his/her center of gravity, which is detected from changes in foot position.

Table 1. User gestures and interaction events in an HMD VR environment
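The hand gestures described above can be recognized with a simple classifier over the tracked hand state. The sketch below is illustrative only: the field names (`extendedFingers`, `pinchStrength`, `touchedItem`) and the thresholds are assumptions, not the paper's implementation.

```javascript
// Map raw hand-tracking state to the gesture events of Table 1
// (thresholds and field names are assumptions for illustration).
function classifyHandGesture(hand) {
  if (hand.extendedFingers === 0) return "clench"; // opens the menu
  if (hand.pinchStrength > 0.8) return "pinch";    // grabs furniture to drag
  if (hand.touchedItem) return "touch";            // selects a menu item
  return "open";                                   // neutral open hand
}
```

Ordering matters here: a clenched fist also produces a high pinch strength on some trackers, so the clench test runs first.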

3 Experimental Results

We present an example interior design task carried out with a prototype implementation to verify the usefulness of our approach. Figure 2 shows the 3D view of an example scenario in a desktop environment. In this scenario, a bride designs the interior of her marital home and shares it with the bridegroom. She uses a mouse to define a virtual interior design space by moving boundary points in the 2D floor plan view, as shown in Fig. 2(a), and selects a menu to arrange interior furniture models in the top view, as shown in Fig. 2(b). When she chooses the 3D front view to inspect the interior design immersively, the system changes its view accordingly, as shown in Fig. 2(c). She can reposition furniture in the interior space. Finally, she can share the designed space with the bridegroom by clicking the e-mail button on the menu, as shown in Fig. 2(d); the system then sends an e-mail to the bridegroom.

Fig. 2.

Prototype examples of the desktop environment: (a) the 2D floor plan view, (b) the 3D top view, (c) the 3D front view, and (d) sharing the floor plan with others via e-mail.
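The boundary-point editing in the 2D floor plan view amounts to dragging one vertex of a polygon. A minimal sketch, assuming the floor plan is stored as a list of `{x, y}` boundary points (a data shape we assume for illustration; the prototype's actual representation is not described):

```javascript
// Move one boundary point of the floor-plan polygon by (dx, dy).
// Returns a new floor plan rather than mutating the original, so the
// updated plan can be re-rendered and shared as a fresh state.
function movePoint(floorPlan, index, dx, dy) {
  const points = floorPlan.points.map((p, i) =>
    i === index ? { x: p.x + dx, y: p.y + dy } : p);
  return { ...floorPlan, points };
}
```

In the desktop view, mouse-drag deltas would feed `dx` and `dy`; in a shared session, the same update could be relayed to the other user as an event.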

To view the shared interior design from Fig. 2(d), the bridegroom reads the e-mail from his bride and clicks the link within it to open the interior design in an HMD VR environment. The prototype system uses a Leap Motion controller to recognize hand gestures and a Wii Balance Board to capture pressure positions for foot gestures, as shown in Fig. 3(a). When he puts on the HMD, he sees the interior design and a virtual hand with which to interact with it, as shown in Fig. 3(b).

Fig. 3.

Overview of HMD VR environment: (a) interaction devices for hand and foot gestures and (b) a screenshot of the VR interior design content with a hand gesture.
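Foot-gesture navigation can be derived from the board's pressure readings. The Wii Balance Board reports the load on its four corner sensors; the offset of the center of pressure from the board's center gives a walking direction. The sketch below is an assumption-laden illustration (sensor names, normalization, and the dead-zone value are ours, not the paper's):

```javascript
// Compute the normalized center of pressure from the four corner loads.
// Result components lie in [-1, 1]; x is right-positive, y is forward-positive.
function centerOfPressure({ topLeft, topRight, bottomLeft, bottomRight }) {
  const total = topLeft + topRight + bottomLeft + bottomRight;
  if (total === 0) return { x: 0, y: 0 }; // nobody on the board
  const x = ((topRight + bottomRight) - (topLeft + bottomLeft)) / total;
  const y = ((topLeft + topRight) - (bottomLeft + bottomRight)) / total;
  return { x, y };
}

// Turn the center of pressure into a navigation vector, ignoring small
// postural sway inside an assumed dead zone.
function toNavigation(cop, deadZone = 0.1) {
  return Math.hypot(cop.x, cop.y) < deadZone ? null : cop;
}
```

Leaning forward shifts weight to the front sensors, yielding a positive `y` and moving the viewpoint forward in the designed space.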

The bridegroom navigates the interior design with foot gestures and modifies the space with hand gestures, as shown in Fig. 4. He can call up a menu widget and touch a button to add furniture, as shown in Fig. 4(a), touch an item in the list to choose a furniture model, as shown in Fig. 4(b), and arrange the selected furniture in the interior space, as shown in Fig. 4(c). After arranging the furniture, he can switch to the front view to navigate the designed space, as shown in Fig. 4(d).

Fig. 4.

Prototype implementation of the proposed system in an HMD VR environment
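The pinch-and-drag furniture arrangement can be sketched as follows: while the pinch is held, the grabbed model follows the hand's displacement from the point where the pinch began. The data shape and function name are assumptions for illustration, not the prototype's code.

```javascript
// Move a grabbed furniture model by the hand's displacement since the
// pinch started. Returns an updated copy of the model's placement.
function dragFurniture(model, grabPos, handPos) {
  return {
    ...model,
    x: model.x + (handPos.x - grabPos.x),
    y: model.y + (handPos.y - grabPos.y),
    z: model.z + (handPos.z - grabPos.z),
  };
}
```

On each tracking frame while the pinch persists, the renderer would call this with the latest hand position and redraw; releasing the pinch commits the final placement.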

4 Conclusion

In this paper, we proposed an interactive VR system for interior design based on web technologies. The proposed system provides 3D and immersive VR views with various interaction events driven by hand and foot gestures on the web. Web developers can use the proposed system to create VR applications easily and share them simply by delivering a URL, for example over a social network or instant messenger, without any cumbersome installation process. In future work, we will conduct user studies to evaluate the proposed system and will then support real-time content and view synchronization to enable collaborative work.