1 Introduction

There is a demand for navigation systems that intuitively support users while walking. Conventional navigation systems such as Google Maps [4] support the user through the steps illustrated in Fig. 1. Before walking, the user sets a destination (S1) and selects a route suggested by the system (S2). While walking, the user checks their current position on the map (S3) and searches for landmarks in the real world (S4). If the user understands their current position in the real world, they can determine their travel direction (S5). Otherwise, the user laboriously repeats S3 and S4 until they correctly understand their current position. While repeating these steps, the user frequently moves their gaze and head between the system and the landmarks, as illustrated in Fig. 2. These movements impose a physical burden on the user. In particular, existing methods [6, 8, 13] that use global navigation satellite systems and digital maps on mobile terminals impose this burden because they require the user to repeat these movements.

In this paper, we tackle the challenging problem of decreasing this physical burden by reducing gaze and head movements with a novel navigation system. To reduce these movements, the system must intuitively indicate the travel direction to the user. Existing methods [2, 10] use simplified maps or short messages to enhance navigation. However, these methods do not always reduce head and gaze movements because the user must still read the simplified maps or messages.

Here, we consider a situation in which a guide indicates the travel direction by always walking in front of the user. In this situation, we believe that gaze and head movements are reduced because the user simply follows the guide. When the attendees of a conference go to a banquet venue, for example, following a guide is the easiest way to reach the venue smoothly. We call this the following effect. This effect is known in the fields of car navigation [9] and indoor navigation [14]. We show that a virtual guide provides the following effect when the user walks in an outdoor environment.

Fig. 1.

Overview of a navigation system.

Fig. 2.

Searching for landmarks.

In this paper, we show experimentally that the following effect decreases the physical burden by measuring the gaze and head movements of a walking user. Furthermore, we developed a navigation system that uses a virtual guide superimposed on the real world through a head-mounted display (HMD) and evaluated the reduction in movement. The rest of the paper is organized as follows. Section 2 shows that the physical burden on the user is reduced by the following effect when the user walks with a real guide. Section 3 presents the evaluation of a virtual guide with an HMD. Our concluding remarks are given in Sect. 4.

2 Following Effect Caused by a Real Guide

2.1 Comparison of Conventional Navigation Methods

To observe the following effect, we measured the head pose and gaze direction of users while walking. We compared the five navigation methods illustrated in Fig. 3.

  • N1: (Real guide) The user followed a real guide who was always walking in front of the user.

  • N2: (Signpost) The user saw signposts set at junctions.

  • N3: (Mobile terminal) The user could check their current position on a mobile terminal at any time while walking.

  • N4: (Combination) The user used a combination of N2 and N3.

  • N5: (Map) The user carried only a real map printed on paper.

Fig. 3.

Navigation methods that were compared to evaluate the following effect.

2.2 Evaluation Protocol of the Real Guide

In N1, a real guide, who fully understood the route and the map, walked about 2 m in front of the user. The guide indicated the travel direction at junctions by bending forward, without talking to the user, walked at a constant speed, and checked at intervals of 30 s that the user was following behind. In N2, signposts at junctions showed the travel direction using arrows, and the user was able to see the signposts from a distance. In N3, the user carried a mobile terminal (SONY XPERIA mini) and could check their current position using a map application at any time during the journey. In N4, the user saw the signposts of N2 at junctions and carried the mobile terminal of N3. In N5, the user carried a printed map showing the route and destination, which they could consult at any time during the journey. Note that we provided the map to the user in all conditions, N1 to N5. Four participants (average age ± standard deviation of 22.0 ± 1.2, two male, two female) acted as users. We prepared five routes (each 600 m long with six junctions), as illustrated in Fig. 4. We randomly assigned the route and the destination to each user in our evaluation.

Fig. 4.

Maps and routes used in the experiments (S denotes the start, D the destination, and the heavy line the route).

To measure the gaze direction while walking, we used a wearable device (Takei TalkEye Lite), shown in Fig. 5(i), with a sampling rate of 30 Hz. To measure the head pose, we used a wearable device (MicroStrain 3DM-GX-25), shown in Fig. 5(ii), which measures acceleration, angular velocity, and the magnetic field at a sampling rate of 100 Hz. The gaze direction is given on the x and y axes of Fig. 6(i), and the head pose is represented by the roll, pitch, and yaw axes of Fig. 6(ii). The origin of the axes corresponds to the state in which the user stands upright and looks straight ahead; we measured angles relative to this origin.
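The paper does not detail how the inertial readings are fused into a head pose. As an illustration only, a complementary filter is one common way to combine accelerometer and gyroscope data from a device such as the 3DM-GX-25 into a pitch estimate; the function name, sign conventions, and smoothing factor below are our assumptions, not the device's actual processing.

```python
import math

def complementary_pitch(accel, gyro_pitch_rate, prev_pitch, dt, alpha=0.98):
    """Fuse accelerometer and gyroscope readings into a pitch estimate (degrees).

    accel: (ax, ay, az) in units of g; gyro_pitch_rate: deg/s about the pitch
    axis; prev_pitch: previous estimate in degrees; dt: sample period in seconds.
    """
    ax, ay, az = accel
    # Pitch implied by the gravity direction (valid when the user is not
    # accelerating strongly, e.g., walking at a steady pace).
    accel_pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    # Integrate the gyro for short-term accuracy; lean on the accelerometer
    # over the long term to cancel gyro drift.
    return alpha * (prev_pitch + gyro_pitch_rate * dt) + (1 - alpha) * accel_pitch

# At 100 Hz, a head held level (gravity on the z axis) stays near 0 degrees.
pitch = 0.0
for _ in range(100):
    pitch = complementary_pitch((0.0, 0.0, 1.0), 0.0, pitch, dt=0.01)
```

With a stationary head tilted by a fixed angle, the estimate converges geometrically to that angle at a rate set by `alpha`.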

Fig. 5.

Equipment for measuring the gaze direction and head pose.

Fig. 6.

Rotation axes for representing the gaze direction and head pose.

2.3 Evaluation with a Real Guide

Changes in Gaze Direction. Figure 7 shows the distributions of gaze movements while walking. The gray level corresponds to the frequency of the gaze in that direction, and \((X,Y)=(0,0)\) indicates that the user was looking straight ahead. We normalized the number of gaze-direction samples for each user because each user walked at a different speed. The frequencies around the center direction in N1–N5 are higher than those in other directions; we attribute this to the center bias [15]. Furthermore, we see that the real guide (N1) reduced the movements; i.e., the user looked straight ahead more often for N1 than for N2–N5.

Table 1 gives the averages and standard deviations of gaze directions computed from the absolute values of the angles. The averages for N1 are smaller than those for N2–N5, and the standard deviation in the Y direction for N1 is smaller than those for N2–N5. We believe that the gaze direction is attracted to the center by the real guide.
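The exact computation behind these statistics is not given in the paper. A minimal sketch of one plausible reading, in which each user's samples are reduced first so that walking speed (and hence sample count) does not bias the result, is below; `abs_gaze_stats` and the sample values are hypothetical.

```python
import numpy as np

def abs_gaze_stats(per_user_samples):
    """Average and standard deviation of absolute gaze angles (degrees),
    weighting each user equally regardless of how many samples they produced."""
    means = [np.mean(np.abs(s)) for s in per_user_samples]
    stds = [np.std(np.abs(s)) for s in per_user_samples]
    return float(np.mean(means)), float(np.mean(stds))

# Hypothetical gaze angles (degrees) for two users who walked at different
# speeds and therefore produced different numbers of samples.
user_a = np.array([-2.0, 1.0, 0.5, -1.5])
user_b = np.array([-8.0, 4.0, 6.0, -5.0, 7.0, -3.0])
mean_abs, std_abs = abs_gaze_stats([user_a, user_b])
```

Averaging per-user statistics rather than pooling raw samples is one way to implement the normalization the paper describes.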

Fig. 7.

Distribution of gaze while walking using different navigation methods.

Table 1. Averages and standard deviations of absolute gaze directions [degrees].

Changes in Head Pose. Figure 8 shows the distribution of head movement while walking. We normalized the number of head-pose samples for each user because each user walked at a different speed, and unified the sum of the distribution for each method. We used the pitch axis to indicate upward and downward movement of the head: the head faces forward for \(R=0\), downward for \(R>0\), and upward for \(R<0\). We see that the real guide (N1) reduced the head movement; i.e., frequencies around 0\(^{\circ }\) were higher than those at other angles. We believe that the real guide reduces the vertical head movement otherwise caused by looking at the map.
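The per-user normalization and unified sum described above can be sketched as follows; the bin width and sample values are illustrative assumptions.

```python
import numpy as np

def pooled_pitch_histogram(per_user_pitch, bins):
    """Pool per-user pitch samples into one histogram, giving each user equal
    weight and normalising the total to 1 so that methods recorded over
    different walking times remain directly comparable."""
    hist = np.zeros(len(bins) - 1)
    for samples in per_user_pitch:
        counts, _ = np.histogram(samples, bins=bins)
        hist += counts / counts.sum()   # each user contributes equal weight
    return hist / hist.sum()            # unify the sum across methods

bins = np.arange(-30.0, 31.0, 5.0)      # hypothetical 5-degree bins
fast_user = np.array([0.0, 2.0, -1.0, 3.0])
slow_user = np.array([8.0, 12.0, 6.0, 9.0, 11.0, 7.0])
hist = pooled_pitch_histogram([fast_user, slow_user], bins)
```

Because every method's histogram sums to 1, peak heights near 0° can be compared across N1–N5 directly.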

Table 2. Average and standard deviation of the pitch axis of the absolute head pose [degrees].

Table 2 gives the averages and standard deviations of head poses computed from the absolute angle about the pitch axis. The average and standard deviation for N1 are lower than those for N2–N5. A previous report [5] described that the load on the cervical vertebrae increases to 2.3 kg when the head moves downward from 5\(^{\circ }\) to 10\(^{\circ }\). We believe that the head pose is attracted to the center by the real guide. Note that the average angles about the yaw axis were almost the same in N1–N5; the users sometimes moved their head horizontally to avoid obstacles while walking. The average angles about the roll axis were zero in N1–N5; the users did not tilt their head while walking.

Fig. 8.

Distributions of the head pitch for the different navigation methods.

3 Navigation System Using a Virtual Guide

3.1 Design of Our Navigation System

Head and gaze movements were reduced when using a real guide, as described in the previous section, and we thus believe that the following effect decreases the physical burden. We therefore designed a novel navigation system that uses a virtual guide superimposed on the real world. Our system displays the virtual guide to the user on an HMD, as illustrated in Fig. 9. We compared the effectiveness of our system (V1) with three alternative systems (V2–V4), as illustrated in Fig. 10.

  • V1: (Virtual guide) A virtual guide walked in front of the user to indicate the travel direction.

  • V2: (Arrow) A virtual arrow appeared at junctions to indicate the travel direction.

  • V3: (Current position) A virtual map was provided with the current position always shown.

  • V4: (Combination) The user used a combination of V2 and V3.

We provided the users with a printed map containing the destination and route in all conditions. The above systems correspond to N1–N4 described in Sect. 2.1. Virtual guides are often used in augmented reality applications [1, 11, 12], and virtual arrows have been used in a map application [7]. We investigate whether a virtual guide reduces the physical burden of users walking in an outdoor environment.

Fig. 9.

Overview of our navigation system using the virtual guide.

Fig. 10.

Navigation systems for evaluating the following effect.

3.2 Evaluation Protocol of the Virtual Guide

To evaluate the navigation systems, our system controlled the guide or the arrow using the Wizard of Oz technique [3]. We used an optical see-through HMD (EPSON MOVERIO BT-200). To stably overlay the guide or arrow on the real world while the user walked, an operator controlled the display position, as illustrated in Fig. 11. We acquired the head pose of the user using the acceleration and gyro sensors built into the HMD. The system first positioned the guide or arrow according to the head pose, and the operator then corrected the display position whenever there was an error, following 2 m behind the user and monitoring a live video sequence from a camera built into the HMD. In V1, we set the parameters of the virtual guide such that the guide was 170 cm tall and displayed about 5 m in front of the user. In V2, we displayed an arrow 5 m before the user reached a junction. In V3, we showed the map and current position such that the user's view was not obstructed.
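The paper does not describe the overlay geometry. The sketch below illustrates how a system might map the user's head pose and the guide's world-frame bearing to a screen position so that the guide appears fixed in the world; the field-of-view and resolution values are placeholders, not BT-200 specifications.

```python
import math

def guide_screen_offset(head_yaw_deg, head_pitch_deg, guide_bearing_deg,
                        guide_distance_m=5.0, guide_height_m=1.7,
                        eye_height_m=1.6, fov_h_deg=23.0, fov_v_deg=13.0,
                        screen_w=960, screen_h=540):
    """Pixel offset from the screen centre at which to draw the guide.

    Angles are in degrees; positive head pitch means looking down, matching
    the paper's convention. FOV and resolution are illustrative assumptions.
    """
    # Horizontal: signed angular difference between the guide's bearing and
    # the head yaw, wrapped into (-180, 180].
    d_yaw = (guide_bearing_deg - head_yaw_deg + 180.0) % 360.0 - 180.0
    # Vertical: elevation of the guide's head relative to eye level,
    # compensated by the user's head pitch.
    elevation = math.degrees(math.atan2(guide_height_m - eye_height_m,
                                        guide_distance_m))
    d_pitch = elevation + head_pitch_deg
    # Linear (small-angle) mapping from angles to pixels; y grows downward.
    x = d_yaw / fov_h_deg * screen_w
    y = -d_pitch / fov_v_deg * screen_h
    return x, y

# Facing the guide's bearing directly: no horizontal offset, and a small
# upward offset because the guide's head sits slightly above eye level.
x, y = guide_screen_offset(0.0, 0.0, 0.0)
```

The yaw wrap keeps the guide on the correct side of the screen when the heading crosses 0°/360°, which a naive subtraction would break.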

Fig. 11.

Operator and user with equipment for evaluation.

Twelve participants (average age ± standard deviation of 21.5 ± 0.4, ten male, two female) acted as users. We used the same routes described in Sect. 2.2 and randomly assigned the route and destination. We used the same sensor described in Sect. 2.2 to measure the head pose of the user. Note that we did not acquire the gaze direction because the gaze sensor interfered with the HMD.

3.3 Measurement of the Head Pose with the Virtual Guide

Figure 12 shows the distribution of head movement about the pitch axis while walking with navigation systems V1–V4. We used the same axes and evaluation protocol described in Sect. 2.3. The histogram for V1 peaks near the frontal direction; we believe that the virtual guide reduced the head movements of the users. We also see that the head faced downward more frequently for V2–V4 than for V1; we believe that the users checked the printed map more frequently during their journey for V2–V4 than for V1. Note that the angle about the yaw axis was concentrated around 0\(^{\circ }\), and there was no apparent difference in the angle about the roll axis.

Fig. 12.

Distributions of the head pitch while walking using the four navigation methods.

Table 3. Averages and standard deviations of absolute head poses [degrees].

Table 3 gives the averages and standard deviations of absolute angles of head poses around the pitch axis. The table confirms that the average and standard deviation were lower for V1 than for V2–V4, and changes in the vertical direction were small for V1. We observed that the virtual guide superimposed on the real world reduced the head movement of users while walking.

4 Conclusion

By measuring the gaze and head movements of users, we revealed that the following effect induced by a real guide decreases the physical burden on a walking user. We also developed a navigation system using a virtual guide and evaluated the reduction in head movement.

As part of our future work, we will extend our analysis to long-term experiments with the proposed navigation system and develop a method that assists users in avoiding hazards.