Abstract
The future of human space exploration will involve extra-vehicular activities (EVA) on foreign planetary surfaces (e.g., Mars), an activity with significantly different characteristics than common exploration scenarios on Earth. The required use of a bulky, pressurized EVA suit perceptually disconnects human explorers from the hostile environment, increasing the navigation workload and risk of collision associated with traversing unfamiliar, rocky terrain. To assist the explorer in such tasks, multi-modal information presentation devices are being designed and evaluated. One application is to assist astronauts in ground obstacle avoidance via tactile channels of the feet. Before utilizing these signals as a form of information presentation, it is necessary to first characterize the tactile perception capabilities of the feet for selected vibration locations and signal types, in particular during distracted attention states. The perception of tactile signals must be robust under various cognitive loads, as the user will be involved in multiple tasks. In the current study, participants completed a vibrotactile detection task with independent variables of attention state, vibration location and vibration signal type. Tactile cues were provided using haptic motor vibrations at six different locations on each foot for four different vibration levels (High, Low, Increasing and Decreasing), resulting in 24 unique vibrations per foot. Each treatment was repeated six times per attention state, and vibrations were presented randomly within a time window of 2–7 s. After each trial, participants indicated the location and level of the vibration perceived. Accuracy of response was analyzed across conditions, and the results provide implications for the presentation of tactile information on the feet under varying attention states.
1 Introduction
Manned missions to Mars and the Moon will involve space and surface operations that impose much higher risk and workload on astronauts than similar activities on Earth. The most complex of these operations are those involving Extra-Vehicular Activity (EVA), which occurs when astronauts exit the protective environment of the spacecraft and enter the vacuum of space or a thin atmosphere of another planet while wearing a bulky spacesuit. These activities become challenging due to restricted visual cues, the absence of auditory information, and the restrictions placed on somatosensory and proprioceptive feedback from altered gravity and the pressurized suit [1].
Surface EVA operations will include assembly and construction of structures, geologic exploration and protective shelter excavation [2]. Such activities come with an inherent risk of injury or damage to life-support equipment (i.e. the suit), since trips and falls are likely to occur on unfamiliar, rocky terrain. Due to the physical nature of these tasks in combination with the restricted time, resources, and perceptual capabilities of crewmembers, there is a critical need to design multimodal interfaces for optimizing task performance and minimizing risks in such conditions [2]. Of particular interest is an information presentation device that can aid in obstacle avoidance during surface exploration and way-finding [3]. To assist astronauts in safely navigating to another crewmember, shelter or rover while traversing rugged terrain, multi-modal information presentation devices are being designed and evaluated [3]. While a large body of research across different applications integrates information via auditory and visual channels, tactile channels for information mapping have not yet been thoroughly characterized, and could be of great use in intuitively conveying alerts about surface features, inclination, and obstacles in a path. Srikulwong and O'Neill [4] highlighted demonstrations of tactile displays that can aid in navigation, target detection, and overall situation awareness in operational settings.
Since haptic technologies are relatively new and their applications in mapping information to the human feet in particular are not well studied, it is not clear what set of magnitudes, frequencies, or locations would be best when using vibrotactile signals to convey information. Compared to visual and auditory channels, the perception thresholds and cognitive interactions for tactile signals are less understood [5]. Vibrotactile displays for alerts, body-part orientation, and directional navigation tasks have been implemented successfully on the arms [6–10], shoulder [11], waist [12–16] and through a body suit [17], but there have been very few applications utilizing the feet. One study successfully demonstrated the ability of a sandal-like vibration interface to promote and maintain a specific walking pace [18] and another used vibrating toe rings to signal direction changes while walking towards a preset destination [19], but none thus far have used tactile signals on the feet for aiding in obstacle avoidance while walking through rough and unknown terrain. Previous vibrotactile navigation studies have used haptic signals to command direction changes (right/left) or inform approximate distances to a destination, but none have tried to convey information about small obstacles directly in one's walking path (e.g. a rock that needs to be stepped over or moved around), which may require increased information presentation on this sensory channel.
Before vibrotactile signals can be implemented in a multi-modal navigational aid, it is necessary to first understand the tactile perception capabilities of the feet for the locations and signal types under consideration, in particular during states of divided attention. Detection of the tactile signal must be robust enough to withstand various cognitive loads, since practical use of such an interface would undoubtedly occur while the user is multi-tasking. Load theory [20–22] suggests that perceptual and cognitive demands (or loads) have a limited capacity beyond which selective attention can fail, negatively affecting sensory perception or cognitive performance. Fitousi and Wenger [23] emphasize that research attributes selective attention and performance failures to such mental capacity limitations. In the case of an astronaut under high cognitive load from multi-tasking, it is therefore imperative that the vibrotactile stimuli carrying critical information be salient enough to be perceived during narrowed attentional focus.
The sensory systems of the feet enable sufficient vibrotactile perception using off-the-shelf tactors [24]. To better understand how to incorporate tactile signals for robust signal detection in the assistive device of interest, the current study examined four types of vibrations at six different locations per foot under varying attention loads. Independent variables consisted of vibration signal type (High, Low, Increase, Decrease), location (1–6), attention state (Focused or Distracted), foot (Right or Left) and order of attention condition assignment. The dependent variables were perceived location accuracy (ability to detect vibration at a specific location), and perceived vibration type accuracy (ability to detect the type of vibration signal). It was hypothesized that attentional load would negatively affect perception accuracy, and that certain locations and vibration types may be more detectable than others.
2 Methods
2.1 Participants
The participants consisted of ten healthy adults (3 females, 7 males) between the ages of 19 and 27 (M = 23.3, SD = 2.4). The experimental protocol was approved by the MIT Committee on the Use of Humans as Experimental Subjects (COUHES) and all participants provided written consent. Participants were excluded from the study if they reported irregularity or abnormalities with tactile perception on the feet or any injuries to the lower extremities. Participants were mostly right-handed (8 of 10), while one was left-handed and one was ambidextrous.
2.2 Materials
A custom haptic display was developed that applied four kinds of vibrations at six locations on each foot: one on the tip of the big toe, three on the lateral side, one on the back of the heel, and one in the center of the medial side of the foot (Fig. 1). Vibrations were created by small haptic motors (Vibrating Mini Motor Disc, Adafruit, New York City, NY) of 10 mm diameter and 2.7 mm thickness. All vibrations were 1.5 s in duration and consisted of one of four vibration levels at an amplitude of roughly 0.8–1.2 g: High (11000 RPM), Low (2750 RPM), Increase and Decrease. The Increase vibration level went from the Low level to the High level and the Decrease vibration level did the opposite. The haptic motors were each controlled by an individual driver board (DRV2605, Texas Instruments, Dallas, TX) that received input in the form of a Pulse Width Modulation (PWM) signal from a microcontroller (Arduino UNO, Arduino, Ivrea, Italy). The High vibration level had a 100 % duty cycle at 5 V while the Low vibration level had a 25 % duty cycle at 5 V. The haptic motors were placed on participants' feet with double-sided tape and reinforced with athletic wrap for the duration of the experiment. Participants used a custom graphical user interface (GUI) that commanded the motors via serial ports and recorded participant responses for each trial.
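The four vibration levels above can be expressed as duty-cycle profiles over the 1.5 s stimulus. A minimal Python sketch, assuming a 15-step control loop and a linear ramp for the dynamic levels (the actual driver commands and ramp shape were not published, so the function and its parameters are illustrative):

```python
def vibration_profile(level, steps=15):
    """Return a list of PWM duty-cycle fractions (0.0-1.0), one per control
    step across the 1.5 s vibration.  Level names follow the paper:
    High = 100 % duty cycle, Low = 25 %, Increase ramps Low -> High,
    Decrease ramps High -> Low.  Linear ramping is an assumption."""
    high, low = 1.00, 0.25
    if level == "High":
        return [high] * steps
    if level == "Low":
        return [low] * steps
    if level == "Increase":
        return [low + (high - low) * i / (steps - 1) for i in range(steps)]
    if level == "Decrease":
        return [high - (high - low) * i / (steps - 1) for i in range(steps)]
    raise ValueError(f"unknown level: {level}")
```

Each duty-cycle fraction would be scaled to the microcontroller's PWM range before being sent to the driver board.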
2.3 Experimental Protocol
Participants completed the experiment while in a focused state of attention (Focused condition) and in a distracted state of attention (Distracted condition), where the order of these conditions was counterbalanced (i.e. participants completed the trials for one or the other first). During the Focused trials, participants were instructed to focus on their feet and pay close attention to the vibration sensations. For the Distracted trials, participants were presented a random number between 0 and 100 at the beginning of each trial and instructed to count up from that number in increments of three until they felt the vibration. Four vibration levels at six locations per foot yield 48 unique vibration combinations, and participants experienced each combination for six trials during each attention condition, totaling 576 trials overall. The 288 trials for each attention condition (Focused or Distracted) were randomized in a pre-determined order. The numbers used in the Distracted trials were the same for all participants to maintain the total difficulty of each trial (e.g. some numbers may be harder to count up from). To ensure that participants could not predict when the vibration would occur during each trial, each vibration took place at a random point in time between 2 and 7 s after pressing the "Next Trial" button on the GUI. Once a vibration was felt, participants selected the location on the foot where they perceived the vibration and then selected the type of vibration felt. Each response was recorded and deemed correct or incorrect, but participants did not receive this feedback. If a participant did not perceive a vibration, they were instructed to select a button labeled "I didn't feel it", and that trial's response was recorded as undetected.
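The randomization described above can be sketched in Python. The seed, tuple layout, and function name are assumptions for illustration; only the counts (2 feet × 6 locations × 4 levels = 48 combinations, 6 repetitions = 288 trials per condition) and the 2–7 s onset window come from the protocol:

```python
import itertools
import random

def build_trial_schedule(seed=42):
    """Build one attention condition's 288-trial schedule: each of the 48
    unique foot/location/level combinations repeated six times, shuffled
    into a fixed pre-determined order.  Each trial also receives a random
    vibration onset between 2 and 7 s after the "Next Trial" press."""
    rng = random.Random(seed)  # fixed seed -> same order for every run
    combos = list(itertools.product(
        ["Left", "Right"],                     # foot
        range(1, 7),                           # location 1-6
        ["High", "Low", "Increase", "Decrease"]))  # vibration level
    trials = combos * 6                        # 6 repetitions -> 288 trials
    rng.shuffle(trials)
    return [(foot, loc, level, rng.uniform(2.0, 7.0))
            for foot, loc, level in trials]
```

Because the generator is seeded, every participant would see the same pre-determined trial order, matching the protocol's use of identical stimuli across participants.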
3 Results
3.1 Data Analysis
The location responses were used to calculate perceived location accuracy scores, while the vibration type responses were used to calculate perceived type accuracy scores. As a result, each participant had two separate accuracy scores for each of the 48 unique vibration combinations. In both cases, accuracy was calculated by dividing the number of correct responses by the total number of trials for that combination (six). An undetected trial counted as an incorrect response in these calculations. A repeated-measures ANOVA was performed on each of the two accuracy scores, with factors of attention state (Focused or Distracted), order of attention state, foot (Right or Left), location (1–6) and vibration type (High, Low, Increase, Decrease). The significance level was \(\alpha = 0.05\).
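As a sketch of the scoring above (the helper name and response encoding are assumptions, but the rule that undetected trials count as incorrect follows the analysis described):

```python
def accuracy(responses):
    """Accuracy score for one participant and one of the 48 unique
    vibration combinations: correct responses divided by the six trials
    for that combination, with 'undetected' trials counted as incorrect.
    `responses` is a list of 'correct', 'incorrect', or 'undetected'."""
    assert len(responses) == 6, "six trials per combination per condition"
    return sum(r == "correct" for r in responses) / len(responses)
```

The same function applies to both location responses and type responses, yielding the two per-combination scores entered into the repeated-measures ANOVAs.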
3.2 Perceived Location Accuracy
Results showed no statistically significant effects on perceived location accuracy (\(p>0.05\) for all main effects and interactions). Participants performed well in discriminating the locations where vibrations occurred (Fig. 2). The most common errors were confusing location 3 with neighboring locations or not detecting vibrations at location 3 (specifically during Low vibrations), although these errors were highly participant dependent, which explains the large standard error.
3.3 Perceived Type Accuracy
Results showed significant main effects of Attention, Type, and several interactions between Location, Type and Foot (Table 1). Participants had higher average accuracy in discriminating vibration type during Focused trials (M = 79.82 %, SE = 3.35 %) compared to Distracted trials (M = 70.76 %, SE = 4.47 %). Accuracy was lower for dynamic vibrations (Increase and Decrease) compared to the static vibrations (High and Low), yet these differences are less consistent across locations on the Right foot (Fig. 3).
3.4 Undetected Vibrations
Undetected vibrations were concentrated at certain locations and occurred mostly during the Low vibration type (Table 2). In addition, the undetected responses were unevenly distributed across participants. It is important to take these data into consideration when examining the statistical interactions between Type, Location and Foot.
4 Discussion and Conclusions
The current study examined the effect of attention load, foot, location and vibration type on haptic foot perception. It was hypothesized that distracted attention states would decrease perception accuracy due to limited attentional resources, and that certain vibration types would be easier to detect than others. The experimental data support a decrease in perceived vibration type accuracy due to attentional state, and show that certain locations and vibration types are more accurately perceived than others. Overall, these results provide useful implications for vibrotactile interface design for the feet.
Results show that tactile perception is degraded during distracted states of attention. This has implications when designing interfaces that map critical information to sensory signals. Detection of time critical signals must be robust enough to eliminate ambiguity and human-device incompatibility risks. While attention state affected perceived vibration type accuracy in the current study, it did not significantly affect perceived location accuracy. Therefore, when designing haptic information presentation devices, it may be more reliable to use location to convey critical information while other details in the signal can provide additional sub-critical information.
With average perceived location accuracy scores close to 100 % for most foot locations, vibrations at these locations are promising for applications in tactile displays. However, limitations exist with the locations on the lateral side of the foot. For certain individuals, sensation in this area may not have fine enough location resolution, and the lateral longitudinal arch (Location 3) may be the least sensitive, as a few participants often confused it with neighboring Locations 2 or 4 on both feet. The subject-specificity of haptic thresholds on other foot locations has been observed in previous research [25]. It is also important to consider that Low vibrations at this location on the Right foot were quite often undetected by most participants, which contributes to the lower average perceived location accuracy for Location 3 on the Right foot. Going forward, it may be favorable to include only one or two locations on the lateral side of the foot for vibrotactile inputs (e.g. Locations 2 and/or 4).
User abilities in perceived vibration type accuracy vary greatly by type, foot and location (Fig. 3). Participants reported occasional difficulty in distinguishing High vibrations from the dynamic vibrations (Increase and Decrease), which is observed in their trial responses. Participants stated that they usually selected High during these moments of confusion, which is consistent with the lower accuracy for dynamic vibrations, with Decrease types being the hardest to distinguish. Participants reported that when a vibration started at the High level and then decreased, this change was harder to detect than for an increasing vibration, which could be due to the cutaneous sensation of the High level overpowering the subsequent lower levels. These results suggest that decreasing vibration signals are not adequately detectable and should not be used for critical information presentation. It is hypothesized that if the Decrease type were removed from the current study, perceived type accuracy for High vibrations may have been higher than for Low vibrations, especially since High levels were rarely undetected. Regarding the Increase vibration type, distinguishing these from High types may be easier than is the case for Decrease types, but there is still insufficient evidence that it is reliable enough to convey critical information in a haptic interface application.
Vibrations were rarely undetected except for those of type Low at Location 3 on the Right foot, which had an unusually high occurrence (about 20 % of all R3-Low trials). It is unclear why the undetected signals occurred only for the Right foot. The vast majority of participants were right-handed, so it is possible that the lateral side of the dominant foot is less sensitive and has a higher tactile detection threshold, but this is speculation. This phenomenon could also be device related (e.g. the motor at this location malfunctioned during certain Low vibrations), but that is unlikely since there were no detection issues with other vibration types at this location. The Low, Increase, and Decrease vibration types on the Right foot appear to have more variability in perceived type accuracy across locations, suggesting that the perception capabilities of the Right and Left feet do differ in some ways.
Overall, results demonstrate that the haptic perception capabilities of the feet for the selected locations and vibration types are sufficient for use in a vibrotactile interface. High and Low vibrations were successfully perceived at most of the locations studied, and will be implemented in future studies. Careful consideration should be taken when utilizing quickly increasing/decreasing vibrations or locations on the lateral side of the foot. Most importantly, signal perception should be robust enough to withstand attentional loading, so haptic signal location should convey the critical information while more subtle signal properties can supplement with less critical information. Future work should examine alternate dynamic patterns, such as pulsing vibrations, as well as how tactile foot perception is affected by body motion (e.g. walking or running). The results of the current study will guide the design of a multi-modal device for obstacle avoidance, where sensory reinforcement via visual channels may be incorporated.
While the current application of interest involves a wearable interface for obstacle avoidance, the growing range of computing devices, computational power and input/output capabilities opens doors for numerous other applications in human-computer interaction [5]. The integration of haptic communications in technology presents novel applications in teaching/training, telerobotics, entertainment and gaming [26].
References
NASA: Man-Systems Integration Standards, NASA-STD-3000. National Aeronautics and Space Administration, Houston, USA (1995)
Godfroy, M., Wenzel, E.M.: Human dimensions in multimodal wearable virtual simulators for extra vehicular activities. In: Proceedings of the NATO Workshop on Human Dimensions in Embedded Virtual Simulation, Orlando, FL (2009)
Holden, K., Ezer, N., Vos, G.: Evidence report: risk of inadequate human-computer interaction. NASA Human Research Program: Space Human Factors and Habitability (2013)
Srikulwong, M., O’Neill, E.: A comparative study of tactile representation techniques for landmarks on a wearable device. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 2029–2038. ACM (2011)
Sebe, N., Jaimes, A.: Multimodal human-computer interaction: a survey. Comput. Vis. Image Underst. 108, 116–134 (2007)
Scheggi, S., Morbidi, F., Prattichizzo, D.: Human-robot formation control via visual and vibrotactile haptic feedback. IEEE Trans. Haptics 7(4), 499–511 (2014)
Sergi, F., Accoto, D., Campolo, D., Guglielmelli, E.: Forearm orientation guidance with a vibrotactile feedback bracelet: on the directionality of tactile motor communication. In: 2008 2nd IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics, BioRob 2008, pp. 433–438. IEEE (2008)
Matscheko, M., Ferscha, A., Riener, A., Lehner, M.: Tactor placement in wrist worn wearables. In: 2010 International Symposium on Wearable Computers (ISWC), pp. 1–8. IEEE (2010)
Guo, W., Ni, W., Chen, I., Ding, Z.Q., Yeo, S.H., et al.: Intuitive vibro-tactile feedback for human body movement guidance. In: 2009 IEEE International Conference on Robotics and Biomimetics (ROBIO), pp. 135–140. IEEE (2009)
Stanley, A.A., Kuchenbecker, K.J.: Evaluation of tactile feedback methods for wrist rotation guidance. IEEE Trans. Haptics 5(3), 240–251 (2012)
Bosman, S., Groenendaal, B., Findlater, J.-W., Visser, T., de Graaf, M., Markopoulos, P.: GentleGuide: an exploration of haptic output for indoors pedestrian guidance. In: Chittaro, L. (ed.) Mobile HCI 2003. LNCS, vol. 2795, pp. 358–362. Springer, Heidelberg (2003)
Van Erp, J.B.F., Van Veen, H.A.H.C., Jansen, C., Dobbins, T.: Waypoint navigation with a vibrotactile waist belt. ACM Trans. Appl. Percept. (TAP) 2(2), 106–117 (2005)
Tsukada, K., Yasumura, M.: ActiveBelt: belt-type wearable tactile display for directional navigation. In: Mynatt, E.D., Siio, I. (eds.) UbiComp 2004. LNCS, vol. 3205, pp. 384–399. Springer, Heidelberg (2004)
Srikulwong, M., O’Neill, E.: Wearable tactile display of directions for pedestrian navigation: comparative lab and field evaluations. In: 2013 World Haptics Conference (WHC), pp. 503–508. IEEE (2013)
Flores, G., Kurniawan, S., Manduchi, R., Martinson, E., Morales, L.M., Sisbot, E.A.: Vibrotactile guidance for wayfinding of blind walkers. IEEE Trans. Haptics 8(3), 306–317 (2015)
Lee, B.-C., Martin, B.J., Sienko, K.H.: Comparison of non-volitional postural responses induced by two types of torso based vibrotactile stimulations. In: 2012 IEEE Haptics Symposium (HAPTICS), pp. 195–198. IEEE (2012)
Lieberman, J., Breazeal, C.: TIKL: development of a wearable vibrotactile feedback suit for improved human motor learning. IEEE Trans. Robot. 23(5), 919–926 (2007)
Watanabe, J., Ando, H.: Pace-sync shoes: intuitive walking-pace guidance based on cyclic vibro-tactile stimulation for the foot. Virtual Reality 14(3), 213–219 (2010)
IDEO: Technojewelry (2001). https://www.ideo.com/work/technojewelry
Lavie, N.: Distracted and confused?: selective attention under load. Trends Cogn. Sci. 9(2), 75–82 (2005)
Lavie, N.: Perceptual load as a necessary condition for selective attention. J. Exp. Psychol. Hum. Percept. Perform. 21(3), 451 (1995)
Lavie, N.: Selective attention, cognitive control: dissociating attentional functions through different types of load. Attention Perform. XVIII, 175–194 (2000)
Fitousi, D., Wenger, M.J.: Processing capacity under perceptual and cognitive load: a closer look at load theory. J. Exp. Psychol. Hum. Percept. Perform. 37(3), 781 (2011)
Trulsson, M.: Mechanoreceptive afferents in the human sural nerve. Exp. Brain Res. 137(1), 111–116 (2001)
Priplata, A., Niemi, J., Salen, M., Harry, J., Lipsitz, L.A., Collins, J.J.: Noise-enhanced human balance control. Phys. Rev. Lett. 89(23), 238101 (2002)
Steinbach, E., Hirche, S., Ernst, M., Brandi, F., Chaudhari, R., Kammerl, J., Vittorias, I.: Haptic communications. Proc. IEEE 100, 937–955 (2012)
© 2016 Springer International Publishing Switzerland
Gibson, A., Webb, A., Stirling, L. (2016). User Abilities in Detecting Vibrotactile Signals on the Feet Under Varying Attention Loads. In: Schmorrow, D., Fidopiastis, C. (eds) Foundations of Augmented Cognition: Neuroergonomics and Operational Neuroscience. AC 2016. Lecture Notes in Computer Science(), vol 9743. Springer, Cham. https://doi.org/10.1007/978-3-319-39955-3_30