Vibration and Subsequent Collision Simulation of Finger and Object for Haptic Rendering

  • Shoichi Hasegawa
  • Yukinobu Takehana
  • Alfonso Balandra
  • Hironori Mitake
  • Katsuhito Akahane
  • Makoto Sato
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8619)

Abstract

Humans can discriminate an object's material [5, 7, 9] and the position of a tap [8] by perceiving the vibrations produced by tapping. Susa et al. [4] proposed simulating the natural vibration of an object in order to present arbitrarily structured objects. However, their method does not simulate the vibration of the tapping finger or the subsequent collisions between the finger and the object.
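For context, vibration-based tap rendering (e.g. [5, 7]) is often summarized as playing back a velocity-scaled sum of exponentially decaying sinusoids when contact is detected. The sketch below illustrates only that general idea; it is not Susa et al.'s finite element implementation, and the modal parameters and the `tap_force` helper are hypothetical placeholders.

```python
# Illustrative sketch (not Susa et al.'s FEM method): synthesize the object's
# tap transient as a velocity-scaled sum of decaying sinusoids.
import math

# Hypothetical modal parameters: (frequency [Hz], decay rate [1/s], gain).
modes = [(300.0, 40.0, 1.0), (750.0, 90.0, 0.4), (1800.0, 200.0, 0.15)]

def tap_force(t, impact_velocity):
    """Transient force [N] at time t [s] after a tap with the given velocity [m/s]."""
    return impact_velocity * sum(
        g * math.exp(-d * t) * math.sin(2.0 * math.pi * f * t)
        for f, d, g in modes
    )

# Sample the first 10 ms of the transient at a 10 kHz haptic update rate.
samples = [tap_force(i / 10000.0, impact_velocity=0.3) for i in range(100)]
```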

This paper proposes a simulation model for tapping that considers the finger's vibratory motion and the subsequent collisions between the finger and the object. Experimental results show that the proposed method renders realistic event-based forces, including the impact impulse, decaying vibration waves, and subsequent collisions.
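The paper's own model is not reproduced here, but as a rough conceptual sketch of what "finger vibration plus subsequent collisions" means, one can treat the finger pad as a damped one-degree-of-freedom oscillator that rebounds from a rigid surface. Every parameter below (mass, stiffness, damping, restitution, rest offset) is an assumed placeholder, not a value from the paper.

```python
# Conceptual sketch only: a 1-DOF damped oscillator stands in for the finger
# pad; a rigid surface at x = 0 produces the initial impact and any
# subsequent collisions as the pad keeps vibrating.
import math

m = 0.01      # assumed effective pad mass [kg]
k = 2000.0    # assumed pad stiffness [N/m]
c = 1.0       # assumed pad damping [N*s/m]
e = 0.3       # assumed restitution coefficient for finger-surface contact
x_rest = 0.002   # assumed rest position of the pad above the surface [m]

dt = 1e-4            # simulation step [s]
x, v = 0.005, -0.2   # start 5 mm above the surface, moving toward it

contacts = []        # times of each collision event
for step in range(2000):
    t = step * dt
    # Damped free vibration of the pad about its rest position.
    a = (-k * (x - x_rest) - c * v) / m
    v += a * dt
    x += v * dt
    # Collision with the rigid surface at x = 0.
    if x < 0.0 and v < 0.0:
        contacts.append(t)
        x = 0.0
        v = -e * v   # reflect velocity; a haptic device would be driven
                     # with the impulse and the decaying wave at this event

print("collision events at t =", contacts)
```

Running the loop records not only the first impact but also the later, weaker contacts that occur while the pad is still vibrating, which is the kind of event sequence the abstract refers to as subsequent collisions.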

Keywords

Vibration haptic rendering · Simulation model for tapping · Subsequent collision

References

  1. Akahane, K., Hasegawa, S., Koike, Y., Sato, M.: A proposal of a high definition haptic rendering for stability and fidelity. In: ICAT 2006, pp. 162–167, November 2006
  2. Bensmaïa, S.J., Hollins, M.: Complex tactile waveform discrimination. J. Acoust. Soc. Am. 108(3), 1236–1245 (2000)
  3. Ikeda, Y., Hasegawa, S.: Short paper: characteristics of perception of stiffness by varied tapping velocity and penetration in using event-based haptics. In: Joint Virtual Reality Conference EGVE-ICAT-EuroVR, pp. 113–116 (2009)
  4. Susa, I., Takehana, Y., Balandra, A., Mitake, H., Hasegawa, S.: Haptic rendering based on finite element simulation of vibration. In: IEEE Haptics Symposium (2014)
  5. Kuchenbecker, K.J., Fiene, J., Niemeyer, G.: Improving contact realism through event-based haptic feedback. IEEE Trans. Vis. Comput. Graph. 12(2), 219–230 (2006)
  6. Okamura, A.M., Cutkosky, M.R., Dennerlein, J.T.: Reality-based models for vibration feedback in virtual environments. IEEE/ASME Trans. Mechatron. 6(3), 245–252 (2001)
  7. Okamura, A.M., Dennerlein, J.T., Howe, R.D.: Vibration feedback models for virtual environments. In: Proceedings of the 1998 IEEE International Conference on Robotics and Automation, pp. 674–679, May 1998
  8. Sreng, J., Lécuyer, A., Andriot, C.: Using vibration patterns to provide impact position information in haptic manipulation of virtual objects. In: Ferre, M. (ed.) EuroHaptics 2008. LNCS, vol. 5024, pp. 589–598. Springer, Heidelberg (2008)
  9. Wellman, P., Howe, R.D.: Towards realistic vibrotactile display in virtual environments. In: Alberts, T. (ed.) Proceedings of the ASME Dynamic Systems and Control Division, Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, p. 57-2 (1995)
  10. Zilles, C.B., Salisbury, J.K.: A constraint-based god-object method for haptic display. In: Proceedings of the 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems, 'Human Robot Interaction and Cooperative Robots', vol. 3, pp. 146–151 (1995)

Copyright information

© Springer-Verlag Berlin Heidelberg 2014

Authors and Affiliations

  • Shoichi Hasegawa (1) – corresponding author
  • Yukinobu Takehana (1)
  • Alfonso Balandra (1)
  • Hironori Mitake (1)
  • Katsuhito Akahane (1)
  • Makoto Sato (1)

  1. Tokyo Institute of Technology, Yokohama, Japan