Developing Grasping Pre-Shaping in Virtual Environment based on Real Object Shape Analysis

  • Alexandru Itu
  • Andreea Beraru
  • Ionel Staretu

Research conducted in neuropsychology has shown that a prehensile movement can be decomposed into three main stages (figure 1.1): (1) visual location of the target, including object recognition, orientation determination, and shape and size analysis from visual information; (2) reaching toward the target, consisting of a ballistic movement, during which the hand preshapes according to the shape and size of the object, followed by fine movements near the target; (3) grasping, corresponding to the finger adjustments against the target [10], [6]. In virtual environments we deal with the second and third stages: preshaping and virtual grasping. For virtual grasping to be completed, the virtual gripper must precisely match the stereoscopic representation of the object.
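The three-stage decomposition above can be sketched as a simple phase machine for a virtual grasping pipeline. This is only an illustrative sketch: the phase names and transition conditions are our own assumptions, not part of the original system.

```python
from enum import Enum, auto

class Phase(Enum):
    LOCATE = auto()  # stage 1: recognition, orientation, shape/size analysis
    REACH = auto()   # stage 2: ballistic transport while the hand preshapes
    GRASP = auto()   # stage 3: fine finger adjustments against the target

def advance(phase: Phase, target_analyzed: bool, near_target: bool) -> Phase:
    """Move to the next prehension stage once its entry condition holds
    (the boolean conditions are hypothetical placeholders)."""
    if phase is Phase.LOCATE and target_analyzed:
        return Phase.REACH
    if phase is Phase.REACH and near_target:
        return Phase.GRASP
    return phase

# Walking through one complete prehensile movement:
p = Phase.LOCATE
p = advance(p, target_analyzed=True, near_target=False)  # -> REACH
p = advance(p, target_analyzed=True, near_target=True)   # -> GRASP
print(p.name)  # GRASP
```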

In the real world, one of the major unsolved problems of dextrous manipulation is the determination of an appropriate preshape for grasping a given object under a given manipulation task. For a grasp to be executed successfully, the system must first prepare it successfully during the preshaping phase; grasp preparation, or preshaping, is therefore one of the critical issues for manipulation with a dextrous robot hand [1]. Hand preshaping has usually been described by terms such as hand aperture [2] and hand orientation [4]. The term ‘hand aperture’ describes hand opening during the approach phase of grasping and is defined as the distance between the thumb and the index finger. The advantage of such a description is its simplicity. Although the thumb and index finger probably play an important role in grasping, most grasping modes require the cooperation of all five digits, not only these two. A two-digit definition of hand aperture therefore unjustifiably ignores the influence of the other fingers, but it remains useful because it can be refined using the concept of virtual fingers [9]. A virtual finger corresponds to a set of real fingers moving together (when grasping a glass there are two virtual fingers: one is the thumb, and the other consists of the four remaining fingers). Using the concept of virtual fingers, the hand aperture is the distance between the thumb and the virtual finger.
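The refined aperture measure described above can be computed directly from fingertip positions. The sketch below treats the virtual finger as the centroid of the cooperating fingertips; the function names and the centroid choice are our own assumptions for illustration, not taken from the cited works.

```python
import math

def centroid(points):
    """Centroid of a list of 3D fingertip positions."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))

def hand_aperture(thumb_tip, virtual_finger_tips):
    """Hand aperture refined with the virtual-finger concept:
    the distance between the thumb tip and the virtual finger,
    here represented by the centroid of the cooperating fingertips."""
    vf = centroid(virtual_finger_tips)
    return math.dist(thumb_tip, vf)

# Example (metres): thumb at the origin, four fingertips of the
# opposing virtual finger spread around x = 0.08.
thumb = (0.0, 0.0, 0.0)
others = [(0.08, -0.03, 0.0), (0.08, -0.01, 0.0),
          (0.08, 0.01, 0.0), (0.08, 0.03, 0.0)]
print(round(hand_aperture(thumb, others), 3))  # 0.08
```

With a single index fingertip in place of the four-finger set, the same function reduces to the classical thumb-index aperture.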


Keywords: Virtual Environment, Virtual Object, Gripper Aperture, Stereoscopic Image, Virtual Finger




References

  1. Baysal CV, Erkmen AM (2004) Preshape induced grasp controller for dextrous manipulation with a multifingered robot hand. In: Proceedings of the IEEE International Conference on Mechatronics, 3-5 June 2004, pp 78-83
  2. Bard C, Troccaz J, Vercelli G (1991) Shape analysis and hand preshaping for grasping. In: IEEE/RSJ International Workshop on Intelligent Robots and Systems, 3-5 Nov 1991, Japan, pp 64-69
  3. Cruz-Neira C, Sandin DJ, DeFanti TA (1993) Surround-screen projection-based virtual reality: the design and implementation of the CAVE. In: Proceedings of the 20th Annual Conference on Computer Graphics and Interactive Techniques
  4. Wren D, Fisher RB (1995) Dextrous hand grasping strategies using preshapes and digit trajectories. In: IEEE International Conference on Systems, Man and Cybernetics
  5. Haggard P, Wing AM (1998) Coordination of hand aperture with the spatial path of hand transport. Exp Brain Res 118:286-292
  6. Napier JR (1956) The prehensile movements of the human hand. J Bone Joint Surg 38B:902-913
  7. Jeannerod M (1981) Intersegmental coordination during reaching at natural objects. In: Long J, Baddeley AD (eds) Attention and Performance, vol IX. Erlbaum, Hillsdale, pp 153-169
  8. Cao LL, Liu JZ, Tang X (2006) 3D object retrieval using 2D line drawing and graph based relevance feedback. In: Proceedings of the ACM International Conference on Multimedia, Oct 2006
  9. Lyons D (1985) A simple set of grasps for a dextrous hand. In: Proceedings of the IEEE International Conference on Robotics and Automation
  10. Winges SA, Weber DJ, Santello M (2003) The role of vision on hand preshaping during reach to grasp. Exp Brain Res 152:489-498
  11. Supuk T, Kodek T, Bajd T (2005) Estimation of hand preshaping during human grasping. Med Eng Phys 27:790-797
  12. Nguyen VD (1987) Constructing stable grasps in 3D. In: IEEE International Conference on Robotics and Automation, vol 1

Copyright information

© Springer Science + Business Media B.V. 2008

Authors and Affiliations

  • Alexandru Itu (1)
  • Andreea Beraru (1)
  • Ionel Staretu (1)

  1. Product Design and Robotics Department, Transilvania University of Brasov, Romania
