
Development of Virtual Draping System by Augmented Reality

  • Shigeru Inui
  • Takuto Watahiki
  • Goro Karasawa
  • Yosuke Horiba
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 739)

Abstract

The purpose of this study is to virtualize draping, one of the methods used to design paper patterns for clothing. Making made-to-order clothing by the draping method takes time and is costly; if draping is virtualized, the process of making made-to-order clothing becomes much more efficient. In our study, the components of the method, such as the cloth, the hand, and the dress form, are modeled. The cloth is modeled mechanically, and its dynamic shape is calculated numerically. The hand is also modeled: a sensor detects the motion of the hand and fingers in the real world, the motion data are transferred to a computer, and the motion is reflected in the hand model. The dress form is modeled as well, as the basis onto which the cloth model is draped. Penetration of one element into another is prevented by detecting the collisions between the elements and reacting to them. The user of the system can manipulate the cloth model in the virtual world by moving a hand or fingers in the real world. We are constructing the system in a virtual environment into which an image of the real world is also projected, and we are trying to superimpose virtual objects on real objects to improve the workability of the system.
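The cloth dynamics, collision handling, and hand-driven manipulation summarized above can be illustrated with a minimal sketch. The snippet below is only an assumed, simplified formulation, not the one used in the paper: it uses an explicit mass-spring cloth, a sphere standing in for the dress form, and a single particle pinned to a sensed hand position; the grid size, stiffness, damping, time step, and sphere parameters are illustrative placeholders.

    # Minimal sketch (illustrative, not the authors' implementation):
    # explicit mass-spring cloth with sphere collision response and
    # one particle following a hand-tracked position.
    import numpy as np

    N = 20                      # cloth resolution: N x N particles (assumed)
    REST = 0.02                 # rest length between neighbouring particles (m)
    K, DAMP, MASS = 400.0, 0.02, 0.001
    GRAVITY = np.array([0.0, -9.81, 0.0])
    SPHERE_C, SPHERE_R = np.array([0.19, -0.25, 0.0]), 0.15   # dress-form proxy
    DT = 1.0 / 240.0

    # particle state: positions and velocities
    pos = np.array([[i * REST, 0.0, j * REST] for i in range(N) for j in range(N)])
    vel = np.zeros_like(pos)

    # structural springs between horizontally / vertically adjacent particles
    springs = [(i * N + j, i * N + j + 1) for i in range(N) for j in range(N - 1)]
    springs += [(i * N + j, (i + 1) * N + j) for i in range(N - 1) for j in range(N)]

    def step(grasped_idx, hand_pos):
        """Advance the cloth one time step; the grasped particle follows the hand."""
        force = np.tile(GRAVITY * MASS, (len(pos), 1))
        for a, b in springs:                       # Hooke's-law spring forces
            d = pos[b] - pos[a]
            length = np.linalg.norm(d)
            f = K * (length - REST) * d / max(length, 1e-9)
            force[a] += f
            force[b] -= f
        vel[:] = (1.0 - DAMP) * vel + DT * force / MASS
        pos[:] += DT * vel

        # collision reaction: project penetrating particles back to the sphere surface
        d = pos - SPHERE_C
        dist = np.linalg.norm(d, axis=1)
        inside = dist < SPHERE_R
        pos[inside] = SPHERE_C + d[inside] / dist[inside, None] * SPHERE_R
        vel[inside] = 0.0

        # hand manipulation: the sensed fingertip position overrides one particle
        pos[grasped_idx] = hand_pos
        vel[grasped_idx] = 0.0

    # example: drag a corner particle with a (hypothetical) sensed hand position
    step(grasped_idx=0, hand_pos=np.array([0.0, 0.05, 0.0]))

In a full system, the hand position would come each frame from the motion sensor, and the sphere would be replaced by the polygonal dress-form model, but the same detect-and-react collision idea applies.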

Keywords

Virtual draping · Cloth model · Finger motion sensor · Dress form model


Copyright information

© Springer Nature Singapore Pte Ltd. 2018

Authors and Affiliations

  • Shigeru Inui (1)
  • Takuto Watahiki (1)
  • Goro Karasawa (1)
  • Yosuke Horiba (1)
  1. Shinshu University, Ueda, Nagano, Japan