
On Natural Motion Editing by a Geometric Mean Filter

  • Jin Ok Kim
  • Chang Han Oh
  • Chin Hyun Chung
  • Jun Hwang
  • Woongjae Lee
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2668)

Abstract

Motion capture has recently become one of the most promising technologies in animation. Realistic motion data can be captured by recording the movement of a real actor with an optical or magnetic motion capture system, and motion libraries, which are archives of reusable motion clips, are commercially available. This paper deals with motion editing by a geometric mean filter. Because captured motion contains noise that causes jerky movement, a smoothing process is needed to make the motion look natural. A geometric mean filter is proposed to produce natural motion free of such jerkiness. Experimental results show that the geometric mean filter effectively removes the noise responsible for jerky motion and yields the most natural motion among the spatial filters compared. The method can be applied to fields such as real-time animation, virtual reality, and other 3D applications.
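As a rough illustration of the operation the abstract describes, the minimal sketch below applies a sliding geometric mean to a single positive-valued motion channel (for example, a joint-angle track). The function and variable names are hypothetical, the window size is an assumption, and the handling of orientation data as unit quaternions discussed in the paper is not shown.

```python
import numpy as np

def geometric_mean_filter(signal, window=5):
    """Smooth a 1-D motion channel with a sliding geometric mean.

    Each sample is replaced by the geometric mean of the samples in a
    centred window, computed in log space for numerical stability.
    Assumes strictly positive values; offset the data if necessary.
    """
    half = window // 2
    padded = np.pad(np.asarray(signal, dtype=float), half, mode="edge")
    log_vals = np.log(padded)
    smoothed = np.empty(len(signal), dtype=float)
    for i in range(len(signal)):
        # Geometric mean of the window = exp(mean of the logs).
        smoothed[i] = np.exp(log_vals[i:i + window].mean())
    return smoothed

# Hypothetical noisy joint-angle track (degrees), strictly positive.
t = np.linspace(0.0, 2.0 * np.pi, 200)
noisy_angles = 45.0 + 10.0 * np.sin(t) + np.random.normal(0.0, 1.5, t.size)
smooth_angles = geometric_mean_filter(noisy_angles, window=5)
```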

Keywords

Motion capture, natural motion, unit quaternion, virtual human, virtual reality application



Copyright information

© Springer-Verlag Berlin Heidelberg 2003

Authors and Affiliations

  • Jin Ok Kim (1)
  • Chang Han Oh (2)
  • Chin Hyun Chung (2)
  • Jun Hwang (3)
  • Woongjae Lee (3)

  1. School of Information and Communication Engineering, Sungkyunkwan University, Suwon, Kyunggi-do, Korea
  2. Department of Information and Control Engineering, Kwangwoon University, Nowon-gu, Seoul, Korea
  3. Division of Information and Communication Engineering, Seoul Women's University, Nowon-gu, Seoul, Korea
