• Title/Summary/Keyword: Motion Capture Animation

Detecting Collisions in Graph-Driven Motion Synthesis for Crowd Simulation (군중 시뮬레이션을 위한 그래프기반 모션합성에서의 충돌감지)

  • Sung, Man-Kyu
    • Journal of KIISE: Computer Systems and Theory / v.35 no.1 / pp.44-52 / 2008
  • In this paper we consider detecting collisions between characters whose motion is specified by motion capture data. Since we target massive crowd simulation, we only consider rough collisions, modeling each character as a disk on the floor plane. To provide efficient collision detection, we introduce a hierarchical bounding volume, the Motion Oriented Bounding Box tree (MOBB tree). A MOBB tree stores the space-time bounds of a motion clip. In crowd animation tests, MOBB trees yield performance improvements ranging from a factor of two to an order of magnitude. A minimal sketch of the disk-based space-time test appears below.
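The disk-on-floor-plane test and the space-time bounds described above can be illustrated with a minimal Python sketch. The names (SpaceTimeBound, bound_of, detect_collisions) are hypothetical, and the single flat bound stands in for one node of the paper's hierarchical MOBB tree rather than reproducing it.

```python
from dataclasses import dataclass

@dataclass
class SpaceTimeBound:
    """Axis-aligned bound of a motion clip: spatial extent on the floor
    plane (x/z) plus the time interval the clip covers."""
    xmin: float
    xmax: float
    zmin: float
    zmax: float
    tmin: float
    tmax: float

    def overlaps(self, other: "SpaceTimeBound") -> bool:
        # Two clips can only collide if their bounds intersect in space AND time.
        return (self.xmin <= other.xmax and other.xmin <= self.xmax and
                self.zmin <= other.zmax and other.zmin <= self.zmax and
                self.tmin <= other.tmax and other.tmin <= self.tmax)

def bound_of(clip):
    """clip: list of (time, x, z) samples of one character's root on the floor."""
    ts, xs, zs = zip(*clip)
    return SpaceTimeBound(min(xs), max(xs), min(zs), max(zs), min(ts), max(ts))

def detect_collisions(clip_a, clip_b, radius=0.4):
    """Return times at which two disk-shaped characters of the given radius
    overlap, assuming both clips share the same sampling times."""
    if not bound_of(clip_a).overlaps(bound_of(clip_b)):
        return []                                  # coarse cull: bounds disjoint
    hits = []
    for (ta, xa, za), (_, xb, zb) in zip(clip_a, clip_b):
        dx, dz = xa - xb, za - zb
        if dx * dx + dz * dz < (2.0 * radius) ** 2:  # disk-disk overlap on the floor
            hits.append(ta)
    return hits
```

A full MOBB tree would split each clip recursively in time, but the per-node culling logic is this same overlap test.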

Motion Capture System for Digital Entertainment (디지털 엔터테인먼트에서의 모션 획득 시스템)

  • Lee, Man-Woo;Kim, Soon-Gohn
    • The Journal of the Korea Contents Association / v.7 no.5 / pp.85-93 / 2007
  • The motion capture system has shown great potential as a new means of image expression for tasks that the existing key-frame method cannot handle satisfactorily, such as realistic animation of humans or animals in motion, as well as projects of large scale or burdensome expense. Its applications have also intensified and widened in the entertainment arena, including motion pictures, TV, advertisements, documentaries, and music videos, centering on games. Despite these merits, however, a number of issues have surfaced in digital image expression using the motion capture system, such as a burdensome amount of preparatory work, the need to attach markers and to correct motion data, and a lack of trained manpower. In this paper we present a new direction for making digital image production more efficient, based on an extensive analysis of prior image production projects that used the motion capture system.

Comparative Analysis of 3D Tools Suitable for the Rotoscoping Cell Animation Production Process

  • Choi, Chul Young
    • International Journal of Internet, Broadcasting and Communication / v.16 no.3 / pp.113-120 / 2024
  • Recently, case presentations using AI functions such as ChatGPT have been increasing in many industrial fields. As AI-based results emerge even in the areas of images and videos, traditional animation production tools are in need of significant changes. Unreal Engine is the tool that adapts most quickly to these changes, proposing a new animation production workflow by integrating tools such as Metahuman and Marvelous Designer. Working with realistic metahumans allows for the production of realistic and natural movements, such as those captured through motion capture data. Implementing this approach presents many challenges for production tools that adhere to traditional methods. In this study, we investigated the differences between the cell animation workflow and the computer graphics animation production workflow. We compared and analyzed whether these differences could be reduced by creating sample movements using character rigs in the Maya and Cascadeur tools. Our results showed that a similar cell animation workflow could be constructed using the Cascadeur tool. To improve the accuracy of our conclusions, we created short animations with large, action-packed movements to demonstrate and validate our findings.

Real-time Interactive Animation System for Low-Priced Motion Capture Sensors (저가형 모션 캡처 장비를 이용한 실시간 상호작용 애니메이션 시스템)

  • Kim, Jeongho;Kang, Daeun;Lee, Yoonsang;Kwon, Taesoo
    • Journal of the Korea Computer Graphics Society / v.28 no.2 / pp.29-41 / 2022
  • In this paper, we introduce a novel real-time, interactive animation system that uses real-time motion input from Kinect, a low-cost motion-sensing device. Our system generates interaction motions between the user character and a counterpart character in real time. While the motion of the user character is generated by mimicking the user's input motion, the other character's motion is decided so as to react to the user avatar's motion. During a pre-processing step, our system analyzes the reference motion data and builds a mapping model in advance. At run time, our system first generates initial poses for the two characters and then modifies them to produce plausible interacting behavior. Our experimental results show plausible interacting animations, in which the user character performs a modified version of the user's input motion and the counterpart character reacts properly against it. The proposed method will be useful for developing real-time interactive animation systems that provide a more immersive experience for users. A hypothetical sketch of the pre-processing/run-time mapping idea follows this entry.
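The pre-processing/run-time split described above can be sketched as a nearest-neighbour lookup over paired pose feature vectors. The class name, the feature representation, and the lookup rule are illustrative assumptions, not the paper's actual mapping model.

```python
import numpy as np

class InteractionMappingModel:
    """Hypothetical mapping model: the reference motion is stored as paired
    (user-side pose, counterpart pose) feature vectors; at run time, the
    reference frame closest to the live user pose selects the reaction."""

    def __init__(self, user_poses: np.ndarray, counterpart_poses: np.ndarray):
        # Both arrays have shape (num_frames, pose_dim), frame-aligned.
        self.user_poses = user_poses
        self.counterpart_poses = counterpart_poses

    def react(self, live_user_pose: np.ndarray) -> np.ndarray:
        """Return the counterpart pose paired with the nearest reference frame."""
        dists = np.linalg.norm(self.user_poses - live_user_pose, axis=1)
        return self.counterpart_poses[np.argmin(dists)]

# Usage with random stand-in data (real input would be Kinect joint positions).
rng = np.random.default_rng(0)
model = InteractionMappingModel(rng.normal(size=(500, 60)),
                                rng.normal(size=(500, 60)))
reacting_pose = model.react(rng.normal(size=60))
```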

Data-driven Facial Expression Reconstruction for Simultaneous Motion Capture of Body and Face (동작 및 표정 동시 포착을 위한 데이터 기반 표정 복원에 관한 연구)

  • Park, Sang Il
    • Journal of the Korea Computer Graphics Society / v.18 no.3 / pp.9-16 / 2012
  • In this paper, we present a new method for reconstructing a detailed facial expression from data roughly captured with a small number of markers. Because of the difference in required capture resolution between full-body capture and facial expression capture, the two have rarely been performed simultaneously. However, simultaneous capture of body and face is essential for generating natural animation. For this purpose, we provide a method for capturing detailed facial expressions with only a small number of markers. Our basic idea is to build a database of facial expressions and apply principal component analysis to reduce its dimensionality. The dimensionality reduction enables us to estimate the full data from a part of the data. We demonstrate the viability of the method by applying it to dynamic scenes. A minimal sketch of this PCA-based reconstruction follows this entry.
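The basic idea above (a dense expression database, PCA for dimensionality reduction, and estimation of the full data from a marker subset) can be sketched as follows. The plain SVD-based PCA and the least-squares fit of the coefficients are assumptions about the details, and the function names are hypothetical.

```python
import numpy as np

def fit_expression_basis(expressions: np.ndarray, num_components: int = 20):
    """expressions: (num_samples, num_full_values) database of dense facial
    capture, one flattened marker set per row.  Returns the mean expression
    and the leading principal components."""
    mean = expressions.mean(axis=0)
    centered = expressions - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)  # rows of vt = PCs
    return mean, vt[:num_components]

def reconstruct_full_expression(sparse_values, sparse_indices, mean, basis):
    """Estimate the dense expression from a handful of measured coordinates
    by solving for the PCA coefficients in a least-squares sense."""
    a = basis[:, sparse_indices].T              # (num_sparse, num_components)
    b = sparse_values - mean[sparse_indices]    # residual at measured entries
    coeffs, *_ = np.linalg.lstsq(a, b, rcond=None)
    return mean + coeffs @ basis                # back-project to dense markers
```

When the number of measured values is smaller than the number of retained components, lstsq returns the minimum-norm coefficients, which keeps the reconstruction close to the mean expression.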

A study about the problems and their solutions in the production process of 3D character animation using optical motion capture technology (옵티컬 모션캡쳐 기술을 활용한 3D 캐릭터 애니메이션에서 제작과정상 문제점 및 해결방안에 관한 연구)

  • Lee, Man-Woo;Kim, Hyun-Jong;Kim, Soon-Gohn
    • Proceedings of the Korea Contents Association Conference / 2006.11a / pp.831-835 / 2006
  • Motion capture means recording the movement of objects such as human beings, animals, creatures, and machines in a form usable by a computer. Because the motion capture system can be introduced where realistic movement of humans and animals is required that cannot be attained with the existing key-frame method, where large scale is necessary, or where the economic burden is high, it offers the merit and possibility of new forms of expression. For these reasons, the method is increasingly used in fields of digital entertainment such as movies, TV, advertisements, documentaries, and music videos, centering on games. In spite of these advantages, however, problems have become prominent in digital image expression using motion capture, such as extensive advance preparation work, marker attachment, compensation of motion data, motion retargeting, and a lack of professional human resources. Accordingly, this study suggests ways to produce motion capture digital images more effectively by identifying the problems in the production process and drafting possible solutions, based on image production examples that used motion capture.

Bipeds Animation Blending By 3D Studio MAX Script (맥스 스크립트를 이용한 바이페드 애니메이션 합성)

  • Choi, Hong-Seok;Jeong, Jae-Wook
    • Science of Emotion and Sensibility / v.12 no.3 / pp.259-266 / 2009
  • Today, 3D character animation is easily found in most films, whether live-action film, animation, games, or advertising. However, such smooth character movement is the result of lengthy key-frame work by skilled animators on data obtained through expensive equipment such as motion capture, so modifying it or adding other effects is not easy. In some cases, a character's action made according to personal feeling can differ from the universal expectations of audiences, because it is difficult to formulate general rules relating a character's action, as designed by the animator, to the emotional reaction of the audience. This research aims to show a way to easily blend and modify two or three Biped animation data sets by offering 3D rotation operation tools written in 3D Studio MAX Script. With this tool, E.A.M., we can conduct various studies on the quantitative relation between walking and emotional reaction. A generic sketch of the underlying rotation-blending idea follows this entry.
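The paper's E.A.M. tool is written in 3D Studio MAX Script; the sketch below only illustrates the generic rotation-blending idea behind combining two Biped animation tracks, in Python with quaternions, and is not the MAXScript implementation.

```python
import numpy as np

def slerp(q0: np.ndarray, q1: np.ndarray, t: float) -> np.ndarray:
    """Spherical linear interpolation between two unit quaternions (w, x, y, z)."""
    dot = float(np.dot(q0, q1))
    if dot < 0.0:                     # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:                  # nearly parallel: fall back to lerp
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(dot)
    return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

def blend_tracks(track_a, track_b, weight: float):
    """Blend two per-frame quaternion tracks of equal length with a fixed weight."""
    return [slerp(np.asarray(a, float), np.asarray(b, float), weight)
            for a, b in zip(track_a, track_b)]
```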

A Character Speech Animation System for Language Education for Each Hearing Impaired Person (청각장애우의 언어교육을 위한 캐릭터 구화 애니메이션 시스템)

  • Won, Yong-Tae;Kim, Ha-Dong;Lee, Mal-Rey;Jang, Bong-Seog;Kwak, Hoon-Sung
    • Journal of Digital Contents Society / v.9 no.3 / pp.389-398 / 2008
  • There has been some research into a speech system for communication between those who are hearing impaired and those who hear normally, but owing to social indifference and a lack of marketability it has been pursued through inefficient teaching in which existing teachers instruct each individual. To overcome this weakness, there was a need to develop contents utilizing 3D animation and digital technology. To establish a standard face and a standard mouth shape for preparing a character, the study collected sufficient data on students in the third to sixth grades of elementary schools in Seoul and Gyeonggi, Korea, and drew up standards for the face and mouth shape of such students. This is not merely basic data for content development for the hearing impaired; it can also offer a standard measurement and a standard type realistically applicable to them. As a system for understanding conversations by applying 3D character animation and for teaching self-expression, the character speech animation system, combining 3D technology and motion capture, supports effective language education for hearing impaired children who need it within their families and in special education institutions.

A study about the improvement plan in production processes of digital entertainment image using the motion capture system (모션캡쳐시스템을 활용한 디지털 엔터테인먼트 영상에서 제작과정상 개선 방안에 관한 연구)

  • Lee, Man-Woo;Yun, Deok-Un;Park, Jin-Seok;Kim, Soon-Gohn
    • Proceedings of the Korea Contents Association Conference / 2006.11a / pp.824-828 / 2006
  • Introduction of the motion capture system to the field of digital entertainment paved the way to accelerate the development of 3D character animation all the more. The motion capture system has been developed to the level where it can capture the fierce motions of characters, particularly in digital game images, improve dynamic characteristics by capturing the movement of human muscles, and express true human emotion by capturing facial wrinkles and expressions. Such an extension of realistic expression has led to its increasing use in movies, TV, advertisements, and music videos, centering on the game industry within digital entertainment. The fact is, however, that the local image production process faces many difficulties compared with competing countries such as the USA and Japan, owing to inferior technical expertise and capital in image production using motion capture, insufficient professional motion capture human resources, and the small size of the local motion capture image market. Hence, by surveying local and overseas examples of image production, this study suggests a plan to improve the technical problems in the production of digital entertainment images using the motion capture system, in terms of an integrated motion capture system, professional motion capture human resources, and in-house motion capture program development.

Motion Retargetting Simplification for H-Anim Characters (H-Anim 캐릭터의 모션 리타겟팅 단순화)

  • Jung, Chul-Hee;Lee, Myeong-Won
    • Journal of KIISE: Computing Practices and Letters / v.15 no.10 / pp.791-795 / 2009
  • There is a need for a system-independent human data format that does not depend on a specific graphics tool or program, so that interoperable human data can be used in a network environment. To achieve this, the Web3D Consortium and ISO/IEC JTC1 WG6 developed the international draft standard ISO/IEC 19774 Humanoid Animation (H-Anim). H-Anim defines the data structure for an articulated human figure, but it does not yet define the data for human motion generation. This paper discusses a method of obtaining compatibility and independence of motion data between application programs, and describes a method of simplifying the motion retargetting necessary to define motion for H-Anim characters. In addition, it describes a method of generating H-Anim character animation from arbitrary 3D character models and arbitrary motion capture data that share no predefined inter-relations, together with its implementation results. A hypothetical name-mapping sketch of such retargetting follows this entry.
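As an illustration of name-based motion retargetting onto H-Anim joints, here is a hypothetical sketch. The BVH-style source joint names and the mapping table are assumptions chosen for the example; only the H-Anim joint names (HumanoidRoot, l_hip, skullbase, ...) follow ISO/IEC 19774, and a real retarget would also have to handle differing joint orientations and bone lengths rather than copying rotations directly.

```python
# Illustrative map from BVH-style capture joint names to H-Anim joint names.
# The source names and the map itself are assumptions for this sketch.
SOURCE_TO_HANIM = {
    "Hips": "HumanoidRoot",
    "LeftUpLeg": "l_hip",
    "LeftLeg": "l_knee",
    "LeftFoot": "l_ankle",
    "LeftArm": "l_shoulder",
    "LeftForeArm": "l_elbow",
    "LeftHand": "l_wrist",
    "Head": "skullbase",
}

def retarget_frame(source_rotations: dict) -> dict:
    """source_rotations: {source_joint: (rx, ry, rz)} Euler angles for one frame.
    Returns the same rotations keyed by H-Anim joint names; joints without a
    mapping are dropped, which is the simplification this sketch illustrates."""
    return {SOURCE_TO_HANIM[j]: rot
            for j, rot in source_rotations.items() if j in SOURCE_TO_HANIM}
```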