• Title/Abstract/Keyword: human motion simulation

182 search results

실시간 휴먼 시뮬레이션을 위한 깊이 카메라 기반의 자세 판별 및 모션 보간 (Depth Camera-Based Posture Discrimination and Motion Interpolation for Real-Time Human Simulation)

  • 이진원;한정호;양정삼
    • 한국CDE학회논문집, Vol. 19, No. 1, pp. 68-79, 2014
  • Human model simulation has been widely used in various industrial areas such as ergonomic design, product evaluation, and characteristic analysis of work-related musculoskeletal disorders. However, building digital human models and capturing their behaviors requires many costly and time-consuming fabrication iterations. To overcome these limitations, many recent studies have presented markerless motion capture approaches that reconstruct time-varying skeletal motions from optical devices. A drawback of the markerless approach, however, is that occlusion leaves motion data missing or inaccurate during real-time human simulation. In this study, we propose a systematic method for discriminating motion data that is missing or inaccurate due to occlusion and for interpolating the affected sequence of motion frames captured by a markerless depth camera.
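The interpolation step the abstract describes can be illustrated with a minimal sketch, not the authors' implementation: joints whose per-frame tracking confidence falls below a threshold are treated as occluded and refilled by linear interpolation between the surrounding reliably tracked frames. The array layout, the confidence threshold, and all function names are assumptions made for the example.

```python
import numpy as np

def interpolate_occluded_joints(frames, confidence, threshold=0.5):
    """Fill occluded joint positions by linear interpolation over time.

    frames:     (T, J, 3) array of joint positions from a depth camera.
    confidence: (T, J) per-joint tracking confidence in [0, 1].
    Joints whose confidence falls below `threshold` are treated as
    occluded and re-estimated from the nearest reliable frames.
    """
    frames = frames.copy()
    T, J, _ = frames.shape
    for j in range(J):
        good = confidence[:, j] >= threshold      # reliably tracked frames
        if good.sum() < 2:
            continue                              # too little data to interpolate
        t_good = np.flatnonzero(good)
        for axis in range(3):
            # np.interp rebuilds the occluded frames from the good ones
            frames[:, j, axis] = np.interp(np.arange(T), t_good,
                                           frames[t_good, j, axis])
    return frames

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    traj = np.cumsum(rng.normal(size=(100, 20, 3)), axis=0)  # synthetic joint tracks
    conf = rng.uniform(size=(100, 20))
    print(interpolate_occluded_joints(traj, conf).shape)
```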

Realistic Visual Simulation of Water Effects in Response to Human Motion using a Depth Camera

  • Kim, Jong-Hyun;Lee, Jung;Kim, Chang-Hun;Kim, Sun-Jeong
    • KSII Transactions on Internet and Information Systems (TIIS), Vol. 11, No. 2, pp. 1019-1031, 2017
  • In this study, we propose a new method for simulating water responding to human motion. Motion data obtained from motion-capture devices are represented as a jointed skeleton, which interacts with the velocity field in the water simulation. To integrate the motion data into the water simulation space, it is necessary to establish a mapping relationship between two fields with different properties. However, there can be severe numerical instability if the mapping breaks down, with the realism of the human-water interaction being adversely affected. To address this problem, our method extends the joint velocity mapped to each grid point to neighboring nodes. We refine these extended velocities to enable increased robustness in the water solver. Our experimental results demonstrate that water animation can be made to respond to human motions such as walking and jumping.
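The joint-to-grid velocity mapping described above can be sketched roughly as follows; this is an illustrative splatting scheme under assumed names and parameters, not the paper's solver. Each joint velocity is spread to grid nodes within a small radius with a smooth falloff and normalized by the accumulated weights, which is one simple way to extend velocities to neighboring nodes for a more robust fluid boundary condition.

```python
import numpy as np

def splat_joint_velocities(joint_pos, joint_vel, grid_shape, cell, radius=2):
    """Map skeleton joint velocities onto a regular 3-D grid.

    joint_pos, joint_vel: (J, 3) arrays in world units.
    grid_shape:           (nx, ny, nz) number of cells per axis.
    cell:                 edge length of one grid cell.
    Each joint velocity is spread to grid nodes within `radius` cells,
    weighted by a Gaussian-like falloff, then normalized by the
    accumulated weights so overlapping joints blend smoothly.
    """
    vel = np.zeros(grid_shape + (3,))
    wsum = np.zeros(grid_shape)
    for p, v in zip(joint_pos, joint_vel):
        idx = np.floor(p / cell).astype(int)
        lo = np.maximum(idx - radius, 0)
        hi = np.minimum(idx + radius + 1, grid_shape)
        for i in range(lo[0], hi[0]):
            for j in range(lo[1], hi[1]):
                for k in range(lo[2], hi[2]):
                    node = (np.array([i, j, k]) + 0.5) * cell
                    w = np.exp(-np.sum((node - p) ** 2) / (radius * cell) ** 2)
                    vel[i, j, k] += w * v
                    wsum[i, j, k] += w
    mask = wsum > 1e-8
    vel[mask] /= wsum[mask][:, None]    # normalize where any weight landed
    return vel

if __name__ == "__main__":
    joints = np.array([[0.5, 0.5, 0.5], [0.8, 0.5, 0.5]])
    vels = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
    field = splat_joint_velocities(joints, vels, (16, 16, 16), cell=0.1)
    print(field.shape)
```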

인적요소를 고려한 선상 탈출 시뮬레이션 기술 (A Review of Simulation for Human Escape on Shipboard)

  • 김홍태;이동곤;박진형
    • 한국시뮬레이션학회 학술대회논문집, 2001년도 춘계 학술대회 논문집, pp. 135-140, 2001
  • In recent years there have been several severe accidents involving passenger vessels, and the International Maritime Organization (IMO) has recognized that computer simulation of evacuation may be required for passenger vessels. Human elements are a key issue in shipboard escape analysis, and simulating them imposes several technical requirements, including a model of the ship structure, an evacuation algorithm, human behaviour analysis, and the influence of ship listing and motion. This paper reviews the key issues and technologies of simulation for human escape on shipboard.
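As a rough illustration of the technical requirements listed above (an evacuation algorithm plus the influence of ship listing), the toy agent model below moves passengers toward an exit at a walking speed reduced by the listing angle. The linear speed penalty and all numbers are assumptions for the sketch, not values from IMO guidance or the paper.

```python
import math
import random

def evacuation_step(agents, exit_pos, dt=0.5, base_speed=1.3, list_deg=0.0):
    """Advance a toy shipboard evacuation model by one time step.

    agents:    list of [x, y] passenger positions (metres), mutated in place.
    exit_pos:  (x, y) position of the muster station / exit.
    list_deg:  static ship listing angle; walking speed is scaled down with a
               simple linear penalty (an illustrative assumption only).
    Returns the number of agents that have reached the exit.
    """
    speed = base_speed * max(0.2, 1.0 - list_deg / 45.0)
    reached = 0
    for a in agents:
        dx, dy = exit_pos[0] - a[0], exit_pos[1] - a[1]
        dist = math.hypot(dx, dy)
        if dist < 0.5:
            reached += 1          # already at the exit; stop moving
            continue
        a[0] += speed * dt * dx / dist
        a[1] += speed * dt * dy / dist
    return reached

if __name__ == "__main__":
    random.seed(1)
    passengers = [[random.uniform(0, 30), random.uniform(0, 10)] for _ in range(50)]
    steps = 0
    while evacuation_step(passengers, (30.0, 5.0), list_deg=10.0) < len(passengers):
        steps += 1
    print(f"all passengers evacuated after {steps} steps")
```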


동작 특성 추출 : 동작 모방에 기초한 향상된 역 운동학 (Motion Characteristic Capturing : Example Guided Inverse Kinematics)

  • 탁세윤
    • 한국시뮬레이션학회 학술대회논문집, 1999년도 춘계학술대회 논문집, pp. 147-151, 1999
  • This paper extends and enhances existing inverse kinematics techniques using the concept of motion characteristic capturing. Motion characteristic capturing is not about measuring motion by tracking body points; instead, it starts from pre-measured motion data, extracts the motion characteristics, and applies them to animate other bodies. The resulting motion resembles the originally measured one in spite of arbitrary dimensional differences between the bodies. Motion characteristic capturing is a new principle in kinematic motion generalization for processing measurements and generating realistic animation of humans and other living creatures.
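A minimal planar sketch of the idea, under assumed names and dimensions rather than the paper's formulation: the "characteristic" extracted from the measured motion is taken here to be the joint-angle profile, which is then replayed on a body with different limb lengths, so the resulting motion resembles the original despite the dimensional differences.

```python
import math

def extract_joint_angles(shoulder, elbow, wrist):
    """Recover planar shoulder and elbow angles from three measured 2-D points."""
    a1 = math.atan2(elbow[1] - shoulder[1], elbow[0] - shoulder[0])
    a2 = math.atan2(wrist[1] - elbow[1], wrist[0] - elbow[0]) - a1
    return a1, a2

def apply_to_body(shoulder, angles, upper_len, fore_len):
    """Forward kinematics: replay the captured angles on a body whose
    limb lengths differ from the measured subject's."""
    a1, a2 = angles
    elbow = (shoulder[0] + upper_len * math.cos(a1),
             shoulder[1] + upper_len * math.sin(a1))
    wrist = (elbow[0] + fore_len * math.cos(a1 + a2),
             elbow[1] + fore_len * math.sin(a1 + a2))
    return elbow, wrist

if __name__ == "__main__":
    # One frame of "measured" motion (subject with 0.30 m / 0.25 m arm segments).
    measured = ((0.0, 0.0), (0.21, 0.21), (0.45, 0.28))
    angles = extract_joint_angles(*measured)
    # Replay on a larger character (0.40 m / 0.35 m): same motion character, scaled reach.
    print(apply_to_body((0.0, 0.0), angles, 0.40, 0.35))
```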


동력학 모델을 이용한 인체 동작 제어 (Human Motion Control Using Dynamic Model)

  • 김창회;오병주;김승호
    • 대한인간공학회지, Vol. 18, No. 3, pp. 141-152, 1999
  • In this paper, we performed dynamic modeling of the human body for realistic animation based on its dynamical behavior, and designed a controller for effective control of the complicated human dynamic model. For real-time computation, the human body was simplified as a rigid-body model with 18 actuated degrees of freedom. The complex human kinematic mechanism was regarded as a composition of six serial kinematic chains: left arm, right arm, support leg, free leg, body, and head. Based on this kinematic analysis, the dynamic model of the human body was derived recursively using the Newton-Euler formulation. A balance controller was designed to control the nonlinear dynamic model of the human body, and its effectiveness was examined through graphical simulation of human walking motion. The simulation results were compared with model-based control results, demonstrating that the balance controller shows better performance in mimicking the dynamic motion of human walking.
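As a hedged illustration of the balance-control idea, the sketch below stabilizes the whole body modelled as a planar inverted pendulum with a PD ankle-torque law. The simplified model, the gains, and all names are assumptions for the example; they are not the paper's 18-DOF model or its controller.

```python
import numpy as np

def simulate_balance(theta0=0.15, dt=0.001, steps=3000,
                     m=70.0, L=0.9, g=9.81, kp=900.0, kd=150.0):
    """Toy balance control of the whole body modelled as a planar
    inverted pendulum (point mass m at height L above the ankle).

    A PD law generates an ankle torque that drives the lean angle theta
    back to upright; the gains are illustrative, not tuned values
    from the paper.
    """
    I = m * L * L                                       # point-mass inertia about the ankle
    theta, omega = theta0, 0.0
    history = []
    for _ in range(steps):
        tau = -kp * theta - kd * omega                  # PD balance torque
        alpha = (m * g * L * np.sin(theta) + tau) / I   # pendulum dynamics
        omega += alpha * dt
        theta += omega * dt
        history.append(theta)
    return np.array(history)

if __name__ == "__main__":
    hist = simulate_balance()
    print(f"initial lean {hist[0]:.3f} rad -> final lean {hist[-1]:.4f} rad")
```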


충격과 ZMP 조건을 고려한 인체 모델의 착지 동작 해석 (Landing Motion Analysis of Human-Body Model Considering Impact and ZMP Condition)

  • 소병록;김희국;이병주
    • 제어로봇시스템학회논문지, Vol. 11, No. 6, pp. 543-549, 2005
  • This paper deals with modeling and analysis of the landing motion of a human-body model. First, the dynamic model of a floating human body is derived, and the external impulse exerted on the ground as well as the internal impulses experienced at the joints of the model are analyzed. Second, a motion planning algorithm exploiting the kinematic redundancy is suggested to ensure stability, in terms of the ZMP stability condition, during a series of landing phases; four phases of landing motion are investigated. In simulation, the external and internal impulses experienced at the human joints and the ZMP history resulting from the motion planning are analyzed for two different configurations, and a desired landing posture is suggested by comparing the simulation results.
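The ZMP stability condition used in the abstract can be illustrated with the standard point-mass approximation: the zero-moment point is computed from the center-of-mass position and acceleration and must stay inside the foot support region. The numbers and the single-mass simplification below are assumptions for the sketch, not the paper's full human-body model.

```python
def zmp_x(com_x, com_z, acc_x, acc_z, g=9.81):
    """Sagittal-plane zero-moment point of a single lumped-mass model.

    com_x, com_z : horizontal and vertical CoM position (m)
    acc_x, acc_z : CoM accelerations (m/s^2)
    Uses the standard point-mass approximation
        x_zmp = x_com - z_com * acc_x / (acc_z + g)
    """
    return com_x - com_z * acc_x / (acc_z + g)

def is_stable(zmp, heel_x, toe_x):
    """ZMP stability condition: the ZMP must stay inside the foot support
    region (heel to toe) for the landing to be dynamically balanced."""
    return heel_x <= zmp <= toe_x

if __name__ == "__main__":
    # Illustrative numbers for an instant just after touchdown:
    # the CoM decelerates downward hard (acc_z > 0) and slightly forward.
    x = zmp_x(com_x=0.05, com_z=0.90, acc_x=1.2, acc_z=8.0)
    print(f"ZMP = {x:.3f} m, stable = {is_stable(x, heel_x=-0.05, toe_x=0.20)}")
```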

Motion-capture-based walking simulation of digital human adapted to laser-scanned 3D as-is environments for accessibility evaluation

  • Maruyama, Tsubasa;Kanai, Satoshi;Date, Hiroaki;Tada, Mitsunori
    • Journal of Computational Design and Engineering, Vol. 3, No. 3, pp. 250-265, 2016
  • Owing to our rapidly aging society, accessibility evaluation to enhance the ease and safety of access to indoor and outdoor environments for the elderly and disabled is increasing in importance. Accessibility must be assessed not only from the general standard aspect but also in terms of physical and cognitive friendliness for users of different ages, genders, and abilities. Meanwhile, human behavior simulation has been progressing in the areas of crowd behavior analysis and emergency evacuation planning. However, in human behavior simulation, environment models represent only "as-planned" situations, and a pedestrian model cannot generate the detailed articulated movements of various people of different ages and genders in the simulation. Therefore, the final goal of this research was to develop a virtual accessibility evaluation by combining realistic human behavior simulation using a digital human model (DHM) with "as-is" environment models. To achieve this goal, we developed an algorithm for generating human-like DHM walking motions, adapting its strides, turning angles, and footprints to laser-scanned 3D as-is environments including slopes and stairs. The DHM motion was generated based only on motion-capture (MoCap) data for flat walking. Our implementation constructed as-is 3D environment models from laser-scanned point clouds of real environments and enabled a DHM to walk autonomously in various environment models. The difference in joint angles between the DHM and the MoCap data was evaluated, and demonstrations of our environment modeling and walking simulation in indoor and outdoor environments, including corridors, slopes, and stairs, are presented.
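A toy sketch of the stride-adaptation idea, with an artificial height map standing in for a laser-scanned as-is environment: footprints are placed along a walking direction at a nominal stride, and the stride is shortened whenever the rise to the next footprint exceeds roughly one stair riser. The heuristic, thresholds, and names are assumptions for illustration, not the authors' algorithm.

```python
import numpy as np

def plan_footprints(height_map, cell, start, direction,
                    stride=0.65, max_rise=0.17, n_steps=8):
    """Place footprints along a straight walking direction on a terrain
    height map (a stand-in for a laser-scanned as-is environment).

    The nominal stride is shortened whenever the height difference to the
    next footprint exceeds `max_rise` (roughly one stair riser), a simple
    heuristic standing in for full stride/turn adaptation.
    """
    def height_at(p):
        i = int(np.clip(p[1] / cell, 0, height_map.shape[0] - 1))
        j = int(np.clip(p[0] / cell, 0, height_map.shape[1] - 1))
        return height_map[i, j]

    d = np.asarray(direction, float)
    d /= np.linalg.norm(d)
    pos = np.asarray(start, float)
    prints = [(pos.copy(), height_at(pos))]
    for _ in range(n_steps):
        step = stride
        nxt = pos + step * d
        # halve the stride until the rise to the next footprint is walkable
        while abs(height_at(nxt) - height_at(pos)) > max_rise and step > 0.2:
            step *= 0.5
            nxt = pos + step * d
        pos = nxt
        prints.append((pos.copy(), height_at(pos)))
    return prints

if __name__ == "__main__":
    # Synthetic terrain: flat floor rising as a three-step stair (0.15 m risers).
    hm = np.zeros((40, 40))
    hm[:, 20:] = 0.15; hm[:, 24:] = 0.30; hm[:, 28:] = 0.45
    for p, h in plan_footprints(hm, cell=0.1, start=(0.5, 2.0), direction=(1, 0)):
        print(f"x={p[0]:.2f} m  ground height={h:.2f} m")
```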

Computerized Human Body Modeling and Work Motion-capturing in a 3-D Virtual Clothing Simulation System for Painting Work Clothes Development

  • Park, Gin Ah
    • 패션비즈니스, Vol. 19, No. 3, pp. 130-143, 2015
  • By studying 3-D virtual human modeling, motion capturing, and clothing simulation for easier and safer work clothes development, this research aimed (1) to categorize heavy manufacturing work motions; (2) to generate a 3-D virtual male model and establish painting work motions within a 3-D virtual clothing simulation system through computerized body scanning and motion capturing; and finally (3) to suggest simulated clothing images of painting work clothes developed based on virtual male avatar body measurements by implementing the work motions defined in the 3-D virtual clothing simulation system. For this, a male subject's body was 3-D scanned and also directly measured. The procedure for editing the 3-D virtual model required the total body shape to be scanned into a digital format, which was then revised using the 3-D Studio MAX and Maya rendering tools. In addition, heavy-industry workers' work motions were observed and recorded by video camera at manufacturing sites and analyzed to categorize the painting work motions. This analysis resulted in four categories of motions: standing, bending, kneeling, and walking. Each work motion category was further divided into more detailed motions according to sub-work posture factors: arm angle, arm direction, elbow bending angle, waist bending angle, waist bending direction, and knee bending angle. Finally, implementing the painting work motions within the 3-D clothing simulation system produced images of the developed painting work clothes simulated in a dynamic mode.
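The motion taxonomy described above (four categories refined by sub-posture factors) maps naturally onto a small data structure. The sketch below encodes it with assumed field names and placeholder values; the numbers are not measurements from the study.

```python
from dataclasses import dataclass
from enum import Enum

class MotionCategory(Enum):
    """The four painting-work motion categories named in the abstract."""
    STANDING = "standing"
    BENDING = "bending"
    KNEELING = "kneeling"
    WALKING = "walking"

@dataclass
class WorkPosture:
    """One detailed work motion, described by the sub-posture factors the
    abstract lists; the numeric values used below are illustrative
    placeholders, not measurements from the study."""
    category: MotionCategory
    arm_angle_deg: float
    arm_direction: str          # e.g. "front", "side", "overhead"
    elbow_bend_deg: float
    waist_bend_deg: float
    waist_bend_direction: str   # e.g. "forward", "lateral"
    knee_bend_deg: float

if __name__ == "__main__":
    overhead_spray = WorkPosture(
        category=MotionCategory.STANDING,
        arm_angle_deg=150.0, arm_direction="overhead",
        elbow_bend_deg=30.0,
        waist_bend_deg=10.0, waist_bend_direction="forward",
        knee_bend_deg=5.0,
    )
    print(overhead_spray)
```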

A Joint Motion Planning Based on a Bio-Mimetic Approach for Human-like Finger Motion

  • Kim Byoung-Ho
    • International Journal of Control, Automation, and Systems, Vol. 4, No. 2, pp. 217-226, 2006
  • Grasping and manipulation by hands can be considered one of the essential functions for achieving the performance desired in humanoid operations. When a humanoid robot manipulates an object with its hands, each finger should be well controlled to accomplish precise manipulation of the grasped object, so the trajectory of each joint required for precise finger motion must be planned stably. In this sense, this paper proposes an effective joint motion planning method for humanoid fingers. The proposed method newly employs a bio-mimetic concept for joint motion planning: a suitable model describing the interphalangeal coordination of a human finger is suggested and incorporated into the proposed joint motion planning method. The feature of the proposed method is illustrated by simulation results, which show that it is useful for facilitative finger motion and can be applied to improve the control performance of humanoid fingers or prosthetic fingers.
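A minimal sketch of bio-mimetic interphalangeal coordination applied to joint motion planning, using a commonly cited linear PIP-DIP coupling (the DIP angle taken as roughly two thirds of the PIP angle) and a smooth minimum-jerk-style profile. The coupling ratio, the profile, and all names are assumptions for illustration and are not necessarily the model proposed in the paper.

```python
import numpy as np

def plan_finger_flexion(pip_target_deg=90.0, duration=1.0, dt=0.01,
                        coupling=2.0 / 3.0):
    """Plan coordinated PIP/DIP joint trajectories for a finger flexion.

    The PIP joint follows a smooth minimum-jerk-style profile, and the DIP
    joint is slaved to it through a linear interphalangeal coupling
    (theta_DIP ~= coupling * theta_PIP), a commonly used approximation of
    human finger coordination -- not necessarily the model in the paper.
    """
    t = np.arange(0.0, duration + dt, dt)
    s = t / duration
    blend = 10 * s**3 - 15 * s**4 + 6 * s**5   # minimum-jerk blend, 0 -> 1
    pip = pip_target_deg * blend
    dip = coupling * pip
    return t, pip, dip

if __name__ == "__main__":
    t, pip, dip = plan_finger_flexion()
    for i in (0, len(t) // 2, len(t) - 1):
        print(f"t={t[i]:.2f}s  PIP={pip[i]:6.1f} deg  DIP={dip[i]:6.1f} deg")
```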