• Title/Summary/Keyword: Interactive Animation Interface


Research Prospects for Micro Electromechanical Devices (극소형 전자기계장치에 관한 연구전망)

  • Yang, Sang-Sik
    • 전기의세계
    • /
    • v.39 no.6
    • /
    • pp.14-19
    • /
    • 1990
  • 1. By interfacing a CAD system with PROPS, surfaces designed in the CAD system can be used, and robot kinematics has been built into a graphics library, so that robot motion for machining can be simulated through surface placement, path generation, and animation. 2. Various robot kinematics were organized into the library in a general form based on the Denavit-Hartenberg transformation. 3. The die-machining processes were arranged into menus and an expert system was introduced, making interactive work easy. 4. The goals of subsequent research are to develop and realize robot calibration software and to complete an expert-system-based robot program generator, thereby establishing a full off-line programming system. To this end, more practical tool path generation, determination of machining conditions with the expert system, and a window for the user interface must be developed. 5. The flexibility of the Robotonomic Tool System developed in the first year will be extended, and the process automation system will be extended on the basis of experimental results. 6. The standardization of tools and tool tips and an automatic tool changer, both essential to automating the polishing process, will be developed. 7. The interfaces among the components of the die-polishing cell will be integrated in the system controller.

  • PDF
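The Denavit-Hartenberg transformation named in item 2 above assigns one homogeneous matrix per joint; chaining these matrices yields the robot's end-effector pose. A minimal sketch, assuming the standard D-H convention (the function names are illustrative, not from the paper):

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform for one joint from standard
    Denavit-Hartenberg parameters (theta, d, a, alpha)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_params):
    """Chain the per-joint transforms to obtain the end-effector pose."""
    T = np.eye(4)
    for theta, d, a, alpha in dh_params:
        T = T @ dh_transform(theta, d, a, alpha)
    return T
```

Writing each joint in this common form is what lets many different robot kinematics share one general library, as the abstract describes.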

A Novel Interactive Power Electronics Seminar (iPES) Developed at the Swiss Federal Institute of Technology (ETH) Zurich

  • Drofenik, Uwe;Kolar, Johann W.
    • Journal of Power Electronics
    • /
    • v.2 no.4
    • /
    • pp.250-257
    • /
    • 2002
  • This paper introduces the Interactive Power Electronics Seminar (iPES), a new software package for teaching the fundamentals of power electronic circuits and systems. iPES consists of HTML text with Java applets for interactive animation, circuit design and simulation, and visualization of electromagnetic fields and thermal issues in power electronics. It comprises an easy-to-use, self-explaining graphical user interface. The software requires only a standard web browser, i.e., no installation is needed. iPES can be accessed by students and professionals via the World Wide Web or from a CD-ROM on a stand-alone PC. Due to the underlying software technology, iPES is very flexible: it can be used for on-line learning and could easily be integrated into an e-learning platform. The aim of this paper is to give an introduction to the iPES project and to show the different areas covered. The e-learning software is available at no cost at www.ipes.ethz.ch in English, German, Japanese, Korean, Chinese, and Spanish. The project is still under development, and the web page is updated at roughly four-week intervals.

Interactive Facial Expression Animation of Motion Data using Sammon's Mapping (Sammon 매핑을 사용한 모션 데이터의 대화식 표정 애니메이션)

  • Kim, Sung-Ho
    • The KIPS Transactions:PartA
    • /
    • v.11A no.2
    • /
    • pp.189-194
    • /
    • 2004
  • This paper describes a method for distributing high-dimensional facial expression motion data over a two-dimensional space, and for creating facial expression animation in real time as an animator navigates this space and selects the desired expressions. The expression space was composed from about 2,400 facial expression frames, and its construction rests on determining the shortest distance between any two expressions. The expression space is a manifold space, in which the distance between two points is approximated as follows: an expression state vector describing each expression is defined from a distance matrix recording the distances between markers; when two expressions are adjacent, their distance is taken as an approximation of the shortest distance between them. Once the distances between adjacent expressions are determined, these adjacency distances are chained to yield the shortest distance between any two expression states, using the Floyd algorithm. To realize the high-dimensional expression space, it is projected onto two dimensions using Sammon's Mapping. Facial animation is then created in real time as animators navigate the two-dimensional space through the user interface.
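The distance construction described above (treat nearby expressions as adjacent, then chain adjacency distances with the Floyd algorithm into all-pairs shortest, i.e. manifold, distances) can be sketched as follows. This is a minimal NumPy version; the `eps` neighborhood threshold is an assumption, not a value from the paper:

```python
import numpy as np

def manifold_distances(X, eps):
    """All-pairs manifold (geodesic) distances over a set of
    expression state vectors X of shape (n, d). Frames closer than
    `eps` in Euclidean distance are treated as adjacent; longer
    routes are chained with the Floyd-Warshall relaxation."""
    n = len(X)
    # Pairwise Euclidean distances between expression state vectors.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    G = np.where(D <= eps, D, np.inf)   # keep only adjacency edges
    np.fill_diagonal(G, 0.0)
    for k in range(n):                  # Floyd-Warshall: route via k
        G = np.minimum(G, G[:, k, None] + G[None, k, :])
    return G
```

For ~2,400 frames this O(n³) relaxation is done once offline; only the 2D navigation needs to run in real time.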

Interactive Realtime Facial Animation with Motion Data (모션 데이터를 사용한 대화식 실시간 얼굴 애니메이션)

  • Kim, Sung-Ho
    • Journal of the Korea Computer Industry Society
    • /
    • v.4 no.4
    • /
    • pp.569-578
    • /
    • 2003
  • This paper presents a method in which the user produces a real-time facial animation by navigating a space of facial expressions created from a great number of captured expressions. The core of the method is defining the distance between facial expressions, distributing the expressions into a suitable intuitive space using that distance, and providing a user interface for generating real-time facial expression animation in this space. We created the search space from about 2,400 captured facial expression frames; as the user travels freely through the space, the facial expressions located on the path are displayed in sequence. To distribute the roughly 2,400 captured facial expressions visually in the space, we need to calculate the distance between frames. We use Floyd's algorithm to obtain the all-pairs shortest paths between frames and derive the manifold distances from them. The frames are then distributed in a 2D intuitive space by applying multidimensional scaling to the manifold distances between expression frames, preserving the original inter-frame distances. A major advantage of the presented method is that the user can navigate freely, without restriction, to generate facial expression animation, because there are always expression frames available to navigate to. It is also efficient for the user to confirm and regenerate the desired animation in real time through an easy-to-use interface.

  • PDF
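The multidimensional scaling step above, which places frames in 2D while approximately preserving their manifold distances, can be sketched with classical (Torgerson) MDS. This is an assumed formulation, since the abstract does not specify which MDS variant is used:

```python
import numpy as np

def classical_mds(D, dim=2):
    """Classical (Torgerson) MDS: embed points in `dim` dimensions so
    that Euclidean distances approximate the distance matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J           # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)        # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:dim]    # keep the largest `dim`
    L = np.sqrt(np.clip(vals[idx], 0.0, None))
    return vecs[:, idx] * L               # (n, dim) embedding
```

When D already comes from points in a low-dimensional Euclidean space, the embedding reproduces the distances exactly; for manifold distances it gives the best low-rank approximation.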

Augmented Reality based Interactive Storyboard System (증강현실 기반의 인터랙티브 스토리보드 제작 시스템)

  • Park, Jun
    • Journal of the Korea Computer Graphics Society
    • /
    • v.13 no.2
    • /
    • pp.17-22
    • /
    • 2007
  • In the early stages of film or animation production, a storyboard is used to visually describe the outline of a story. Drawings or photographs, as well as text, are employed to specify character and item placement and camera pose. However, commercially available storyboard tools are mainly drawing and editing tools and do not provide functionality for item placement and camera control. In this paper, an Augmented Reality based storyboard tool is presented, which provides an intuitive and easy-to-use interface for storyboard development. Using the presented tool, non-expert users may compose 3D scenes in their real environments through tangible building blocks, which are used to fetch the corresponding 3D models and their poses.

  • PDF

Real-time Shape Manipulation using Deformable Curve-Skeleton

  • Sohn, Eisung
    • Journal of Korea Multimedia Society
    • /
    • v.22 no.4
    • /
    • pp.491-501
    • /
    • 2019
  • Variational methods, which cast deformation as an energy-minimization problem, are known to provide a good trade-off between practicality and speed. However, the time required to deform a fully detailed shape means that these methods are largely unsuitable for real-time applications. We simplify a 2D shape into a curve skeleton, which can be deformed much more rapidly than the original shape. The curve skeleton also provides a simplified control for the user, utilizing a small number of control handles. Our system deforms the curve skeleton using an energy-minimization method and then applies the resulting deformation to the original shape using linear blend skinning. This approach effectively reduces the size of the variational optimization problem while producing deformations of a similar quality to those obtained from full-scale nonlinear variational methods.
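The final step described above, transferring the skeleton's deformation to the original shape with linear blend skinning, can be sketched in 2D as follows. The function and argument names are illustrative, not from the paper:

```python
import numpy as np

def linear_blend_skinning(points, weights, transforms):
    """Deform 2D points by blending per-bone affine transforms.
    points:     (n, 2) rest-pose vertex positions
    weights:    (n, b) skinning weights, each row summing to 1
    transforms: (b, 3, 3) homogeneous transform per skeleton bone"""
    n = points.shape[0]
    homo = np.hstack([points, np.ones((n, 1))])            # (n, 3)
    # Apply every bone transform to every vertex: (n, b, 3).
    per_bone = np.einsum('bij,nj->nbi', transforms, homo)
    # Blend the per-bone results by the skinning weights: (n, 3).
    blended = np.einsum('nb,nbi->ni', weights, per_bone)
    return blended[:, :2]
```

Because the energy minimization only has to solve for the skeleton's few control handles, and this blend is a cheap linear pass over the vertices, the overall deformation stays real-time.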

Interactive Facial Expression Animation of Motion Data using CCA (CCA 투영기법을 사용한 모션 데이터의 대화식 얼굴 표정 애니메이션)

  • Kim Sung-Ho
    • Journal of Internet Computing and Services
    • /
    • v.6 no.1
    • /
    • pp.85-93
    • /
    • 2005
  • This paper describes how to distribute a vast quantity of high-dimensional facial expression data over a suitable space and produce facial expression animations by selecting expressions as the animator navigates this space in real time. We constructed the facial expression space using about 2,400 facial expression frames; the space is built by computing the shortest distance between any two expressions. The expression space is a manifold space, and the distance between two points in it is approximated as follows: after defining an expression state vector that describes each facial state using a distance matrix recording the distances between markers, two expressions whose linear distance is shorter than a chosen threshold are considered adjacent, and that distance is taken as an approximation of the shortest (manifold) distance between them. Once the distances between adjacent expressions are determined, the Floyd algorithm connects these adjacency distances to yield the shortest distance between any two expressions. We use the CCA (Curvilinear Component Analysis) technique to visualize the multi-dimensional expression space in two dimensions. As animators navigate this two-dimensional space, they produce facial animation in real time through the user interface.

  • PDF

Development and Evaluation of e-EBPP(Evidence-Based Practice Protocol) System for Evidence-Based Dementia Nursing Practice (근거중심 치매 간호실무를 위한 e-EBPP 시스템 개발 및 평가)

  • Park, Myonghwa
    • Korean Journal of Adult Nursing
    • /
    • v.17 no.3
    • /
    • pp.411-424
    • /
    • 2005
  • Purpose: The purpose of this study was to develop and evaluate an e-EBPP (Evidence-Based Practice Protocol) system for the nursing care of patients with dementia, to facilitate the best evidence-based decisions in dementia care settings. Method: The system was developed based on the system development life cycle and software prototyping, using the following five processes: analysis, planning, development, program operation, and final evaluation. Result: The system consisted of modules for evidence-based nursing and protocols, a guide for developing protocols, tools for saving, revising, and deleting protocols, an interface tool among users, and a tool for evaluating users' satisfaction with the system. The main page offered seven menus: Introduction of Site, EBN Info, Dementia Info, Evidence Based Practice Protocol, Protocol Bank, Community, and Site Link. The system was implemented with HTML, JavaScript, and Flash, and its content consisted of text, interactive content, animation, and quizzes. Conclusion: This system can support nurses' best and most cost-effective clinical decisions through sharable, standardized protocols embodying the best evidence in dementia care. In addition, it can be utilized as an e-learning program for nurses and nursing students to learn the use of evidence-based information.

  • PDF

Study on the Design and Usability factor analysis of Web Multimedia Contents (웹 멀티미디어 컨텐츠의 디자인과 유용성분석에 대한 연구)

  • Koh, Eun-Young;Shin, Soon-Ho
    • Archives of design research
    • /
    • v.17 no.4
    • /
    • pp.69-78
    • /
    • 2004
  • This thesis investigates the relationship between web multimedia and design. Today's web sites need many approaches that increase the user's intuitive grasp of the information being communicated. In the digital era, multimedia is composed of such varied elements as text, image, sound, animation, and video. The method of the thesis follows these research questions. First, the structuring elements of web sites were classified by form and content; the resulting types are HTML, Flash, and mixed. Second, a questionnaire was administered to university students by random sampling; the goal of this survey was to learn how users understand web multimedia design. The results of the analysis may be summarized as follows: 1) The image design of the NIKE web site was evaluated as the best design effected by Flash, and the study finds that the NIKE web site needs more support for information and interactive design. 2) The SAMSUNG web site was evaluated as having good design effected by text and image; based on HTML, SAMSUNG needs support for motion image design. In conclusion, this thesis suggests that each web medium should be constructed from different web design components, according to the information, interactive, and image design of the site.

  • PDF

Development of A News Event Reenactment System (사건재연 시스템 개발)

  • Yun, Yeo-Chun;Byun, Hae-Won;Jeon, Seong-Kyu;Park, Chang-Seop
    • Journal of Broadcast Engineering
    • /
    • v.7 no.1
    • /
    • pp.21-27
    • /
    • 2002
  • This paper presents a news event reenactment system (NERS), which generates virtual character animations in a quick and convenient manner. NERS can thus be used to produce computer graphics (CG) scenes of news events that are hard to photograph, such as fires, traffic accidents, and murder cases. Using a large store of captured motion data and CG model data, the system produces an appropriate animation of virtual characters directly, without any motion capture device or actors at the authoring stage. NERS is designed to make virtual characters move along user-defined paths, to stitch motions smoothly, and to modify the positions of a virtual character's articulations in a specific frame. A virtual character can therefore be controlled precisely so as to interact with the virtual environment and other characters. NERS provides both an interactive and a script-based (MEL: Maya Embedded Language) interface so that users can operate the system conveniently. The system has been implemented as a plug-in for a commercial CG tool, Maya (Alias|Wavefront), in order to make use of its advanced functions.
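Motion stitching of the kind NERS performs is commonly done by cross-fading overlapping frames of two clips. A minimal sketch under that assumption (the paper does not give its blending method; production systems typically blend joint rotations, e.g. with quaternion slerp, rather than raw degrees of freedom):

```python
import numpy as np

def stitch_motions(clip_a, clip_b, blend_frames):
    """Concatenate two motion clips of shape (frames, dofs),
    cross-fading the last `blend_frames` of clip_a with the first
    `blend_frames` of clip_b so the transition has no visible jump."""
    k = blend_frames
    w = np.linspace(0.0, 1.0, k)[:, None]        # fade-in weights for clip_b
    blended = (1.0 - w) * clip_a[-k:] + w * clip_b[:k]
    return np.vstack([clip_a[:-k], blended, clip_b[k:]])
```

A larger `blend_frames` gives a smoother but slower transition; aligning the clips' root positions before blending (as path-following systems do) avoids sliding artifacts.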