• Title/Summary/Keyword: avatar interaction


Effect of Allyl Modified/Silane Modified Multiwalled Carbon Nano Tubes on the Electrical Properties of Unsaturated Polyester Resin Composites

  • Swain, Sarojini;Sharma, Ram Avatar;Patil, Sandip;Bhattacharya, Subhendu;Gadiyaram, Srinivasa Pavan;Chaudhari, Lokesh
    • Transactions on Electrical and Electronic Materials / v.13 no.6 / pp.267-272 / 2012
  • Considering the properties of carbon nanotubes (CNTs), their inclusion in a polymer matrix should vastly improve the properties of the resultant composite. In practice, however, this is often not the case because of poor interfacial adhesion between the CNTs and the polymer matrix. The present approach focuses on increasing the interaction between the polymer matrix and the CNTs through chemical modification of the CNTs, yielding allyl ester functionalized carbon nanotubes (ACNTs) and silane functionalized carbon nanotubes (SCNTs) that are capable of reacting with the polymer matrix during the curing reaction. The addition of ACNTs/SCNTs to unsaturated polyester resin (UPR) improved the electrical properties of the resulting nanocomposites in comparison to unmodified CNTs: the surface resistivity, volume resistivity, dielectric strength, dry arc resistance, and comparative tracking index were all significantly improved. The chemical modification of the CNTs was confirmed by spectroscopy.
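
The abstract above reports surface and volume resistivity among the measured properties. For orientation only, the sketch below applies the standard ASTM D257-style relations between a measured resistance, the electrode geometry, and the reported resistivity; the formulas are textbook relations, and the example dimensions and values are illustrative, not taken from the paper.

```typescript
// Standard relations for converting a measured resistance into the resistivities
// named in the abstract (illustrative only; values are not from the paper).

// Volume resistivity (ohm*cm): rho_v = R_v * A / t, where R_v is the measured
// volume resistance (ohm), A the effective electrode area (cm^2), and t the
// specimen thickness (cm).
function volumeResistivity(resistanceOhm: number, electrodeAreaCm2: number, thicknessCm: number): number {
  return (resistanceOhm * electrodeAreaCm2) / thicknessCm;
}

// Surface resistivity (ohm per square) for two parallel bar electrodes:
// rho_s = R_s * L / g, where L is the electrode length (cm) and g the gap
// between the electrodes (cm).
function surfaceResistivity(resistanceOhm: number, electrodeLengthCm: number, gapCm: number): number {
  return (resistanceOhm * electrodeLengthCm) / gapCm;
}

// Example: a 1 mm thick specimen measured with a 20 cm^2 electrode (hypothetical numbers).
console.log(volumeResistivity(5e12, 20, 0.1).toExponential(2), "ohm*cm");
console.log(surfaceResistivity(2e13, 10, 1).toExponential(2), "ohm/sq");
```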

Study on Use the Metaverse Platform in Fashion Design (패션디자인 분야의 메타버스 플랫폼 활용 연구)

  • Ryu, Kyoung ok
    • Journal of the Korea Fashion and Costume Design Association / v.25 no.2 / pp.31-44 / 2023
  • Fashion design in the metaverse is not simply 3D avatars or virtual clothing; it is an important clue to shopping trends, and the role of fashion design has grown even further because avatars and humans are identified with each other and act together. This study attempts to understand the metaverse platform accurately and to determine the scope of fashion design within it. In addition, it aims to provide basic data that can expand the field by using fashion design in various ways on metaverse platforms. The study investigated and analyzed various metaverse fashion cases, articles, software, and methods used by metaverse fashion creators, with the following results. First, the metaverse platform is a new level of virtual interaction in which users and creators communicate through the convergence of augmented reality, lifelogging, mirror worlds, and virtual worlds. Second, most users of metaverse platforms are Generation Z, and metaverse creators who earn money by producing avatars or items, including fashion designs, are emerging as a new job field. Third, many fashion brands have created spaces on metaverse platforms, collaborated with games, or held fashion weeks for publicity, marketing, and sales. Fourth, as 3D tools for metaverse fashion creators, open-source software is easier to use and free of charge compared to programs for fashion design specialists, and most costumes can be reproduced with it, so fashion design majors should find it easy to adopt.

Color and Blinking Control to Support Facial Expression of Robot for Emotional Intensity (로봇 감정의 강도를 표현하기 위한 LED 의 색과 깜빡임 제어)

  • Kim, Min-Gyu;Lee, Hui-Sung;Park, Jeong-Woo;Jo, Su-Hun;Chung, Myung-Jin
    • Proceedings of the HCI Society of Korea Conference (한국HCI학회 학술대회논문집) / 2008.02a / pp.547-552 / 2008
  • Humans and robots will have a closer relationship in the future, and we can expect the interaction between them to become more intense. To take advantage of people's innate ability to communicate, researchers have so far concentrated on facial expression. For a robot to express emotional intensity, however, other modalities such as gesture, movement, sound, and color are also needed. This paper suggests that the intensity of an emotion can be expressed with color and blinking, so that the result can be applied to LEDs. Color and emotion are clearly related; however, previous results are difficult to implement due to the lack of quantitative data. In this paper, we determined the color and blinking period used to express the six basic emotions (anger, sadness, disgust, surprise, happiness, fear). The mapping was implemented on an avatar, and the perceived intensities of the emotions were evaluated through a survey. We found that color and blinking helped to express the intensity of emotion for sadness, disgust, and anger. For fear, happiness, and surprise, color and blinking did not play an important role; however, they may be improved by adjusting the color or blinking. An illustrative sketch of such a mapping follows this entry.

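The abstract above describes assigning a color and a blinking period to each of the six basic emotions and scaling the cue with emotional intensity. The TypeScript sketch below illustrates that idea; the RGB values, blink periods, and the intensity-scaling rule are hypothetical placeholders, not the values determined in the study.

```typescript
// Hypothetical emotion-to-LED mapping in the spirit of the paper. The RGB values
// and blink periods below are illustrative placeholders, NOT the study's results.

type Emotion = "anger" | "sadness" | "disgust" | "surprise" | "happiness" | "fear";

interface LedCue {
  rgb: [number, number, number]; // base color, 0-255 per channel
  blinkPeriodMs: number;         // length of one on/off cycle
}

const baseCues: Record<Emotion, LedCue> = {
  anger:     { rgb: [255,   0,   0], blinkPeriodMs: 400 },
  sadness:   { rgb: [  0,   0, 255], blinkPeriodMs: 2000 },
  disgust:   { rgb: [  0, 128,   0], blinkPeriodMs: 1200 },
  surprise:  { rgb: [255, 255,   0], blinkPeriodMs: 300 },
  happiness: { rgb: [255, 165,   0], blinkPeriodMs: 800 },
  fear:      { rgb: [128,   0, 128], blinkPeriodMs: 500 },
};

// Scale the cue by emotional intensity in [0, 1]: a stronger emotion gets a
// brighter color and a faster blink, a weaker one a dimmer, slower cue.
function cueForIntensity(emotion: Emotion, intensity: number): LedCue {
  const t = Math.min(Math.max(intensity, 0), 1);
  const { rgb, blinkPeriodMs } = baseCues[emotion];
  return {
    rgb: rgb.map(c => Math.round(c * (0.3 + 0.7 * t))) as [number, number, number],
    blinkPeriodMs: Math.round(blinkPeriodMs * (1.5 - t)), // shorter period when intense
  };
}

console.log(cueForIntensity("anger", 0.9)); // bright red, rapid blink
```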

Real-time Interactive Animation System for Low-Priced Motion Capture Sensors (저가형 모션 캡처 장비를 이용한 실시간 상호작용 애니메이션 시스템)

  • Kim, Jeongho;Kang, Daeun;Lee, Yoonsang;Kwon, Taesoo
    • Journal of the Korea Computer Graphics Society / v.28 no.2 / pp.29-41 / 2022
  • In this paper, we introduce a novel real-time interactive animation system that uses real-time motion input from Kinect, a low-cost motion-sensing device. Our system generates interaction motions between the user character and a counterpart character in real time. While the motion of the user character is generated by mimicking the user's input motion, the other character's motion is decided so as to react to the user avatar's motion. During a pre-processing step, our system analyzes the reference motion data and builds a mapping model in advance. At run time, the system first generates initial poses for the two characters and then modifies them to produce plausible interacting behavior. Our experimental results show plausible interaction animations in which the user character performs a modified version of the user's input motion and the counterpart character reacts appropriately to it. The proposed method will be useful for developing real-time interactive animation systems that provide a more immersive experience for users.
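
As a rough illustration of the two-stage run-time flow described above (initial poses first, then interaction-aware modification), the following sketch stubs out one frame of such a pipeline. The pose representation, the mapping-model interface, and every helper here are assumptions made for illustration; the paper's actual data structures and algorithms are not given in the abstract.

```typescript
// Hypothetical run-time loop in the spirit of the described system. All types,
// helpers, and stub bodies are illustrative assumptions, not the paper's API.

interface Pose { joints: number[][]; }  // e.g. per-joint rotation parameters

// Stub for the Kinect input (the real system reads sensor data every frame).
function readKinectPose(): Pose { return { joints: [] }; }

// The user character simply mimics the captured input pose.
function retargetToCharacter(input: Pose): Pose { return input; }

// Mapping model built in the pre-processing step from the reference motion data:
// given the user character's pose, it proposes a reacting pose for the counterpart.
interface MappingModel { reactTo(userPose: Pose): Pose; }

// Modify both poses so the interaction stays plausible
// (e.g. keep contact, avoid interpenetration); identity stub here.
function resolveInteraction(user: Pose, counterpart: Pose): [Pose, Pose] {
  return [user, counterpart];
}

// One frame of the run-time loop: generate initial poses, then adjust them.
function runFrame(model: MappingModel): [Pose, Pose] {
  const userPose = retargetToCharacter(readKinectPose());
  const counterpartPose = model.reactTo(userPose);
  return resolveInteraction(userPose, counterpartPose);
}
```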

A Study on The Metaverse Content Production Pipeline using ZEPETO World (제페토 월드를 활용한 메타버스 콘텐츠 제작 공정에 관한 연구)

  • Park, MyeongSeok;Cho, Yunsik;Cho, Dasom;Na, Giri;Lee, Jamin;Cho, Sae-Hong;Kim, Jinmo
    • Journal of the Korea Computer Graphics Society / v.28 no.3 / pp.91-100 / 2022
  • This study proposes a metaverse content production pipeline using ZEPETO World, one of the representative metaverse platforms in Korea. Based on the Unity 3D engine, a ZEPETO world is configured using the ZEPETO template, and the core functions of metaverse content that enable multi-user participation, such as logic, interaction, and property control, are implemented through ZEPETO scripts. This study utilizes the basic functions of the ZEPETO script, such as properties, events, and components, as well as the ZEPETO Player, which includes avatar loading, character movement, and camera control. In addition, based on ZEPETO properties such as World Multiplayer and Client Starter, it summarizes the core synchronization process required for producing multiplayer metaverse content, such as object transformation, dynamic object creation, property addition, and real-time property control. Based on this, we verify the proposed production pipeline by directly producing multiplayer metaverse content with ZEPETO World.
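
For context, a ZEPETO world script is a TypeScript class extending ZepetoScriptBehaviour with Unity-style lifecycle methods; the minimal component below follows that standard template. Only the import and base class reflect ZEPETO's documented boilerplate; the exposed property and logging are illustrative assumptions, and the multiplayer synchronization described above is configured through ZEPETO's World Multiplayer components rather than re-implemented here.

```typescript
// Minimal ZEPETO Script component sketch (TypeScript), following the standard
// template: a class extending ZepetoScriptBehaviour with Start/Update lifecycle.
import { ZepetoScriptBehaviour } from 'ZEPETO.Script'

export default class InteractionLogger extends ZepetoScriptBehaviour {

    // Illustrative field exposed to the Unity inspector (assumed usage).
    public greeting: string = "Welcome to the sample world";

    Start() {
        // Called once when the world scene is loaded.
        console.log(this.greeting);
    }

    Update() {
        // Per-frame logic: interaction handling, property control, etc.
    }
}
```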