• Title/Summary/Keyword: multimodal input


A Triple Residual Multiscale Fully Convolutional Network Model for Multimodal Infant Brain MRI Segmentation

  • Chen, Yunjie;Qin, Yuhang;Jin, Zilong;Fan, Zhiyong;Cai, Mao
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.14 no.3
    • /
    • pp.962-975
    • /
    • 2020
  • Accurate segmentation of infant brain MR images into white matter (WM), gray matter (GM), and cerebrospinal fluid (CSF) is very important for the early study of brain growth patterns and of morphological changes in neurodevelopmental disorders. Because of the inherent myelination and maturation process, the WM and GM of infants between 6 and 9 months of age exhibit similar intensity levels in both T1-weighted (T1w) and T2-weighted (T2w) MR images during this isointense phase, which makes brain tissue segmentation very difficult. We propose a deep network architecture based on U-Net, called the Triple Residual Multiscale Fully Convolutional Network (TRMFCN), whose structure has three input gates and inserts two new blocks: a residual multiscale block and a concatenate block. In evaluations of WM, GM, and CSF segmentation, our model outperforms U-Net and several cutting-edge U-Net-based deep networks. The data set used for training and testing comes from the iSeg-2017 challenge (http://iseg2017.web.unc.edu).
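The core idea of the residual multiscale block described above can be sketched very simply: filter the input at several scales, fuse the results, and add a skip connection back to the input. The 1-D NumPy version below is only an illustration of that pattern, not the paper's actual 3-D convolutional block; the function name and kernel choices are hypothetical.

```python
import numpy as np

def residual_multiscale_block(x, kernels):
    """Convolve the 1-D input at several scales (one kernel per scale),
    fuse the branches by summation, and add a residual (skip) connection.
    Purely illustrative stand-in for the paper's 3-D block."""
    branches = [np.convolve(x, k, mode="same") for k in kernels]
    fused = np.sum(branches, axis=0)  # multiscale fusion
    return x + fused                  # residual connection

# Example: two smoothing kernels of different widths act as two "scales".
signal = np.arange(8, dtype=float)
out = residual_multiscale_block(signal, [np.ones(3) / 3, np.ones(5) / 5])
```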

Deep Learning-Based Companion Animal Abnormal Behavior Detection Service Using Image and Sensor Data

  • Lee, JI-Hoon;Shin, Min-Chan;Park, Jun-Hee;Moon, Nam-Mee
    • Journal of the Korea Society of Computer and Information
    • /
    • v.27 no.10
    • /
    • pp.1-9
    • /
    • 2022
  • In this paper, we propose a deep learning-based companion animal abnormal behavior detection service that uses video and sensor data. With the recent increase in households with companion animals, the pet tech industry built on artificial intelligence is growing alongside the existing food- and medicine-oriented companion animal market. In this study, companion animal behavior was classified and abnormal behavior was detected with a deep learning model fed with diverse data, enabling AI-driven health management of companion animals. Video data and sensor data are collected using CCTV and a custom-built pet wearable device and used as input to the model. The video data are processed by combining the YOLO (You Only Look Once) model, which detects the companion animal, with DeepLabCut, which extracts joint coordinates for behavior classification. To process the sensor data, a GAT (Graph Attention Network), which can capture the correlations and characteristics of the individual sensors, is used.
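The GAT mentioned above weights each sensor's features by learned attention over its graph neighbours. A single-head attention layer in the style of the original GAT can be sketched in plain NumPy as below; the weights here are random stand-ins, not the paper's trained parameters.

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def gat_layer(h, adj, W, a):
    """Single-head graph attention layer, as a plain-NumPy sketch.
    h: (N, F) node (sensor) features; adj: (N, N) adjacency with self-loops;
    W: (F, Fp) projection matrix; a: (2*Fp,) attention vector."""
    z = h @ W                      # project node features
    N = z.shape[0]
    e = np.empty((N, N))
    for i in range(N):             # attention logits e[i, j] =
        for j in range(N):         #   LeakyReLU(a . [z_i || z_j])
            e[i, j] = leaky_relu(np.concatenate([z[i], z[j]]) @ a)
    e = np.where(adj > 0, e, -1e9)                   # mask non-neighbours
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)        # softmax over neighbours
    return alpha @ z               # attention-weighted feature aggregation

# Example: 4 sensors, fully connected, random projection and attention vector.
rng = np.random.default_rng(0)
h = rng.normal(size=(4, 3))
out = gat_layer(h, np.ones((4, 4)), rng.normal(size=(3, 2)), rng.normal(size=(4,)))
```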

Multicontents Integrated Image Animation with Synthesis for High-Quality Multimodal Video (A Multi-Content Integrated Animation Method via High-Quality Multimodal Video Synthesis)

  • Jae Seung Roh;Jinbeom Kang
    • Journal of Intelligence and Information Systems
    • /
    • v.29 no.4
    • /
    • pp.257-269
    • /
    • 2023
  • There is currently a burgeoning demand for image synthesis from photos and videos using deep learning models. Existing video synthesis models solely extract motion information from the provided video to generate animation effects on photos. However, these models encounter challenges in achieving accurate lip synchronization with the audio and in maintaining the image quality of the synthesized output. To tackle these issues, this paper introduces a novel framework based on an image animation approach. Upon receiving a photo, a video, and an audio input, the framework produces an output that retains the unique characteristics of the individuals in the photo, synchronizes their movements with the provided video, and achieves lip synchronization with the audio. Furthermore, a super-resolution model is employed to enhance the quality and resolution of the synthesized output.
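The super-resolution stage at the end of the pipeline above upscales the synthesized frames. As a trivial stand-in for such a learned model, nearest-neighbour upsampling illustrates only the resolution change; the actual model in the paper is a trained network, and the function below is a hypothetical placeholder.

```python
import numpy as np

def upscale_nearest(frame, factor):
    """Nearest-neighbour upsampling of an (H, W, C) frame by an integer
    factor -- a placeholder for a learned super-resolution model."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

# Example: a 32x48 RGB frame upscaled 4x becomes 128x192.
frame = np.zeros((32, 48, 3))
upscaled = upscale_nearest(frame, 4)
```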

Specifying the Characteristics of Tangible User Interfaces: Centered on Science Museum Installations (An Analysis of Tangible Interaction Design Characteristics: Focusing on Hands-on Exhibits in Science Museums)

  • Cho, Myung Eun;Oh, Myung Won;Kim, Mi Jeong
    • Science of Emotion and Sensibility
    • /
    • v.15 no.4
    • /
    • pp.553-564
    • /
    • 2012
  • Tangible user interfaces (TUIs) have been developed in the field of Human-Computer Interaction over the last decades, and the domains in which they are applied have recently extended into product design and interactive art. Tangible user interfaces combine digital information with physical objects or environments, providing tangible and intuitive interaction through input and output devices, often combined with Augmented Reality. This research developed a design guideline for tangible user interfaces based on key TUI properties defined in five representative studies: tangible interaction, intuitiveness and convenience, expressive representation, context-aware and spatial interaction, and social interaction. Using this guideline, which emphasizes user interaction, the research evaluated installations in a science museum in terms of the TUI characteristics they apply. The 15 selected installations are intended to educate visitors about science by emphasizing manipulation of, and experience with, their interfaces. According to their input devices, they are categorized into four types. TUI properties score highest in Type 3 installations, which use body motion for interaction, with the items for context-aware and spatial interaction rated particularly highly; context-aware and spatial interaction have recently been emphasized as extended properties of tangible user interfaces. Most installations in the science museum are equipped with buttons and joysticks for physical manipulation, so multimodal interfaces utilizing the visual, aural, and tactile senses need to be developed to provide more innovative interaction. Further, more installations need to be reconfigurable to support embodied interaction between users and the interactive space. The proposed design guideline can specify the characteristics of tangible user interfaces, and this research can thus serve as a basis for the development and application of installations involving more TUI properties in the future.
