• Title/Summary/Keyword: Hierarchical Multi-Task Learning (계층적 다중 작업 학습)


Hierarchical multi-task learning with self-supervised auxiliary task (HiSS: 자기 지도 보조 작업을 결합한 계층적 다중 작업 학습)

  • Seunghan Lee;Taeyoung Park
    • The Korean Journal of Applied Statistics
    • /
    • v.37 no.5
    • /
    • pp.631-641
    • /
    • 2024
  • Multi-task learning is a popular approach in machine learning that learns multiple related tasks simultaneously by sharing information across them. In this paper, we consider a hierarchical structure across multiple related tasks, with a hierarchy of sub-tasks under the same main task, where the representations used to solve the sub-tasks share more information through task-specific layers, globally shared layers, and locally shared layers. We thus propose hierarchical multi-task learning with a self-supervised auxiliary task (HiSS), a novel approach in which self-supervised learning serves as an auxiliary task. The goal of the auxiliary task is to extract additional latent information from the unlabeled data by predicting a cluster label derived directly from the data. The proposed approach is tested on the Hyodoll dataset, which consists of user information and activity logs of elderly individuals collected by AI companion robots, for predicting emergency calls based on the time of day and month. The proposed algorithm is more efficient than other well-known machine learning algorithms, as it requires only a single model regardless of the number of tasks, and it demonstrates superior performance on classification tasks across various metrics. The source code is available at: https://github.com/seunghan96/HiSS.
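The three-level sharing the abstract describes (globally shared, locally shared, and task-specific layers) can be sketched as follows. This is a minimal illustration, not the authors' implementation (see the linked repository for that): the layers are stubbed as affine maps on scalars, and the main-task/sub-task names ("emergency", "time_of_day", "month") are assumptions loosely based on the abstract's emergency-call prediction setup.

```python
# Sketch of three-level parameter sharing in hierarchical multi-task learning.
# All names and numeric weights are illustrative assumptions.

def affine(w, b):
    """Stub 'layer': a scalar affine map."""
    return lambda x: w * x + b

# Globally shared layer: used by every task.
g = affine(0.5, 0.1)

# Locally shared layers: one per main task, shared by its sub-tasks.
local = {"emergency": affine(1.2, 0.0)}

# Task-specific heads: one per (main task, sub-task) pair.
head = {
    ("emergency", "time_of_day"): affine(2.0, -0.5),
    ("emergency", "month"): affine(1.5, 0.2),
}

def forward(x, main_task, sub_task):
    # Representation flows globally shared -> locally shared -> task-specific,
    # so sub-tasks under the same main task share both g and local[main_task].
    return head[(main_task, sub_task)](local[main_task](g(x)))
```

Because every sub-task reuses the same `g` (and sub-tasks of one main task reuse the same local layer), a single model serves all tasks, which is the efficiency argument made in the abstract.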

Performance Comparison Analysis on Named Entity Recognition system with Bi-LSTM based Multi-task Learning (다중작업학습 기법을 적용한 Bi-LSTM 개체명 인식 시스템 성능 비교 분석)

  • Kim, GyeongMin;Han, Seunggnyu;Oh, Dongsuk;Lim, HeuiSeok
    • Journal of Digital Convergence
    • /
    • v.17 no.12
    • /
    • pp.243-248
    • /
    • 2019
  • Multi-Task Learning (MTL) is a training method in which a single neural network is trained on multiple tasks that influence one another. In this paper, we compare the performance of an MTL-based named entity recognition (NER) model trained on a Korean traditional-culture corpus against single-task NER models. During training, the Bi-LSTM layers for part-of-speech (POS) tagging and for NER branch from a shared Bi-LSTM layer, and their losses are combined into a joint loss. As a result, the MTL-based Bi-LSTM model shows a 1.1%~4.6% performance improvement over single-task Bi-LSTM models.
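The joint loss mentioned above is commonly a weighted sum of the per-task losses, so that gradients from both POS tagging and NER flow into the shared Bi-LSTM. The following one-liner is a hedged sketch of that combination; the abstract does not state the weighting scheme, so the weights here are illustrative assumptions.

```python
# Sketch of a joint multi-task loss (weights are illustrative, not from the paper).

def joint_loss(loss_ner, loss_pos, w_ner=1.0, w_pos=0.5):
    # Backpropagating this single scalar sends gradients from both tasks
    # through the shared Bi-LSTM layer at once.
    return w_ner * loss_ner + w_pos * loss_pos
```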

Claim-Evidence Pair Extraction Model using Hierarchical Label Embedding (계층적 레이블 임베딩을 이용한 주장-증거 쌍 추출 모델)

  • Yujin Sim;Damrin Kim;Tae-il Kim;Sung-won Choi;Harksoo Kim
    • Annual Conference on Human and Language Technology
    • /
    • 2023.10a
    • /
    • pp.474-478
    • /
    • 2023
  • Argument mining is a field of natural language processing that identifies, analyzes, and extracts argument structures and their components from unstructured text. Claim-evidence pair extraction, a sub-task of argument mining, automatically extracts pairs of claims and supporting evidence from a given document. In this paper, we propose a hierarchical LAN method that uses document-level context and reflects the dependency between claims and evidence for effective claim-evidence pair extraction. Experiments show that the dependent structure, in which claims and evidence exploit each other's information, outperforms the independent structure, and the final proposed model achieves a 13.5% improvement in Macro F1.
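The dependent structure the abstract reports as superior can be sketched as a claim-then-evidence pipeline in which the evidence scorer conditions on the selected claim's representation, rather than scoring sentences independently. This is a hypothetical simplification of the paper's hierarchical label-embedding model; the scoring functions and vector representations below are assumptions for illustration only.

```python
# Sketch of dependent claim-evidence pair extraction (illustrative only).
# Sentences are stubbed as plain feature vectors (lists of floats).

def score_claim(sent_vec):
    # Independent claim score for each sentence.
    return sum(sent_vec)

def score_evidence(sent_vec, claim_vec):
    # Dependency: the evidence score uses the selected claim's representation
    # (here, a simple dot product).
    return sum(a * b for a, b in zip(sent_vec, claim_vec))

def extract_pair(sentences):
    claim = max(sentences, key=score_claim)
    evidence = max((s for s in sentences if s is not claim),
                   key=lambda s: score_evidence(s, claim))
    return claim, evidence
```

An independent baseline would score evidence without `claim_vec`; the abstract's 13.5% Macro-F1 gain is attributed to exactly this kind of mutual-information sharing between the two decisions.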
