• Title/Summary/Keyword: stale gradient (오래된 그래디언트)

Search Result 1

Dynamic Window Adjustment and Model Stability Improvement Algorithm for K-Asynchronous Federated Learning (K-비동기식 연합학습의 동적 윈도우 조절과 모델 안정성 향상 알고리즘)

  • HyoSang Kim; Taejoon Kim
    • Journal of Korea Society of Industrial Information Systems, v.28 no.4, pp.21-34, 2023
  • Federated learning is divided into synchronous and asynchronous federated learning. Asynchronous federated learning has a time advantage over synchronous federated learning, but it still faces several challenges in achieving better performance. In particular, preventing performance degradation on non-IID training datasets, selecting appropriate clients, and managing stale gradient information are important for improving model performance. In this paper, we address K-asynchronous federated learning on non-IID datasets. Unlike the traditional approach that uses a static K, we propose an algorithm that adaptively adjusts K, which reduces the learning time. We also show that model performance is improved by handling stale gradients. Finally, we apply a method that judges model performance to obtain strong model stability. Experimental results show that the overall algorithm reduces training time, improves model accuracy, and improves model stability.
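
The abstract does not give the paper's concrete update rules, so the following is only a minimal sketch of the general ideas it mentions: a server that aggregates the first K client updates per round, down-weights stale gradients, and adjusts K dynamically. The functions `staleness_weight`, `aggregate_k_async`, and `adjust_k`, the decay exponent, and the `target_time` heuristic are all illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def staleness_weight(staleness, alpha=0.5):
    # Assumed rule: down-weight a gradient as its staleness
    # (rounds since the client pulled the global model) grows.
    return (1.0 + staleness) ** (-alpha)

def aggregate_k_async(global_model, updates, lr=1.0):
    # updates: list of (gradient, staleness) pairs from the first
    # K clients that reported in this round.
    weights = np.array([staleness_weight(s) for _, s in updates])
    weights /= weights.sum()
    avg_grad = sum(w * g for w, (g, _) in zip(weights, updates))
    return global_model - lr * avg_grad

def adjust_k(k, round_time, target_time, k_min=2, k_max=16):
    # Assumed dynamic-window rule: wait for more clients when rounds
    # finish faster than the target, fewer when they run long.
    if round_time < target_time:
        return min(k + 1, k_max)
    return max(k - 1, k_min)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim, k = 10, 4
    model = np.zeros(dim)
    true_w = rng.normal(size=dim)

    for rnd in range(50):
        # Simulate the first K arriving clients, each with a noisy
        # gradient toward true_w and a random staleness value.
        updates = []
        for _ in range(k):
            grad = (model - true_w) + 0.1 * rng.normal(size=dim)
            updates.append((grad, rng.integers(0, 5)))
        model = aggregate_k_async(model, updates, lr=0.3)

        # Simulated round duration drives the dynamic-K adjustment.
        round_time = rng.uniform(0.5, 1.5)
        k = adjust_k(k, round_time, target_time=1.0)

    print("final error:", np.linalg.norm(model - true_w), "final K:", k)
```

In this sketch the staleness weighting and the window adjustment are independent knobs; the paper additionally reports a model-stability check, which is not modeled here.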