• Title/Summary/Keyword: PAC-Bayesian

Search Result 1

MARGIN-BASED GENERALIZATION FOR CLASSIFICATIONS WITH INPUT NOISE

  • Choe, Hi Jun; Koh, Hayeong; Lee, Jimin
    • Journal of the Korean Mathematical Society / v.59 no.2 / pp.217-233 / 2022
  • Although machine learning shows state-of-the-art performance in a variety of fields, a theoretical understanding of how it works is still lacking. Recently, theoretical approaches have been actively studied, and one line of results concerns the margin and its distribution. In this paper we focus on the role of the margin under perturbations of the inputs and the parameters. We prove generalization bounds for two cases, a linear model for binary classification and neural networks for multi-class classification, when the inputs carry normally distributed random noise. For binary classification, the additional generalization term caused by the random noise is related to the margin and is exponentially inversely proportional to the noise level. For neural networks, the additional term depends on (input dimension) × (norms of the input and the weights). Both results are obtained within the PAC-Bayesian framework. By treating random noise and the margin together, the paper contributes to a better understanding of model sensitivity and of robust generalization.
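
A minimal sketch of the PAC-Bayesian ingredients this abstract refers to, for orientation only: the paper's own inequalities are not reproduced in the entry, so both displays below are standard textbook forms supplied here as assumptions, not the authors' results. The first is the McAllester-style PAC-Bayesian bound: for any prior P over hypotheses, any δ ∈ (0, 1], and an i.i.d. sample S of size m, with probability at least 1 - δ, simultaneously for all posteriors Q,

\[
\mathbb{E}_{h \sim Q}\bigl[L(h)\bigr] \;\le\; \mathbb{E}_{h \sim Q}\bigl[\widehat{L}_S(h)\bigr] \;+\; \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{2\sqrt{m}}{\delta}}{2m}}.
\]

The second is a Gaussian tail estimate illustrating how a margin and an input-noise level can interact for a linear classifier w: for a point (x, y) with y ∈ {-1, +1}, margin γ = y⟨w, x⟩ > 0, and noise ε ~ N(0, σ²I),

\[
\Pr_{\varepsilon}\bigl[\, y \,\langle w, x + \varepsilon \rangle \le 0 \,\bigr] \;\le\; \exp\!\Bigl(-\frac{\gamma^{2}}{2\,\sigma^{2}\,\lVert w \rVert^{2}}\Bigr),
\]

which is the kind of margin-versus-noise trade-off the abstract describes for the binary case.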