• Title/Summary/Keyword: 로버스트 (robust)

Search Results: 172

Pattern Recognition using Robust Feedforward Neural Networks (로버스트 다층전방향 신경망을 이용한 패턴인식)

  • Hwang, Chang-Ha;Kim, Sang-Min
    • Journal of the Korean Data and Information Science Society
    • /
    • v.9 no.2
    • /
    • pp.345-355
    • /
    • 1998
  • The back propagation (BP) algorithm allows multilayer feedforward neural networks to learn input-output mappings from training samples. It iteratively adjusts the network parameters (weights) to minimize the sum of squared approximation errors using a gradient descent technique. However, the mapping acquired through the BP algorithm may be corrupted when erroneous training data are employed. In this paper two types of robust backpropagation algorithms are discussed, both from a theoretical point of view and in case studies of nonlinear regression function estimation and handwritten Korean character recognition. For future research we suggest a Bayesian learning approach to neural networks and a comparison with the two robust backpropagation algorithms.
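
A minimal sketch of the underlying idea, assuming a Huber-type loss in place of the squared error; the two robust backpropagation algorithms discussed in the paper are not reproduced here, and the network, learning rate, and data below are illustrative only.

```python
import numpy as np

# Sketch of a robust backpropagation step: a one-hidden-layer network trained by
# gradient descent, with the squared-error gradient replaced by the bounded Huber
# influence function so that grossly erroneous targets pull less on the weights.

def huber_psi(r, c=1.345):
    """Derivative of the Huber loss with respect to the residual r."""
    return np.where(np.abs(r) <= c, r, c * np.sign(r))

def train_robust_bp(X, y, n_hidden=5, lr=0.01, epochs=2000, seed=0):
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
    b2 = 0.0
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)               # hidden layer
        yhat = (h @ W2).ravel() + b2           # linear output unit
        g = -huber_psi(y - yhat) / len(y)      # d(robust loss)/d(yhat)
        W2 -= lr * (h.T @ g[:, None])
        b2 -= lr * g.sum()
        gh = (g[:, None] * W2.T) * (1 - h**2)  # backpropagate through tanh
        W1 -= lr * (X.T @ gh)
        b1 -= lr * gh.sum(axis=0)
    return W1, b1, W2, b2

# Toy nonlinear regression with a few grossly erroneous targets
rng = np.random.default_rng(1)
X = np.linspace(-2, 2, 80)[:, None]
y = np.sin(2 * X).ravel() + 0.05 * rng.normal(size=80)
y[::20] += 5.0                                 # erroneous training data
params = train_robust_bp(X, y)
```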


A Comparative Study on Optimization Procedures to Robust Design (로버스트설계에서 최적화방안에 대한 비교 연구)

  • Kwon, Yong-Man;Mun, In-Suk
    • Journal of the Korean Data and Information Science Society
    • /
    • v.11 no.1
    • /
    • pp.65-72
    • /
    • 2000
  • Robust design is an approach to reducing the performance variation of quality characteristic values in quality engineering. Taguchi parameter design has many advantages, but it also has some disadvantages, and various research efforts have aimed at developing alternative methods. In Taguchi parameter design, the product-array approach using orthogonal arrays is mainly used. However, it often requires an excessive number of experiments. An alternative approach, called the combined-array approach, was suggested by Welch et al. (1990) and studied by others. In this paper we make a comparative study of optimization procedures for robust design under the two different experimental design approaches (product array, combined array) through Monte Carlo simulation.
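
A small illustration of the run-count contrast between the product-array and combined-array approaches, together with the usual smaller-the-better S/N summary over the noise replicates; the array sizes and response values are fabricated and are not taken from the paper.

```python
import numpy as np

# Run-count contrast between the two approaches, plus the smaller-the-better S/N
# summary over noise replicates in a product array. All numbers are fabricated.

inner_runs, outer_runs = 9, 4                  # e.g., L9 control array x L4 noise array
print("product array runs :", inner_runs * outer_runs)   # 36 experiments
print("combined array runs:", 18)                         # e.g., a single L18 array

rng = np.random.default_rng(0)
# y[i, j]: response of control setting i under noise condition j
y = np.abs(rng.normal(loc=10, scale=2, size=(inner_runs, outer_runs)))

sn = -10 * np.log10(np.mean(y**2, axis=1))     # smaller-the-better S/N per control run
best = int(np.argmax(sn))
print("best control run:", best, "with S/N =", round(float(sn[best]), 2))
```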


A Criterion for the Selection of Principal Components in the Robust Principal Component Regression (로버스트주성분회귀에서 최적의 주성분선정을 위한 기준)

  • Kim, Bu-Yong
    • Communications for Statistical Applications and Methods
    • /
    • v.18 no.6
    • /
    • pp.761-770
    • /
    • 2011
  • Robust principal components regression is suggested to deal with both the multicollinearity and outlier problem. A main aspect of the robust principal components regression is the selection of an optimal set of principal components. Instead of the eigenvalue of the sample covariance matrix, a selection criterion is developed based on the condition index of the minimum volume ellipsoid estimator which is highly robust against leverage points. In addition, the least trimmed squares estimation is employed to cope with regression outliers. Monte Carlo simulation results indicate that the proposed criterion is superior to existing ones.
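
A rough sketch of the selection step, assuming an MCD-based robust covariance (from scikit-learn) as a stand-in for the paper's minimum volume ellipsoid estimator, a condition-index cutoff of 10, and ordinary least squares in place of the least trimmed squares fit:

```python
import numpy as np
from sklearn.covariance import MinCovDet       # MCD as a stand-in for the MVE estimator

rng = np.random.default_rng(0)
n, p = 200, 6
X = rng.normal(size=(n, p))
X[:, 3] = X[:, 0] + 0.05 * rng.normal(size=n)  # near-collinear column
beta = np.array([1.0, -2.0, 0.5, 0.0, 1.5, 0.0])
y = X @ beta + rng.normal(size=n)
X[:5] += 8.0                                   # a few leverage points

S = MinCovDet(random_state=0).fit(X).covariance_   # robust covariance estimate
eigval, eigvec = np.linalg.eigh(S)
eigval, eigvec = eigval[::-1], eigvec[:, ::-1]     # sort eigenpairs in descending order
cond_index = np.sqrt(eigval[0] / eigval)           # condition indices
keep = cond_index < 10                             # retain well-conditioned components
Z = (X - X.mean(axis=0)) @ eigvec[:, keep]         # principal component scores

# Regression on the selected components (plain OLS here; the paper uses LTS)
gamma, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), Z]), y, rcond=None)
print("components kept:", int(keep.sum()), "out of", p)
```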

A Robust EWMA Control Chart (로버스트 지수가중 이동평균(EWMA) 관리도)

  • Nam, Ho-Soo;Lee, Byung-Gun;Joo, Cheol-Min
    • Journal of the Korean Data and Information Science Society
    • /
    • v.10 no.1
    • /
    • pp.233-241
    • /
    • 1999
  • The control chart is a very extensively used tool for testing whether a process is in a state of statistical control or not. In this paper, we propose a robust EWMA (exponentially weighted moving averages) control chart for variables, which is based on Huber's M-estimator. Huber's M-estimator is a well-known robust estimator in the sense of distributional robustness. In the proposed chart, the estimation of the process deviation is modified to have a stable level and high power. To compare the performance of the proposed control chart with that of other charts, some Monte Carlo simulations were performed. The simulation results show that the robust EWMA control chart has good performance.
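
A minimal sketch of a Huber-based EWMA statistic; the target, smoothing constant, and control-limit constants below are illustrative choices rather than the paper's design.

```python
import numpy as np

def huber_location(x, c=1.345, tol=1e-8, max_iter=100):
    """Iteratively reweighted Huber M-estimate of location."""
    mu = np.median(x)
    s = np.median(np.abs(x - mu)) / 0.6745                 # MAD scale
    s = s if s > 0 else 1.0
    for _ in range(max_iter):
        r = (x - mu) / s
        w = np.where(np.abs(r) <= c, 1.0, c / np.abs(r))   # Huber weights
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu

def robust_ewma(subgroups, target, sigma, lam=0.2, L=3.0):
    """EWMA of Huber subgroup locations with time-varying control limits."""
    z, points = target, []
    for t, x in enumerate(subgroups, start=1):
        z = lam * huber_location(x) + (1 - lam) * z
        half = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
        points.append((z, target - half, target + half))
    return points

rng = np.random.default_rng(0)
groups = [rng.normal(10, 1, size=5) for _ in range(20)]
groups[7][0] += 15                                          # one gross outlier in subgroup 8
for t, (z, lcl, ucl) in enumerate(robust_ewma(groups, target=10, sigma=1 / np.sqrt(5)), 1):
    flag = "" if lcl < z < ucl else "out of control"
    print(f"{t:2d}  z = {z:6.3f}  limits = ({lcl:.3f}, {ucl:.3f})  {flag}")
```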


Robust Design of a Linear DC Motor Using Taguchi Method (다구찌 방법을 이용한 선형직류모터의 로버스트 설계)

  • 김성수;정수진;리영훈;김동희;노채균
    • Journal of the Korean Institute of Illuminating and Electrical Installation Engineers
    • /
    • v.15 no.4
    • /
    • pp.51-56
    • /
    • 2001
  • This paper is concerned with the robust design of a linear DC motor, which is spreading fast in OA and FA systems due to its simple structure, high-speed operation, and high-precision positioning. The approach is based on the Taguchi method and utilizes orthogonal arrays for the design of experiments. In this study, the important factors are chosen first, and then the concept of the signal-to-noise (S/N) ratio is applied to evaluate the motor performance and to determine the value of each design parameter. This method is useful for robust design in a short time. As a result, the performance of the motor is improved.
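
A brief illustration of the workflow the abstract describes, assuming a standard L9(3^4) orthogonal array, a larger-the-better S/N ratio, and fabricated response values; the paper's actual factors and measurements are not reproduced.

```python
import numpy as np

# Pick factor levels from an L9(3^4) orthogonal array by averaging a
# larger-the-better S/N ratio at each level of each factor.

L9 = np.array([  # standard L9 orthogonal array, 4 factors at 3 levels (1..3)
    [1, 1, 1, 1], [1, 2, 2, 2], [1, 3, 3, 3],
    [2, 1, 2, 3], [2, 2, 3, 1], [2, 3, 1, 2],
    [3, 1, 3, 2], [3, 2, 1, 3], [3, 3, 2, 1],
])
# Replicated responses for each of the 9 runs (e.g., thrust measurements), made up
y = np.array([[8.1, 8.3], [9.0, 8.7], [7.5, 7.8], [9.4, 9.2], [10.1, 9.8],
              [8.8, 9.0], [7.9, 8.2], [9.6, 9.3], [8.5, 8.6]])

sn = -10 * np.log10(np.mean(1.0 / y**2, axis=1))   # larger-the-better S/N per run

for f in range(4):                                  # main effect of each factor on S/N
    means = np.array([sn[L9[:, f] == lev].mean() for lev in (1, 2, 3)])
    print(f"factor {f}: level S/N means = {np.round(means, 2)}, "
          f"best level = {int(means.argmax()) + 1}")
```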


Algorithm for the Robust Estimation in Logistic Regression (로지스틱회귀모형의 로버스트 추정을 위한 알고리즘)

  • Kim, Bu-Yong;Kahng, Myung-Wook;Choi, Mi-Ae
    • The Korean Journal of Applied Statistics
    • /
    • v.20 no.3
    • /
    • pp.551-559
    • /
    • 2007
  • The maximum likelihood estimation is not robust against outliers in logistic regression. Thus we propose an algorithm for robust estimation, which identifies the bad leverage points and vertical outliers by a V-mask type criterion, and then strives to dampen the effect of outliers. Our main finding is that, by an appropriate selection of weights and factors, we could obtain logistic estimates with a high breakdown point. The proposed algorithm is evaluated by means of the correct classification rate on the basis of real-life and artificial data sets. The results indicate that the proposed algorithm is superior to the maximum likelihood estimation in terms of classification.
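
A sketch of the downweighting step only, as weighted maximum likelihood fitted by iteratively reweighted least squares; the V-mask type flagging rule itself is not reproduced, and the flagged observations and weights below are assumed.

```python
import numpy as np

def weighted_logistic(X, y, w, n_iter=25):
    """Weighted logistic MLE via iteratively reweighted least squares (Newton steps)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta = np.zeros(X1.shape[1])
    for _ in range(n_iter):
        eta = np.clip(X1 @ beta, -30, 30)
        p = 1.0 / (1.0 + np.exp(-eta))
        grad = X1.T @ (w * (y - p))                       # weighted score
        hess = X1.T @ (X1 * (w * p * (1 - p))[:, None])   # weighted information
        beta += np.linalg.solve(hess, grad)
    return beta

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
prob = 1.0 / (1.0 + np.exp(-(1 + 2 * X[:, 0] - X[:, 1])))
y = (rng.random(200) < prob).astype(float)
X[:3] += 10.0                                             # bad leverage points
y[:3] = 0.0
w = np.ones(200)
w[:3] = 0.05                                              # dampen the flagged observations
print(weighted_logistic(X, y, w))                         # roughly recovers (1, 2, -1)
```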

Robust group independent component analysis (로버스트 그룹 독립성분분석)

  • Kim, Hyunsung;Li, XiongZhu;Lim, Yaeji
    • The Korean Journal of Applied Statistics
    • /
    • v.34 no.2
    • /
    • pp.127-139
    • /
    • 2021
  • Independent Component Analysis is a popular statistical method for separating independent signals from mixed data, and Group Independent Component Analysis is its multi-subject extension. It has been applied to Functional Magnetic Resonance Imaging data and provides promising results. However, classical Group Independent Component Analysis works poorly when outliers exist in the data, which frequently occurs in Magnetic Resonance Imaging scanning. In this study, we propose a robust version of Group Independent Component Analysis based on ROBPCA. Through numerical studies, we compare the proposed method with the conventional method and verify the robustness of the proposed method.
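
A rough sketch of the group-ICA pipeline (temporal concatenation, robust dimension reduction, FastICA unmixing); ROBPCA is not available in scikit-learn, so an MCD-based robust covariance stands in for it here, and the simulated data are purely illustrative.

```python
import numpy as np
from sklearn.covariance import MinCovDet
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t, v, n_subj, k = 100, 50, 3, 2                       # scans, voxels, subjects, sources
S = np.column_stack([np.sign(np.sin(np.arange(t))),   # source 1: square-wave-like
                     np.cos(np.arange(t) / 3.0)])     # source 2: smooth oscillation
subjects = [S @ rng.normal(size=(k, v)) + 0.1 * rng.normal(size=(t, v))
            for _ in range(n_subj)]
subjects[0][5] += 20.0                                # one outlying scan

X = np.vstack(subjects)                               # temporal concatenation of subjects
cov = MinCovDet(random_state=0).fit(X).covariance_    # robust covariance of the voxels
eigval, eigvec = np.linalg.eigh(cov)
scores = (X - X.mean(axis=0)) @ eigvec[:, -k:]        # scores on the top-k robust PCs

sources = FastICA(n_components=k, random_state=0).fit_transform(scores)
print(sources.shape)                                  # (n_subj * t, k) group-level components
```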

On Robust Principal Component Analysis using Neural Networks (신경망을 이용한 로버스트 주성분 분석에 관한 연구)

  • Kim, Sang-Min;Oh, Kwang-Sik;Park, Hee-Joo
    • Journal of the Korean Data and Information Science Society
    • /
    • v.7 no.1
    • /
    • pp.113-118
    • /
    • 1996
  • Principal component analysis (PCA) is an essential technique for data compression and feature extraction, and has been widely used in statistical data analysis, communication theory, pattern recognition, and image processing. Oja (1992) found that a linear neuron with a constrained Hebbian learning rule can extract the principal component by using a stochastic gradient ascent method. In practice real data often contain some outliers. These outliers will significantly deteriorate the performance of the PCA algorithms. In order to make PCA robust, Xu and Yuille (1995) applied statistical physics to the problem of robust principal component analysis (RPCA). Devlin et al. (1981) obtained principal components by using techniques such as M-estimation. The purpose of this paper is to investigate, from a statistical point of view, how Xu and Yuille's (1995) RPCA works under the same simulation conditions as in Devlin et al. (1981).
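
A minimal sketch of Oja's constrained Hebbian rule for the first principal component, with a simple downweighting of samples that have large reconstruction error; this weighting is only illustrative and is not Xu and Yuille's (1995) statistical-physics formulation.

```python
import numpy as np

def oja_pc1(X, lr=0.01, epochs=50, c=2.0, seed=0):
    """First principal direction via Oja's rule, with outlying samples downweighted."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for x in X[rng.permutation(len(X))]:
            y = w @ x                               # linear neuron output
            err = np.linalg.norm(x - y * w)         # reconstruction error of the sample
            a = 1.0 if err < c else c / err         # downweight outlying samples
            w += lr * a * y * (x - y * w)           # constrained Hebbian (Oja) update
            w /= np.linalg.norm(w)                  # keep the weight vector on the unit sphere
    return w

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 2)) @ np.diag([3.0, 0.5])  # dominant variation along the first axis
X[:10] += np.array([0.0, 25.0])                      # outliers off the principal axis
print(oja_pc1(X - X.mean(axis=0)))                   # should point roughly along (1, 0)
```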


Algorithm for the L1-Regression Estimation with High Breakdown Point (L1-회귀추정량의 붕괴점 향상을 위한 알고리즘)

  • Kim, Bu-Yong
    • Communications for Statistical Applications and Methods
    • /
    • v.17 no.4
    • /
    • pp.541-550
    • /
    • 2010
  • The $L_1$-regression estimator is susceptible to leverage points, even though it is highly robust to vertical outliers. This article is concerned with improving the robustness of the $L_1$-estimator. To improve its robustness, in terms of the breakdown point, we attempt to dampen the influence of the leverage points by reducing the weights corresponding to them. In addition, the algorithm employs the linear scaling transformation technique, for higher computational efficiency with large data sets, to solve the linear programming problem of $L_1$-estimation. Monte Carlo simulation results indicate that the proposed algorithm yields $L_1$-estimates which are robust to leverage points as well as to vertical outliers.
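
A sketch of weighted $L_1$ (least absolute deviations) regression cast as a linear program with scipy; the leverage-based weighting rule is a simple stand-in for the paper's scheme, and the linear scaling transformation is not applied here.

```python
import numpy as np
from scipy.optimize import linprog

def weighted_l1_fit(X, y, w):
    """Weighted least absolute deviations fit, solved as a linear program."""
    n, p = X.shape
    X1 = np.column_stack([np.ones(n), X])
    c = np.concatenate([np.zeros(p + 1), w, w])       # minimize sum_i w_i (u_i + v_i)
    A_eq = np.hstack([X1, np.eye(n), -np.eye(n)])     # X1 @ beta + u - v = y
    bounds = [(None, None)] * (p + 1) + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p + 1]

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = 1 + 2 * X[:, 0] - X[:, 1] + rng.normal(size=100)
X[:3] += 12.0                                         # leverage points
y[:3] = -30.0

X1 = np.column_stack([np.ones(100), X])
h = np.diag(X1 @ np.linalg.inv(X1.T @ X1) @ X1.T)     # hat-matrix leverages
w = np.where(h > 2 * X1.shape[1] / 100, 0.1, 1.0)     # dampen high-leverage rows
print(weighted_l1_fit(X, y, w))                       # roughly (1, 2, -1)
```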

On Confidence Intervals of Robust Regression Estimators (로버스트 회귀추정에 의한 신뢰구간 구축)

  • Lee Dong-Hee;Park You-Sung;Kim Kee-Whan
    • The Korean Journal of Applied Statistics
    • /
    • v.19 no.1
    • /
    • pp.97-110
    • /
    • 2006
  • Since it is well established that even high quality data tend to contain outliers, one would expect far greater reliance on robust regression techniques than is actually observed. However, most robust regression estimators suffer from computational difficulties and lower efficiency than least squares under the normal error model. The weighted self-tuning estimator (WSTE) recently suggested by Lee (2004) has no such computational difficulty, and it has asymptotic normality and a high breakdown point simultaneously. Although it has better properties than other robust estimators, the WSTE does not attain full efficiency under the normal error model through the widely used weighted least squares. This paper introduces a new approach, called the reweighted WSTE (RWSTE), whose scale estimator is adaptively estimated by the self-tuning constant. A Monte Carlo study shows that the new approach behaves better than the general weighted least squares method under the normal model and with large data.
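
A generic reweighting sketch only (the WSTE/RWSTE of the paper is not reproduced): downweight observations with large standardized residuals from an initial fit, refit by weighted least squares, and read off normal-approximation confidence intervals. The cutoff, the hard-rejection weights, and the plain OLS initial estimator below are all assumptions.

```python
import numpy as np

def reweighted_ls_ci(X, y, beta0, cutoff=2.5, z=1.96):
    """Reweighted least squares around an initial fit, with normal-approximation CIs."""
    X1 = np.column_stack([np.ones(len(y)), X])
    r = y - X1 @ beta0
    s = np.median(np.abs(r - np.median(r))) / 0.6745       # MAD scale of the residuals
    w = (np.abs(r / s) <= cutoff).astype(float)            # hard-rejection weights
    XtWX_inv = np.linalg.inv(X1.T @ (X1 * w[:, None]))
    beta = XtWX_inv @ X1.T @ (w * y)                       # weighted least squares fit
    resid = y - X1 @ beta
    sigma2 = np.sum(w * resid**2) / (w.sum() - X1.shape[1])
    se = np.sqrt(np.diag(sigma2 * XtWX_inv))
    return beta, np.column_stack([beta - z * se, beta + z * se])

rng = np.random.default_rng(0)
X = rng.normal(size=(150, 1))
y = 2 + 3 * X[:, 0] + rng.normal(size=150)
y[:5] += 20.0                                              # vertical outliers

# Plain OLS stands in here for the paper's robust initial estimator (WSTE).
beta0, *_ = np.linalg.lstsq(np.column_stack([np.ones(150), X]), y, rcond=None)
beta, ci = reweighted_ls_ci(X, y, beta0)
print(beta)                                                # close to (2, 3)
print(ci)
```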