• Title/Summary/Keyword: k-Nearest Neighbor Classification


Detection and Classification of Bearing Flaking Defects by Using Kullback Discrimination Information (KDI)

  • Kim, Tae-Gu; Takabumi Fukuda; Hisaji Shimizu
    • International Journal of Safety / v.1 no.1 / pp.28-35 / 2002
  • Kullback Discrimination Information (KDI) is a pattern recognition method. KDI, defined as a measure of the mutual dissimilarity between two time series, was studied for the detection and classification of bearing flaking on the outer and inner races. To model the damage, bearings in three conditions were prepared: normal, outer-race flaking, and inner-race flaking. A vibration sensor was attached to the bearing housing, producing 25 data records for each condition; standard (reference) data were chosen and the distance between the standard and the tested data was measured. Flaking is difficult to detect because similar pulses appear whenever a ball passes the defect point. A method that detects and classifies inner- and outer-race defects using KDI with the nearest neighbor classification rule is proposed, and its high performance is demonstrated.
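A minimal sketch of the KDI-plus-nearest-neighbor idea described above, using synthetic vibration-like signals in place of the paper's bearing data. KDI is computed here as a symmetrized Kullback-Leibler divergence between normalized amplitude histograms, which is one plausible reading of the abstract; all signal parameters and names are illustrative.

```python
# Sketch: KDI-style dissimilarity + nearest-neighbor labeling of bearing conditions.
# Synthetic signals stand in for the paper's vibration measurements (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def kdi(x, y, bins=32):
    """Symmetrized Kullback-Leibler divergence between amplitude histograms."""
    lo, hi = min(x.min(), y.min()), max(x.max(), y.max())
    p, _ = np.histogram(x, bins=bins, range=(lo, hi), density=True)
    q, _ = np.histogram(y, bins=bins, range=(lo, hi), density=True)
    p, q = p + 1e-12, q + 1e-12            # avoid log(0)
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum((p - q) * np.log(p / q)))

def make_signal(condition, n=2048):
    """Toy signals: flaking adds periodic impulse trains on top of noise."""
    t = np.arange(n)
    base = rng.normal(0, 1, n)
    if condition == "outer":
        base += 3.0 * (t % 97 == 0)        # impulses at an 'outer-race' period
    elif condition == "inner":
        base += 3.0 * (t % 61 == 0)        # impulses at an 'inner-race' period
    return base

# One "standard" (reference) record per condition, as in the abstract.
standards = {c: make_signal(c) for c in ("normal", "outer", "inner")}

# Classify a test record by the nearest standard in the KDI sense.
test = make_signal("inner")
label = min(standards, key=lambda c: kdi(test, standards[c]))
print("predicted condition:", label)
```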

A Low Complexity PTS Technique using Threshold for PAPR Reduction in OFDM Systems

  • Lim, Dai Hwan; Rhee, Byung Ho
    • KSII Transactions on Internet and Information Systems (TIIS) / v.6 no.9 / pp.2191-2201 / 2012
  • Traffic classification seeks to assign packet flows to an appropriate quality of service (QoS) class based on flow statistics, without the need to examine packet payloads. Classification proceeds in two steps: classification rules are first built by analyzing traffic traces, and then those rules are evaluated on test data. In this paper, we use the self-organizing map and K-means clustering as unsupervised machine learning methods to identify the inherent classes in traffic traces. Three clusters were discovered, corresponding to transactional, bulk data transfer, and interactive applications. The k-nearest neighbor classifier was found to be highly accurate on the traffic data and significantly better than a minimum mean distance classifier.
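The two-step procedure in the abstract above (unsupervised clustering of flow statistics, then supervised evaluation) can be sketched with scikit-learn as follows; the synthetic flow features and the choice of three clusters mirror the abstract, but all names and numbers are illustrative.

```python
# Sketch: cluster flow statistics with K-means, then evaluate a k-NN classifier
# on the discovered classes. Synthetic "flow statistics" stand in for real traces.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Toy flow features: [mean packet size, flow duration, packets per flow]
transactional = rng.normal([200, 0.5, 10], [30, 0.1, 3], size=(300, 3))
bulk          = rng.normal([1200, 20.0, 800], [100, 5.0, 100], size=(300, 3))
interactive   = rng.normal([100, 60.0, 200], [20, 10.0, 50], size=(300, 3))
X = np.vstack([transactional, bulk, interactive])

# Step 1: unsupervised discovery of the inherent classes (three clusters).
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Step 2: evaluate a k-NN classifier on held-out flows using the cluster labels.
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)
knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
print("k-NN accuracy on held-out flows:", knn.score(X_te, y_te))
```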

A Study on Fault Detection and Diagnosis of Gear Damages - A Comparison between Wavelet Transform Analysis and Kullback Discrimination Information - (기어의 이상검지 및 진단에 관한 연구 -Wavelet Transform해석과 KDI의 비교-)

  • Kim, Tae-Gu; Kim, Kwang-Il
    • Journal of the Korean Society of Safety / v.15 no.2 / pp.1-7 / 2000
  • This paper presents an approach to fault detection and diagnosis of gears using pattern recognition and the wavelet transform. It compares KDI (Kullback Discrimination Information) with the nearest neighbor classification rule, as a pattern recognition method, against wavelet transform analysis in order to determine how gear damage can be detected and diagnosed experimentally. Four damage conditions were modeled: 1) normal (no defect), 2) one tooth worn out, 3) all tooth faces worn out, and 4) one tooth broken. A vibration sensor was attached to the bearing housing, producing 20 time-history records for each condition; standard (reference) data were chosen and the distance between the standard and the tested data was measured. In the wavelet transform analysis, time series of the magnitude at specified frequencies (the rotary and mesh frequencies) were obtained. As a result, the monitoring system using the wavelet transform method and KDI with the nearest neighbor classification rule successfully detected and classified the damage in the experimental data.

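A sketch of the frequency-magnitude feature and nearest-neighbor step from the abstract above. The paper extracts the rotary- and mesh-frequency magnitudes via the wavelet transform; a plain FFT band magnitude is used here as a simplified stand-in, and all signal parameters are illustrative.

```python
# Sketch: magnitude features at the rotary and mesh frequencies, followed by a
# nearest-neighbor comparison against per-condition reference features.
import numpy as np

rng = np.random.default_rng(2)
FS, ROTARY_HZ, MESH_HZ = 5000, 30.0, 600.0    # sampling / rotary / mesh frequencies

def band_magnitudes(x, fs=FS, width=5.0):
    """Mean FFT magnitude in narrow bands around the rotary and mesh frequencies."""
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    feats = []
    for f0 in (ROTARY_HZ, MESH_HZ):
        band = (freqs > f0 - width) & (freqs < f0 + width)
        feats.append(spec[band].mean())
    return np.array(feats)

def make_gear_signal(damage_level, n=4096):
    """Toy vibration: damage boosts the mesh-frequency component."""
    t = np.arange(n) / FS
    x = np.sin(2 * np.pi * ROTARY_HZ * t)
    x += (1.0 + damage_level) * np.sin(2 * np.pi * MESH_HZ * t)
    return x + rng.normal(0, 0.3, n)

conditions = {"normal": 0.0, "one tooth worn": 0.5,
              "all teeth worn": 1.0, "one tooth broken": 2.0}
references = {name: band_magnitudes(make_gear_signal(level))
              for name, level in conditions.items()}

test = band_magnitudes(make_gear_signal(1.0))
label = min(references, key=lambda name: np.linalg.norm(test - references[name]))
print("predicted condition:", label)
```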

Adaptive Nearest Neighbors for Classification (Adaptive Nearest Neighbors를 활용한 판별분류방법)

  • Jhun, Myoung-Shic; Choi, In-Kyung
    • The Korean Journal of Applied Statistics / v.22 no.3 / pp.479-488 / 2009
  • $k$-Nearest Neighbors Classification (KNNC) is a popular non-parametric classification method that assigns a fixed number $k$ of neighbors to every observation, without considering the local features of each observation. In this paper, we propose Adaptive Nearest Neighbors Classification (ANNC) as an alternative to KNNC. The proposed ANNC method adapts the number of neighbors to local features of the observation, such as the local density of the data. To examine the characteristics of ANNC, we compare its number of misclassified observations with that of KNNC in a Monte Carlo study and confirm the potential of the ANNC method.
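The abstract does not spell out the adaptation rule, so the sketch below is only one plausible reading: the number of neighbors used for each query grows with the local density of training points around it (here, the count of training points inside a fixed radius), and the query is classified by majority vote among those neighbors. All parameters are illustrative.

```python
# Sketch: a k-NN classifier whose k adapts to the local density around each query.
# The adaptation rule (count of training points within a fixed radius, clipped to
# a [k_min, k_max] range) is an illustrative choice, not the paper's exact method.
import numpy as np
from collections import Counter

rng = np.random.default_rng(3)

def adaptive_knn_predict(X_train, y_train, X_query, radius=1.0, k_min=3, k_max=25):
    preds = []
    for q in X_query:
        dists = np.linalg.norm(X_train - q, axis=1)
        k = int(np.clip((dists < radius).sum(), k_min, k_max))  # denser region -> larger k
        neighbors = y_train[np.argsort(dists)[:k]]
        preds.append(Counter(neighbors).most_common(1)[0][0])
    return np.array(preds)

# Two Gaussian classes with different densities (toy data).
X0 = rng.normal([0, 0], 0.5, size=(200, 2))
X1 = rng.normal([2, 2], 1.5, size=(100, 2))
X_train = np.vstack([X0, X1])
y_train = np.array([0] * 200 + [1] * 100)

X_query = np.array([[0.2, 0.1], [2.5, 2.0], [1.0, 1.0]])
print(adaptive_knn_predict(X_train, y_train, X_query))
```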

Cross platform classification of microarrays by rank comparison

  • Lee, Sunho
    • Journal of the Korean Data and Information Science Society / v.26 no.2 / pp.475-486 / 2015
  • Mining the microarray data accumulated in public data repositories can save experimental cost and time and provide valuable biomedical information. Pooling multiple data sets increases statistical power, improves the reliability of the results, and reduces the bias specific to an individual study. However, integrating data sets from different studies raises many problems. This study focuses on cross-platform classification, where the platform of a test sample differs from the platform of the training set, and suggests a simple classification method based on ranks. The method is compared with diagonal linear discriminant analysis, the k-nearest neighbor method, and the support vector machine on real cross-platform data sets for two cancers.
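A minimal sketch of the rank idea: each sample's expression values are replaced by their within-sample ranks, which removes platform-specific scaling, and a test sample from a different platform is classified by its nearest training profile in rank space. This is a generic rank-transform nearest-neighbor sketch under those assumptions, not the paper's exact rule; the expression data are synthetic.

```python
# Sketch: replace each sample's expression values by within-sample ranks so that
# platform-specific scales cancel out, then classify by the nearest rank profile.
import numpy as np
from scipy.stats import rankdata

rng = np.random.default_rng(4)
n_genes = 50

def make_samples(n, shift):
    """Toy expression profiles: the two classes differ in the first 10 genes."""
    base = rng.normal(5, 1, size=(n, n_genes))
    base[:, :10] += shift
    return base

# Training set measured on "platform A".
X_train = np.vstack([make_samples(20, 0.0), make_samples(20, 3.0)])
y_train = np.array([0] * 20 + [1] * 20)

# Test sample measured on "platform B" (different scale and offset).
x_test = make_samples(1, 3.0)[0] * 40.0 + 100.0

R_train = np.apply_along_axis(rankdata, 1, X_train)   # within-sample ranks
r_test = rankdata(x_test)

nearest = np.argmin(np.linalg.norm(R_train - r_test, axis=1))
print("predicted class:", y_train[nearest])
```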

Feature Selection for Multiple K-Nearest Neighbor classifiers using GAVaPS (GAVaPS를 이용한 다수 K-Nearest Neighbor classifier들의 Feature 선택)

  • Lee, Hee-Sung; Lee, Jae-Hun; Kim, Eun-Tai
    • Journal of the Korean Institute of Intelligent Systems / v.18 no.6 / pp.871-875 / 2008
  • This paper deals with feature selection for multiple k-nearest neighbor (k-NN) classifiers using the Genetic Algorithm with Varying Population Size (GAVaPS). Because multiple k-NN classifiers are used, their feature selection problem is very hard and has a large search space. To solve this problem, we employ GAVaPS, which outperforms the simple genetic algorithm (SGA). Further, we propose an efficient method for combining multiple k-NN classifiers using GAVaPS. Experiments are performed to demonstrate the efficiency of the proposed method.
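A sketch of genetic-algorithm feature selection for a k-NN classifier, to illustrate the search problem described above. A plain fixed-size generational GA over a single k-NN is used for brevity; GAVaPS additionally varies the population size through an age/lifetime mechanism, and the paper selects features for multiple k-NN classifiers and combines them. All parameters and the synthetic data are illustrative.

```python
# Sketch: GA-based feature selection (binary masks) scored by k-NN cross-validation.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(5)
X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           n_redundant=5, random_state=0)

def fitness(mask):
    if mask.sum() == 0:
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=5)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

pop_size, n_gen, n_feat = 20, 15, X.shape[1]
pop = rng.integers(0, 2, size=(pop_size, n_feat))

for _ in range(n_gen):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-pop_size // 2:]]       # keep the best half
    children = []
    while len(children) < pop_size - len(parents):
        a, b = parents[rng.integers(len(parents), size=2)]
        cut = rng.integers(1, n_feat)                         # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(n_feat) < 0.05                      # bit-flip mutation
        child[flip] = 1 - child[flip]
        children.append(child)
    pop = np.vstack([parents] + children)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected features:", np.flatnonzero(best))
```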

Mapping Burned Forests Using a k-Nearest Neighbors Classifier in Complex Land Cover (k-Nearest Neighbors 분류기를 이용한 복합 지표 산불피해 영역 탐지)

  • Lee, Hanna; Yun, Konghyun; Kim, Gihong
    • KSCE Journal of Civil and Environmental Engineering Research / v.43 no.6 / pp.883-896 / 2023
  • As human activities in Korea extend throughout the mountains, forest fires often affect residential areas, infrastructure, and other facilities. Hence, it is necessary to detect fire-damaged areas quickly to enable support and recovery, and remote sensing is the most efficient tool for this purpose. Fire damage detection experiments were conducted on the east coast of Korea. Because this area comprises a mixture of forest and artificial land cover, low-resolution data are not suitable. In this study we used Sentinel-2 MultiSpectral Instrument (MSI) data, which provide adequate temporal and spatial resolution, together with the k-nearest neighbor (kNN) algorithm. Six Sentinel-2 MSI bands and two indices, the normalized difference vegetation index (NDVI) and the normalized burn ratio (NBR), were used as features for kNN classification. The kNN classifier was trained using 2,000 randomly selected samples from the fire-damaged and undamaged areas. Outliers were removed and a forest type map was used to improve classification performance. Numerous experiments with various numbers of neighbors and feature combinations were conducted using bi-temporal and uni-temporal approaches. The bi-temporal classification performed better than the uni-temporal classification; however, the uni-temporal classification was still able to detect severely damaged areas.
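A sketch of the feature construction and kNN step from the abstract above: NDVI and NBR are computed from Sentinel-2-style reflectance bands and stacked with the raw bands as features for a k-nearest neighbor classifier. Band roles follow the usual Sentinel-2 conventions (B4 = red, B8 = NIR, B12 = SWIR), but the reflectance values, sample counts, and labels here are synthetic.

```python
# Sketch: build NDVI/NBR features from Sentinel-2-style bands and train a k-NN
# classifier for burned/unburned pixels. Reflectance values and labels are synthetic.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(6)

def make_pixels(n, burned):
    # Toy reflectances: burned pixels have lower NIR (B8) and higher SWIR (B12).
    b4  = rng.uniform(0.05, 0.15, n)                                              # red
    b8  = rng.uniform(0.10, 0.25, n) if burned else rng.uniform(0.30, 0.50, n)    # NIR
    b12 = rng.uniform(0.25, 0.45, n) if burned else rng.uniform(0.10, 0.20, n)    # SWIR
    return b4, b8, b12

def features(b4, b8, b12):
    ndvi = (b8 - b4) / (b8 + b4)
    nbr  = (b8 - b12) / (b8 + b12)
    return np.column_stack([b4, b8, b12, ndvi, nbr])

X = np.vstack([features(*make_pixels(1000, burned=True)),
               features(*make_pixels(1000, burned=False))])
y = np.array([1] * 1000 + [0] * 1000)                        # 1 = fire-damaged

# 2,000 training samples, as in the abstract; the number of neighbors is tunable.
knn = KNeighborsClassifier(n_neighbors=7).fit(X, y)
print("predicted:", knn.predict(features(*make_pixels(3, burned=True))))
```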

An Approach of Dimension Reduction in k-Nearest Neighbor Based Short-term Load Forecasting

  • Chu, FaZheng; Jung, Sung-Hwan
    • Journal of Korea Multimedia Society / v.20 no.9 / pp.1567-1573 / 2017
  • The k-nearest neighbor (k-NN) algorithm is one of the most widely used benchmark algorithms in classification, and it has also been applied to time series prediction. However, one of the main concerns when the algorithm is applied to short-term electricity load forecasting is its high computational burden. In this paper, we propose a dimension reduction approach that follows the principle of highlighting the temperature effect on the electricity load series. The results show the proposed approach is able to reduce the dimension of the data by around 30%. Moreover, by highlighting the temperature effect, the approach helps to find similar days more accurately and thus raises forecasting accuracy slightly.
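The abstract does not detail the reduction itself, so the following is only an illustrative sketch of k-NN "similar day" load forecasting with one possible reduction: the 24 hourly loads are compressed to 8 three-hour block means while the daily temperature is kept and up-weighted to highlight its effect. All data and parameters are synthetic.

```python
# Sketch: k-NN similar-day load forecasting after a simple dimension reduction
# that keeps and up-weights temperature (illustrative, not the paper's method).
import numpy as np

rng = np.random.default_rng(7)
n_days = 200

temps = 15 + 10 * np.sin(2 * np.pi * np.arange(n_days) / 365) + rng.normal(0, 2, n_days)
hours = np.arange(24)
loads = (50 + 0.8 * np.abs(temps - 18)[:, None]              # temperature-driven level
         + 10 * np.sin(2 * np.pi * hours / 24)               # daily load shape
         + rng.normal(0, 1, (n_days, 24)))

def day_features(load_day, temp_day, temp_weight=3.0):
    blocks = load_day.reshape(8, 3).mean(axis=1)              # 24 hourly loads -> 8 dims
    return np.concatenate([blocks, [temp_weight * temp_day]]) # keep weighted temperature

F = np.array([day_features(loads[d], temps[d]) for d in range(n_days - 1)])

# Forecast the last day: average the next-day loads of the k most similar past days.
query = day_features(loads[-2], temps[-2])
k = 5
nearest = np.argsort(np.linalg.norm(F[:-1] - query, axis=1))[:k]
forecast = loads[nearest + 1].mean(axis=0)
print("forecast MAE:", np.abs(forecast - loads[-1]).mean())
```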

Fuzzy Learning Vector Quantization based on Fuzzy k-Nearest Neighbor Prototypes

  • Roh, Seok-Beom; Jeong, Ji-Won; Ahn, Tae-Chon
    • International Journal of Fuzzy Logic and Intelligent Systems / v.11 no.2 / pp.84-88 / 2011
  • In this paper, a new competition strategy for learning vector quantization (LVQ) is proposed. The simple competitive strategy used in LVQ moves only the winning prototype, the one closest to the newly presented data pattern. We propose a learning strategy in which the k-nearest prototypes are treated as the winning prototypes. Selecting several prototypes as winners guarantees that prototypes are updated more frequently. The design is illustrated with numeric examples that provide detailed insight into the performance of the proposed learning strategy.
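A minimal sketch of the update rule suggested by the abstract: instead of moving only the single closest prototype, the k nearest prototypes are all updated, attracted toward the sample when their class matches and repelled otherwise, with the step scaled by an inverse-distance weight as a stand-in for a fuzzy membership. The weighting scheme and all parameters are illustrative choices, not the paper's exact formulation.

```python
# Sketch: an LVQ-style update in which the k nearest prototypes all act as winners.
import numpy as np

rng = np.random.default_rng(8)

def lvq_knn_update(protos, proto_labels, x, y, k=3, lr=0.05):
    d = np.linalg.norm(protos - x, axis=1)
    nearest = np.argsort(d)[:k]
    w = 1.0 / (d[nearest] + 1e-9)
    w /= w.sum()                                   # fuzzy-style membership weights
    for idx, weight in zip(nearest, w):
        sign = 1.0 if proto_labels[idx] == y else -1.0
        protos[idx] += sign * lr * weight * (x - protos[idx])
    return protos

# Toy two-class data and randomly initialized prototypes (three per class).
X = np.vstack([rng.normal([0, 0], 0.5, (100, 2)), rng.normal([3, 3], 0.5, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
protos = rng.normal(1.5, 1.0, (6, 2))
proto_labels = np.array([0, 0, 0, 1, 1, 1])

for epoch in range(20):
    for i in rng.permutation(len(X)):
        protos = lvq_knn_update(protos, proto_labels, X[i], y[i])

print("learned prototypes:\n", np.round(protos, 2))
```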

Acoustic Emission Source Classification of Finite-width Plate with a Circular Hole Defect using k-Nearest Neighbor Algorithm (k-최근접 이웃 알고리즘을 이용한 원공결함을 갖는 유한 폭 판재의 음향방출 음원분류에 대한 연구)

  • Rhee, Zhang-Kyu; Oh, Jin-Soo
    • Journal of the Korea Safety Management & Science / v.11 no.1 / pp.27-33 / 2009
  • The study of material fracture is attracting interest in the nuclear and aerospace industries from a safety viewpoint. Acoustic emission (AE) is a non-destructive testing technique and a relatively new technology for evaluating structural safety. Continuing previous research, tensile tests on pre-defected coupons were performed using a universal testing machine whose crosshead moved at a constant speed of 5 mm/min. This study evaluates AE source characterization of SM45C steel using a k-nearest neighbor classifier (k-NNC). For this, K-means clustering was used as an unsupervised learning method on the multivariate AE main data sets, and the k-NNC was applied as a supervised pattern recognition algorithm to the multivariate AE working data sets. As a result, the criteria of Wilks' $\lambda$, D&B(Rij), and Tou are discussed.
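A sketch of the two-stage procedure in the abstract above: K-means clustering labels the multivariate AE "main" data set without supervision, and a k-NN classifier trained on those cluster labels then classifies the "working" data set. The AE feature vectors (amplitude, energy, counts, rise time) are synthetic placeholders, and the Wilks' lambda and other cluster-validity criteria mentioned in the abstract are not reproduced here.

```python
# Sketch: label the AE "main" data set with K-means, then classify the "working"
# data set with a k-NN classifier trained on those cluster labels.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(9)

def ae_events(n, center, spread):
    return rng.normal(center, spread, size=(n, 4))   # [amplitude, energy, counts, rise time]

# Main data set: a mixture of AE source types with different feature signatures.
main = np.vstack([ae_events(150, [40, 1e2, 20, 10], [3, 20, 4, 2]),
                  ae_events(150, [70, 5e2, 60, 30], [5, 80, 8, 5]),
                  ae_events(150, [90, 2e3, 120, 60], [5, 300, 15, 10])])

scaler = StandardScaler().fit(main)
main_labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
    scaler.transform(main))

# Working data set: new AE events classified by k-NN trained on the cluster labels.
working = ae_events(5, [70, 5e2, 60, 30], [5, 80, 8, 5])
knn = KNeighborsClassifier(n_neighbors=5).fit(scaler.transform(main), main_labels)
print("predicted AE source classes:", knn.predict(scaler.transform(working)))
```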