• Title/Summary/Keyword: Kernel Discriminant


A Kernel Approach to Discriminant Analysis for Binary Classification

  • Shin, Yang-Kyu
    • Journal of the Korean Data and Information Science Society
    • /
    • v.12 no.2
    • /
    • pp.83-93
    • /
    • 2001
  • We investigate a kernel approach to discriminant analysis for binary classification from a machine learning point of view. Our view of the kernel approach follows the support vector method, one of the most promising techniques in the area of machine learning. Like ordinary discriminant analysis, the kernel method can determine the class to which an object most likely belongs. Moreover, it has some advantages over ordinary discriminant analysis, such as data compression and reduced computing time.

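As a rough illustration of such a kernel discriminant, the Fisher criterion can be solved entirely in the dual, so only a kernel matrix over the training patterns is needed. The sketch below is our own minimal version under stated assumptions (RBF kernel, regularized within-class scatter, toy data), not the paper's actual algorithm:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel matrix between the rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kfd_fit(X, y, gamma=1.0, reg=1e-3):
    """Kernel Fisher discriminant for two classes labelled 0/1.
    Returns dual coefficients alpha and a decision threshold."""
    K = rbf_kernel(X, X, gamma)
    idx0, idx1 = np.where(y == 0)[0], np.where(y == 1)[0]
    m0 = K[:, idx0].mean(axis=1)   # dual representation of the class means
    m1 = K[:, idx1].mean(axis=1)
    N = np.zeros_like(K)           # within-class scatter in the dual
    for idx in (idx0, idx1):
        Kc = K[:, idx]
        center = np.eye(len(idx)) - np.full((len(idx), len(idx)), 1.0 / len(idx))
        N += Kc @ center @ Kc.T
    alpha = np.linalg.solve(N + reg * np.eye(len(K)), m1 - m0)
    proj = K @ alpha               # threshold halfway between projected means
    thresh = 0.5 * (proj[idx0].mean() + proj[idx1].mean())
    return alpha, thresh

def kfd_predict(X_train, alpha, thresh, X_new, gamma=1.0):
    return (rbf_kernel(X_new, X_train, gamma) @ alpha > thresh).astype(int)

# Toy nonlinear problem: points inside vs. outside the unit circle.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (np.linalg.norm(X, axis=1) > 1.0).astype(int)
alpha, thresh = kfd_fit(X, y, gamma=2.0)
acc = (kfd_predict(X, alpha, thresh, X, gamma=2.0) == y).mean()
```

Classification then needs only kernel evaluations against the training set, which is where the data-compression and computing-time argument comes from once the expansion is sparse.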

Kernel Fisher Discriminant Analysis for Indoor Localization

  • Ngo, Nhan V.T.;Park, Kyung Yong;Kim, Jeong G.
    • International journal of advanced smart convergence
    • /
    • v.4 no.2
    • /
    • pp.177-185
    • /
    • 2015
  • In this paper we introduce Kernel Fisher Discriminant Analysis (KFDA) to transform our database of received signal strength (RSS) measurements into a lower-dimensional space that maximizes the separation between reference points (RPs). With KFDA, we can utilize RSS data more efficiently than other methods and thus achieve better performance.
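The dimension-reduction step described above can be sketched with the plain (linear) Fisher projection; the kernelized form applies the same construction to a kernel feature map but yields the same kind of small discriminative subspace. The synthetic RSS fingerprints and all names below are our own illustration, not the paper's data:

```python
import numpy as np

def fisher_projection(X, y, n_dims):
    """Multiclass Fisher discriminant directions (linear version; a kernelized
    variant applies the same construction in a kernel feature space)."""
    d = X.shape[1]
    mean = X.mean(axis=0)
    Sw, Sb = np.zeros((d, d)), np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        diff = Xc - Xc.mean(axis=0)
        Sw += diff.T @ diff
        delta = (Xc.mean(axis=0) - mean)[:, None]
        Sb += len(Xc) * (delta @ delta.T)
    # Leading eigenvectors of Sw^-1 Sb maximize between- over within-class scatter.
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-6 * np.eye(d), Sb))
    order = np.argsort(evals.real)[::-1]
    return evecs.real[:, order[:n_dims]]

# Synthetic RSS fingerprints: 4 reference points (RPs), 6 access points, dBm scale.
rng = np.random.default_rng(1)
centers = rng.uniform(-90, -40, size=(4, 6))
X = np.vstack([c + rng.normal(0, 2.0, size=(50, 6)) for c in centers])
y = np.repeat(np.arange(4), 50)

W = fisher_projection(X, y, n_dims=3)   # at most n_classes - 1 useful dims
Z = X @ W                               # reduced space used for localization
centroids = np.array([Z[y == c].mean(axis=0) for c in range(4)])
pred = np.argmin(((Z[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
acc = (pred == y).mean()
```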

Sparse Representation Learning of Kernel Space Using the Kernel Relaxation Procedure (커널 이완 절차에 의한 커널 공간의 저밀도 표현 학습)

  • 류재홍;정종철
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.11 no.9
    • /
    • pp.817-821
    • /
    • 2001
  • In this paper, a new learning methodology for kernel methods is suggested that results in a sparse representation of kernel space from the training patterns for classification problems. Among the traditional algorithms for linear discriminant functions, this paper shows that the relaxation procedure can obtain the maximum-margin separating hyperplane of a linearly separable pattern classification problem, as an SVM (Support Vector Machine) classifier does. The original relaxation method gives only a necessary condition for SV patterns; we suggest a sufficient condition to identify the SV patterns during the learning epochs. For sequential learning of kernel methods, an extended SVM and a kernel discriminant function are defined, and a systematic derivation of the learning algorithm is introduced. Experimental results show that the new methods have equal or better performance compared to the conventional approach.

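The classical relaxation procedure this work builds on (single-sample relaxation with margin) can be sketched as follows; the paper's extended, kernelized version differs, and the toy data here are our own. Patterns that sit at the margin after training play the role of the SV patterns mentioned in the abstract:

```python
import numpy as np

def relaxation_train(X, y, margin=1.0, eta=1.5, epochs=100):
    """Single-sample relaxation with margin: update only on patterns with
    w.z <= margin, moving w just past the violated constraint (y in {-1, +1})."""
    Xa = np.hstack([X, np.ones((len(X), 1))])   # augment with a bias component
    Z = Xa * y[:, None]                         # sign-normalized patterns
    w = np.zeros(Xa.shape[1])
    for _ in range(epochs):
        updated = False
        for z in Z:
            if w @ z <= margin:                 # margin violation -> relax
                w = w + eta * (margin - w @ z) / (z @ z) * z
                updated = True
        if not updated:                         # every pattern clears the margin
            break
    return w

rng = np.random.default_rng(0)
X = np.vstack([rng.normal([-2, -2], 0.5, (30, 2)),
               rng.normal([2, 2], 0.5, (30, 2))])
y = np.array([-1] * 30 + [1] * 30)
w = relaxation_train(X, y)
acc = (np.sign(np.hstack([X, np.ones((60, 1))]) @ w) == y).mean()
```

The update step size eta must lie in (0, 2) for the procedure to converge on separable data.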

Sparse Representation Learning of Kernel Space Using the Kernel Relaxation Procedure (커널 이완절차에 의한 커널 공간의 저밀도 표현 학습)

  • 류재홍;정종철
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 2001.12a
    • /
    • pp.60-64
    • /
    • 2001
  • In this paper, a new learning methodology for kernel methods is suggested that results in a sparse representation of kernel space from the training patterns for classification problems. Among the traditional algorithms for linear discriminant functions (perceptron, relaxation, LMS (least mean squares), pseudoinverse), this paper shows that the relaxation procedure can obtain the maximum-margin separating hyperplane of a linearly separable pattern classification problem, as an SVM (Support Vector Machine) classifier does. The original relaxation method gives only a necessary condition for SV patterns; we suggest a sufficient condition to identify the SV patterns during the learning epochs. Experimental results show that the new methods have equal or better performance compared to the conventional approach.


Sonar Target Classification using Generalized Discriminant Analysis (일반화된 판별분석 기법을 이용한 능동소나 표적 식별)

  • Kim, Dong-wook;Kim, Tae-hwan;Seok, Jong-won;Bae, Keun-sung
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.22 no.1
    • /
    • pp.125-130
    • /
    • 2018
  • Linear discriminant analysis is a statistical method generally used for dimensionality reduction of feature vectors or for class classification. For a data set that cannot be linearly separated, however, linear separation can still be achieved by mapping the feature vectors into a higher-dimensional space using a nonlinear function. This method is called generalized discriminant analysis (GDA) or kernel discriminant analysis. In this paper, we carried out target classification experiments on active sonar target signals available on the Internet using both linear discriminant analysis (LDA) and generalized discriminant analysis, and analyzed and compared the results. For 104 test data, the LDA method showed a correct recognition rate of 73.08%, while the GDA method achieved 95.19%, which is also better than the conventional MLP or kernel-based SVM.
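The idea of making non-separable data linearly separable via a nonlinear map can be illustrated without the kernel trick at all, using an explicit quadratic feature map; GDA performs the same trick implicitly through a kernel. This toy example (our own, not the sonar data) shows Fisher's linear discriminant failing in the input space and succeeding after the map:

```python
import numpy as np

def lda_direction(X, y):
    """Two-class Fisher direction w = Sw^-1 (m1 - m0) with a midpoint threshold."""
    X0, X1 = X[y == 0], X[y == 1]
    Sw = np.cov(X0.T) * (len(X0) - 1) + np.cov(X1.T) * (len(X1) - 1)
    w = np.linalg.solve(Sw + 1e-6 * np.eye(X.shape[1]), X1.mean(0) - X0.mean(0))
    thresh = 0.5 * (X0.mean(0) + X1.mean(0)) @ w
    return w, thresh

def accuracy(X, y, w, thresh):
    return (((X @ w) > thresh).astype(int) == y).mean()

# Circle-in-square data: not linearly separable in the input space.
rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(300, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 < 0.4).astype(int)

w, t = lda_direction(X, y)
acc_linear = accuracy(X, y, w, t)

# Explicit quadratic map phi(x) = (x1, x2, x1^2, x2^2, x1*x2): a hyperplane in
# this space corresponds to a conic-section boundary in the input space.
Phi = np.column_stack([X, X ** 2, X[:, 0] * X[:, 1]])
w2, t2 = lda_direction(Phi, y)
acc_mapped = accuracy(Phi, y, w2, t2)
```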

Kernel Fisher Discriminant Analysis for Natural Gait Cycle Based Gait Recognition

  • Huang, Jun;Wang, Xiuhui;Wang, Jun
    • Journal of Information Processing Systems
    • /
    • v.15 no.4
    • /
    • pp.957-966
    • /
    • 2019
  • This paper studies a novel approach to gait recognition based on natural gait cycles via kernel Fisher discriminant analysis (KFDA), which can effectively compute features from gait sequences and accelerate the recognition process. The proposed approach first extracts the gait silhouettes from each gait video through moving object detection and segmentation. Second, gait energy images (GEIs) are computed for each gait video and used as gait features. Third, KFDA is used to refine the extracted gait features, yielding a low-dimensional feature vector for each gait video. Finally, a nearest neighbor classifier is applied for classification. The proposed method is evaluated on the CASIA and USF gait databases, and the results show that the proposed algorithm achieves better recognition performance than other existing algorithms.
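The GEI feature used above is simply the pixel-wise mean of the aligned, binarized silhouettes over one gait cycle. A minimal sketch with a toy "cycle" (our own illustration, not the CASIA/USF data):

```python
import numpy as np

def gait_energy_image(silhouettes):
    """GEI: pixel-wise mean of aligned binary silhouettes over one gait cycle.
    silhouettes: array of shape (n_frames, H, W) with values in {0, 1}."""
    return np.asarray(silhouettes, dtype=float).mean(axis=0)

# Toy cycle: a torso column present in every frame plus a swinging leg pixel.
frames = np.zeros((4, 5, 5))
frames[:, 0:3, 2] = 1                     # torso: rows 0-2, column 2, all frames
for t, col in enumerate([1, 2, 3, 2]):    # leg: bottom row, swinging column
    frames[t, 4, col] = 1

gei = gait_energy_image(frames)
# Static body parts get energy 1.0; moving parts get fractional energy.
```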

An Adaptive Face Recognition System Based on a Novel Incremental Kernel Nonparametric Discriminant Analysis

  • SOULA, Arbia;SAID, Salma BEN;KSANTINI, Riadh;LACHIRI, Zied
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.13 no.4
    • /
    • pp.2129-2147
    • /
    • 2019
  • This paper introduces an adaptive face recognition method based on a novel Incremental Kernel Nonparametric Discriminant Analysis (IKNDA) that is able to learn over time. More precisely, IKNDA has the advantage of incrementally reducing the data dimension, in a discriminative manner, as new samples are added asynchronously; thus it handles dynamic and large data better. To perform face recognition effectively, we combine Gabor features and ordinal measures to extract facial features that are coded across local parts, as visual primitives. The ordinal measures are extracted from the Gabor filtering responses. The histograms of these primitives, across a variety of facial zones, are then combined into a feature vector, whose dimension is reduced using PCA. Finally, this vector is used as the facial input to the IKNDA. A comparative evaluation of the IKNDA is performed for face recognition, as well as for other classification tasks, in a decontextualized evaluation scheme in which we compare the IKNDA model to relevant state-of-the-art incremental and batch discriminant models. Experimental results show that the IKNDA outperforms these discriminant models and is a better tool for improving face recognition performance.
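The incremental flavor of such methods rests on updating class statistics one sample at a time instead of re-scanning the whole data set. A generic sketch of running per-class statistics (not the paper's actual IKNDA update; all names are our own) is:

```python
import numpy as np

class IncrementalClassStats:
    """Running per-class mean and scatter matrix, updated one sample at a time
    (Welford's algorithm in vector form). Discriminant directions can be
    recomputed from these statistics after each update instead of re-scanning
    the whole data set, which is the core idea behind incremental DA."""
    def __init__(self, dim):
        self.dim = dim
        self.n, self.mean, self.scatter = {}, {}, {}

    def update(self, x, c):
        if c not in self.n:
            self.n[c] = 0
            self.mean[c] = np.zeros(self.dim)
            self.scatter[c] = np.zeros((self.dim, self.dim))
        self.n[c] += 1
        delta = x - self.mean[c]            # deviation from the old mean
        self.mean[c] += delta / self.n[c]
        self.scatter[c] += np.outer(delta, x - self.mean[c])  # rank-one update

rng = np.random.default_rng(5)
Xc = rng.normal(size=(100, 3))
stats = IncrementalClassStats(dim=3)
for x in Xc:
    stats.update(x, c=0)
# stats.mean[0] and stats.scatter[0] now match the batch statistics.
```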

UFKLDA: An unsupervised feature extraction algorithm for anomaly detection under cloud environment

  • Wang, GuiPing;Yang, JianXi;Li, Ren
    • ETRI Journal
    • /
    • v.41 no.5
    • /
    • pp.684-695
    • /
    • 2019
  • In a cloud environment, performance degradation, or even downtime, of virtual machines (VMs) usually appears gradually along with anomalous states of VMs. To better characterize the state of a VM, all possible performance metrics are collected. For such high-dimensional datasets, this article proposes a feature extraction algorithm based on unsupervised fuzzy linear discriminant analysis with kernel (UFKLDA). By introducing the kernel method, UFKLDA can not only effectively deal with non-Gaussian datasets but also implement nonlinear feature extraction. Two sets of experiments were undertaken. In discriminability experiments, this article introduces quantitative criteria to measure discriminability among all classes of samples. The results show that UFKLDA improves discriminability compared with other popular feature extraction algorithms. In detection accuracy experiments, this article computes accuracy measures of an anomaly detection algorithm (i.e., C-SVM) on the original performance metrics and extracted features. The results show that anomaly detection with features extracted by UFKLDA improves the accuracy of detection in terms of sensitivity and specificity.
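The sensitivity and specificity measures used in the detection-accuracy experiments follow directly from a confusion matrix; a small self-contained sketch (our own naming and toy labels, not the article's data):

```python
import numpy as np

def sensitivity_specificity(y_true, y_pred):
    """Sensitivity: fraction of true anomalies detected (anomaly = 1).
    Specificity: fraction of normal samples correctly passed (normal = 0)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    return tp / (tp + fn), tn / (tn + fp)

# 4 true anomalies, 3 detected; 4 normals, 3 correctly passed.
sens, spec = sensitivity_specificity([1, 1, 1, 0, 0, 0, 0, 1],
                                     [1, 1, 0, 0, 0, 1, 0, 1])
```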

Palatability Grading Analysis of Hanwoo Beef using Sensory Properties and Discriminant Analysis (관능특성 및 판별함수를 이용한 한우고기 맛 등급 분석)

  • Cho, Soo-Hyun;Seo, Gu-Reo-Un-Dal-Nim;Kim, Dong-Hun;Kim, Jae-Hee
    • Food Science of Animal Resources
    • /
    • v.29 no.1
    • /
    • pp.132-139
    • /
    • 2009
  • The objective of this study was to identify the most effective analysis method for palatability grading of Hanwoo beef by comparing the results of discriminant analysis with sensory data. The sensory data were obtained from sensory tests in which 1,300 consumers evaluated the tenderness, juiciness, flavor-likeness and overall acceptability of Hanwoo beef samples prepared by boiling, roasting and grilling. For discriminant analysis with a single factor, overall acceptability, linear discriminant functions and a non-parametric discriminant function with a Gaussian kernel were estimated. The linear discriminant functions were simple and easy to understand, while the non-parametric discriminant functions were not explicit and posed the problem of selecting a kernel function and bandwidth. With the three palatability factors (tenderness, juiciness and flavor-likeness), canonical discriminant analysis was used, and classification ability was measured by the correct classification rate and the error rate. Canonical discriminant analysis does not require specific distributional assumptions and uses only principal components and canonical correlation. It also incorporates all three factors, and its correct classification rate was similar to that of the other discriminant methods. Therefore, canonical discriminant analysis was the most appropriate method for analyzing the palatability grading of Hanwoo beef.
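The non-parametric discriminant function with a Gaussian kernel mentioned above amounts to per-class kernel density estimates compared at each sample. A minimal sketch with toy sensory scores (our own illustration, not the Hanwoo data):

```python
import numpy as np

def gaussian_kde_classify(X_train, y_train, X_new, bandwidth=0.5):
    """Nonparametric discriminant: estimate each class density with a Gaussian
    kernel and assign each sample to the class with highest density x prior."""
    classes = np.unique(y_train)
    scores = []
    for c in classes:
        Xc = X_train[y_train == c]
        d2 = ((X_new[:, None, :] - Xc[None, :, :]) ** 2).sum(-1)
        dens = np.exp(-d2 / (2 * bandwidth ** 2)).mean(axis=1)
        scores.append(dens * (len(Xc) / len(X_train)))   # weight by class prior
    return classes[np.argmax(np.stack(scores, axis=1), axis=1)]

# Toy sensory scores (tenderness, juiciness, flavor-likeness) for 3 grades.
rng = np.random.default_rng(3)
grades = np.array([[3, 3, 3], [5, 5, 5], [7, 7, 7]], dtype=float)
X = np.vstack([g + rng.normal(0, 0.5, (40, 3)) for g in grades])
y = np.repeat(np.arange(3), 40)
acc = (gaussian_kde_classify(X, y, X) == y).mean()
```

The bandwidth plays the role the abstract flags as problematic: too small and the densities spike at training points, too large and the classes blur together.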

Semiparametric Kernel Fisher Discriminant Approach for Regression Problems

  • Park, Joo-Young;Cho, Won-Hee;Kim, Young-Il
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.3 no.2
    • /
    • pp.227-232
    • /
    • 2003
  • Recently, support vector learning has attracted an enormous amount of interest in the areas of function approximation, pattern classification, and novelty detection. One of the main reasons for the success of support vector machines (SVMs) seems to be the availability of global and sparse solutions. Among the approaches sharing the same reasons for success and exhibiting similarly good performance is the kernel Fisher discriminant (KFD) approach. In this paper, we consider the problem of function approximation utilizing both predetermined basis functions and the KFD approach for regression. After reviewing support vector regression, the semiparametric approach for including predetermined basis functions, and KFD regression, this paper presents an extension of the conventional KFD approach for regression that can utilize predetermined basis functions. The applicability of the presented method is illustrated via a regression example.
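The combination of a kernel expansion with predetermined basis functions can be sketched with ordinary regularized least squares standing in for the paper's KFD formulation; the point illustrated is the split of the model into a parametric part (the given basis) and a kernel part. All names and data below are our own:

```python
import numpy as np

def semiparam_kernel_fit(X, y, basis, gamma=1.0, reg=1e-2):
    """Regularized least-squares fit of f(x) = K(x, X) @ alpha + B(x) @ beta:
    a kernel expansion plus predetermined basis functions supplied by `basis`."""
    K = np.exp(-gamma * (X[:, None] - X[None, :]) ** 2)
    A = np.hstack([K, basis(X)])
    coef = np.linalg.solve(A.T @ A + reg * np.eye(A.shape[1]), A.T @ y)
    alpha, beta = coef[: len(X)], coef[len(X):]

    def predict(Xn):
        Kn = np.exp(-gamma * (Xn[:, None] - X[None, :]) ** 2)
        return Kn @ alpha + basis(Xn) @ beta
    return predict

# Target with a clear linear trend plus a localized bump; the linear part is
# handed to the predetermined basis (1, x), the kernel part models the rest.
rng = np.random.default_rng(4)
X = np.sort(rng.uniform(-3, 3, 80))
y = 0.5 * X + np.exp(-X ** 2) + rng.normal(0, 0.05, 80)
predict = semiparam_kernel_fit(
    X, y, basis=lambda x: np.column_stack([np.ones_like(x), x]))
mse = np.mean((predict(X) - y) ** 2)
```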