• Title/Summary/Keyword: support vector machine

Prediction Performance of Hybrid Least Square Support Vector Machine with First Principle Knowledge (First Principle을 결합한 최소제곱 Support Vector Machine의 예측 능력)

  • 김병주;심주용;황창하;김일곤
    • Journal of KIISE:Software and Applications
    • /
    • v.30 no.7_8
    • /
    • pp.744-751
    • /
    • 2003
  • A hybrid least-squares Support Vector Machine combined with First Principle (FP) knowledge is proposed. We compare the hybrid least-squares Support Vector Machine (HLS-SVM) with earlier models such as the Hybrid Neural Network (HNN) and the HNN with Extended Kalman Filter (HNN-EKF). In the training and validation stages, the HLS-SVM performs similarly to the HNN-EKF and better than the HNN, whereas in the testing stage it performs about three times better than the HNN-EKF and about a hundred times better than the HNN. A brief illustrative sketch of the hybrid LS-SVM idea follows this entry.
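
A minimal sketch of the hybrid idea, under assumptions of our own: the abstract does not spell out how the FP knowledge is coupled, so this code assumes a serial hybrid in which a crude first-principles model supplies a prior prediction and an LS-SVM (solved as a linear system in its dual form) learns the residual. The toy data, kernel width, and regularization value are illustrative.

```python
# Hybrid LS-SVM sketch: FP model prediction + LS-SVM correction of its residual.
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    # LS-SVM dual: solve [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                     # bias b, dual weights alpha

def lssvm_predict(X_train, X_new, b, alpha, sigma=1.0):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# Toy process: the FP model captures only the linear trend of the true response.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = 2.0 * X[:, 0] + np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=80)
fp_model = lambda X: 2.0 * X[:, 0]             # crude first-principles prediction
b, alpha = lssvm_fit(X, y - fp_model(X))       # LS-SVM models the FP residual

X_test = np.linspace(-3, 3, 200)[:, None]
y_hybrid = fp_model(X_test) + lssvm_predict(X, X_test, b, alpha)
```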

WHEN CAN SUPPORT VECTOR MACHINE ACHIEVE FAST RATES OF CONVERGENCE?

  • Park, Chang-Yi
    • Journal of the Korean Statistical Society
    • /
    • v.36 no.3
    • /
    • pp.367-372
    • /
    • 2007
  • Classification as a tool to extract information from data plays an important role in science and engineering. Among various classification methodologies, the support vector machine has recently seen significant development. The central problem this paper addresses is the accuracy of the support vector machine. In particular, we are interested in situations where fast rates of convergence to the Bayes risk can be achieved by the support vector machine. Through learning examples, we illustrate that the support vector machine may yield fast rates if the space spanned by the adopted kernel is sufficiently large. An empirical sketch of convergence toward the Bayes risk follows this entry.
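
Not the paper's analysis, only a small empirical illustration of the quantity it studies: on a synthetic problem whose Bayes risk is known in closed form, the RBF-kernel SVM's excess risk shrinks as the sample size grows. The data-generating model and hyperparameters are assumptions for the demonstration.

```python
# Excess risk of an RBF SVM versus the known Bayes risk on synthetic data.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def sample(n):
    # Two Gaussian classes with means (+0.8, +0.8) and (-0.8, -0.8); Bayes rule is x1 + x2 > 0.
    y = rng.integers(0, 2, size=n)
    X = rng.normal(size=(n, 2)) + np.where(y[:, None] == 1, 0.8, -0.8)
    return X, y

X_test, y_test = sample(20000)
bayes_risk = np.mean((X_test.sum(axis=1) > 0).astype(int) != y_test)

for n in (50, 200, 1000, 5000):
    X, y = sample(n)
    err = np.mean(SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y).predict(X_test) != y_test)
    print(f"n={n:5d}  SVM test error={err:.3f}  excess risk={err - bayes_risk:+.3f}")
```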

Development of Fuzzy Support Vector Machine and Evaluation of Performance Using Ionosphere Radar Data (Fuzzy Twin Support Vector Machine 개발 및 전리층 레이더 데이터를 통한 성능 평가)

  • Cheon, Min-Kyu;Yoon, Chang-Yong;Kim, Eun-Tai;Park, Mig-Non
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.18 no.4
    • /
    • pp.549-554
    • /
    • 2008
  • The Support Vector Machine (SVM) is a classifier based on statistical learning theory. The Twin Support Vector Machine (TWSVM) is a kind of binary classifier that determines two nonparallel planes by solving two related SVM-type problems. The training time of TWSVM is shorter than that of SVM, yet TWSVM does not perform worse than SVM. This paper proposes a TWSVM with fuzzy membership applied to it and compares its performance with other classifiers on the Ionosphere radar data set. A simplified sketch of the fuzzy-membership idea follows this entry.
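
A simplified sketch of the fuzzy-membership idea, not of TWSVM itself: here the membership values are applied to a standard scikit-learn SVM through per-sample weights, so points far from their class center (likely outliers) influence the boundary less. The membership formula and data set are illustrative assumptions.

```python
# Fuzzy membership as per-sample weights for an SVM classifier.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=2, n_redundant=0,
                           n_clusters_per_class=1, flip_y=0.05, random_state=0)

# Membership: 1 at the class center, decaying toward the most distant class member.
membership = np.empty(len(y))
for c in np.unique(y):
    idx = (y == c)
    d = np.linalg.norm(X[idx] - X[idx].mean(axis=0), axis=1)
    membership[idx] = 1.0 - 0.9 * d / (d.max() + 1e-12)

clf = SVC(kernel="rbf", C=10.0)
clf.fit(X, y, sample_weight=membership)   # weighted (fuzzy) training
print("training accuracy:", clf.score(X, y))
```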

Design of SVM-Based Polynomial Neural Networks Classifier Using Particle Swarm Optimization (입자군집 최적화를 이용한 SVM 기반 다항식 뉴럴 네트워크 분류기 설계)

  • Roh, Seok-Beom;Oh, Sung-Kwun
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.67 no.8
    • /
    • pp.1071-1079
    • /
    • 2018
  • In this study, the design methodology and network architecture of the Support Vector Machine based Polynomial Neural Network, a kind of dynamically generated neural network, are introduced. The SVM-based polynomial neural network is a novel architecture redesigned with the aid of polynomial neural networks and the Support Vector Machine. Generic polynomial neural networks, whose nodes are polynomials, are generated dynamically layer by layer. Each node of the SVM-based polynomial neural network is constructed as a support vector machine, and its nodes and layers are generated dynamically, following the generation process of the generic polynomial neural network. The support vector machine is well known as a robust pattern classifier. In addition, to enhance both the structural flexibility and the classification performance of the proposed classifier, multi-objective particle swarm optimization is used; the optimization algorithm drives the successive generation of each layer of the SVM-based polynomial neural network. Benchmark data sets are used to demonstrate the pattern classification performance of the proposed classifier by comparing its generalization ability with that of previously studied classifiers. A simplified PSO sketch follows this entry.
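
A much-reduced sketch of only the optimization component: the paper uses multi-objective PSO to grow an SVM-node polynomial network layer by layer, whereas this code shows a plain single-objective PSO loop driving an SVM design choice, here the RBF kernel parameters (C, gamma), scored by cross-validated accuracy. Data set, swarm size, and coefficients are assumptions.

```python
# Single-objective PSO over log10(C) and log10(gamma) for an RBF SVM.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
rng = np.random.default_rng(0)

def fitness(p):                              # p = (log10 C, log10 gamma)
    clf = SVC(C=10 ** p[0], gamma=10 ** p[1], kernel="rbf")
    return cross_val_score(clf, X, y, cv=3).mean()

n_particles, n_iter = 12, 15
lo, hi = np.array([-2.0, -6.0]), np.array([3.0, 0.0])
pos = rng.uniform(lo, hi, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmax()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((n_particles, 1)), rng.random((n_particles, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([fitness(p) for p in pos])
    improved = vals > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmax()].copy()

print("best (C, gamma):", 10 ** gbest, " CV accuracy:", pbest_val.max())
```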

Fuzzy One Class Support Vector Machine (퍼지 원 클래스 서포트 벡터 머신)

  • Kim, Ki-Joo;Choi, Young-Sik
    • Journal of Internet Computing and Services
    • /
    • v.6 no.3
    • /
    • pp.159-170
    • /
    • 2005
  • OC-SVM (One-Class Support Vector Machine) avoids solving a full density estimation problem and instead focuses on a simpler task: estimating quantiles of a data distribution, i.e., its support. OC-SVM seeks to estimate the regions where most of the data resides and represents those regions as a function of the support vectors. Although OC-SVM is a powerful method for data description, it is difficult to incorporate human subjective importance into its estimation process. In order to integrate the importance of each point into the OC-SVM process, we propose a fuzzy version of OC-SVM. In FOC-SVM (Fuzzy One-Class Support Vector Machine), we do not treat data points equally; instead we weight data points according to the importance measure of the corresponding objects. That is, we scale the kernel feature vector according to the importance measure of the object, so that the kernel feature vector of a less important object contributes less to the detection process of OC-SVM. We demonstrate the performance of our algorithm on several synthesized data sets, with promising experimental results. A brief sketch of importance-weighted one-class estimation follows this entry.
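
A brief sketch of the importance-weighting intent, not the paper's exact mechanism: the paper scales kernel feature vectors by an importance measure, while this code approximates the effect with per-sample weights in scikit-learn's OneClassSVM, so low-importance points contribute less to the estimated support. Data and weight values are illustrative.

```python
# Importance-weighted one-class SVM: suspect points are down-weighted in the fit.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
inliers = rng.normal(0, 1, size=(200, 2))
suspects = rng.normal(0, 1, size=(20, 2)) + 4.0      # points the user deems unimportant
X = np.vstack([inliers, suspects])

# Importance: trusted inliers get weight 1.0, suspect points a small weight.
importance = np.concatenate([np.ones(len(inliers)), np.full(len(suspects), 0.05)])

ocsvm = OneClassSVM(kernel="rbf", nu=0.1, gamma=0.5)
ocsvm.fit(X, sample_weight=importance)               # weighted support estimation
print("suspects flagged as outliers:",
      (ocsvm.predict(suspects) == -1).sum(), "/", len(suspects))
```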

Development of Fuzzy Support Vector Machine for Pattern Classification (패턴 분류를 위한 Fuzzy Twin Support Vector machine 개발)

  • Cheon, Min-Gyu;Yun, Chang-Yong;Kim, Eun-Tae;Park, Min-Yong
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 2007.11a
    • /
    • pp.279-282
    • /
    • 2007
  • The Support Vector Machine (SVM) is a classifier based on statistical learning theory. The Twin Support Vector Machine (TWSVM) is a kind of binary SVM classifier that determines two nonparallel planes by solving two related SVM-type problems and builds the classifier from these two planes. With this approach, the training time of TWSVM is much shorter than that of SVM, while its performance is not inferior to that of SVM. This paper proposes a TWSVM that applies fuzzy membership to the classifier input and compares it with the previously proposed TWSVM through experiments on two-dimensional vector inputs.

Power Quality Disturbances Identification Method Based on Novel Hybrid Kernel Function

  • Zhao, Liquan;Gai, Meijiao
    • Journal of Information Processing Systems
    • /
    • v.15 no.2
    • /
    • pp.422-432
    • /
    • 2019
  • A hybrid kernel function for the support vector machine is proposed to improve the classification performance on power quality disturbances. The mathematical form of the support vector machine's kernel function directly affects classification performance. Different types of kernel functions have different generalization ability and learning ability, and a single kernel function cannot excel at both. To overcome this problem, we propose a hybrid kernel function composed of two single kernel functions to improve both generalization and learning ability. In simulations, we used single and multiple power quality disturbances, respectively, to test the classification performance of the support vector machine with the proposed hybrid kernel function. Compared with other support vector machine algorithms, the improved algorithm performs better on power quality signals with single and multiple disturbances. A minimal hybrid-kernel sketch follows this entry.
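
A minimal sketch of one common hybrid form, assumed here since the abstract does not give the exact combination: a convex combination of an RBF kernel (strong local learning) and a polynomial kernel (strong global generalization), passed to scikit-learn's SVC as a callable. The mixing weight, kernel parameters, and data set are illustrative.

```python
# Hybrid kernel K = lam * K_rbf + (1 - lam) * K_poly (a valid kernel: nonnegative sum of kernels).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel
from sklearn.svm import SVC

def hybrid_kernel(A, B, lam=0.6, gamma=0.5, degree=2):
    return lam * rbf_kernel(A, B, gamma=gamma) + (1 - lam) * polynomial_kernel(A, B, degree=degree)

X, y = make_classification(n_samples=400, n_features=8, n_informative=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = SVC(kernel=hybrid_kernel, C=1.0).fit(X_tr, y_tr)
print("test accuracy with hybrid kernel:", clf.score(X_te, y_te))
```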

Multi-pattern Classification Using Kernel Bagging-based Import Vector Machine (커널 Bagging기반의 Import Vector Machine을 이용한 다중 패턴 분류)

  • 최준혁;김대수;임기욱
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 2002.12a
    • /
    • pp.275-278
    • /
    • 2002
  • It is well known that the Support Vector Machine proposed by Vapnik performs very well when classifying data with two classes. However, it is not easy to apply the SVM to data with three or more classes (multi-class patterns). To address this limitation of the Support Vector Machine, Zhu proposed the Import Vector Machine for classifying data with three or more classes. This model makes multi-class pattern classification, which is difficult to handle with the Support Vector Machine, possible. The Import Vector Machine uses only kernel-logistic-based functions, but in this paper we apply a bagging technique that evaluates several kernel functions, finds the best-performing kernel, and performs the final classification with it. Experimental results confirm that the proposed method classifies more accurately than the existing method. A rough sketch of the kernel-selection step follows this entry.
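
A rough sketch of only the bagging-over-kernels selection step, not of the Import Vector Machine itself: candidate kernels are scored by out-of-bag accuracy over bootstrap rounds and the best one is kept for the final multi-class model. Kernel logistic regression is approximated with a Nystroem kernel map plus logistic regression; the candidate kernels and data set are assumptions.

```python
# Select the best kernel by bagging-style out-of-bag accuracy, then use it for classification.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.kernel_approximation import Nystroem
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

X, y = load_iris(return_X_y=True)
rng = np.random.default_rng(0)

candidates = {
    "rbf(g=0.1)": Nystroem(kernel="rbf", gamma=0.1, n_components=50, random_state=0),
    "rbf(g=1.0)": Nystroem(kernel="rbf", gamma=1.0, n_components=50, random_state=0),
    "poly(d=2)":  Nystroem(kernel="polynomial", degree=2, n_components=50, random_state=0),
}

scores = {}
for name, mapper in candidates.items():
    oob_accs = []
    for _ in range(10):                                  # bagging rounds
        boot = rng.integers(0, len(X), size=len(X))      # bootstrap sample indices
        oob = np.setdiff1d(np.arange(len(X)), boot)      # out-of-bag indices
        model = make_pipeline(mapper, LogisticRegression(max_iter=1000))
        model.fit(X[boot], y[boot])
        oob_accs.append(model.score(X[oob], y[oob]))
    scores[name] = np.mean(oob_accs)

best = max(scores, key=scores.get)
print("out-of-bag accuracy per kernel:", scores)
print("selected kernel:", best)
```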

A Differential Evolution based Support Vector Clustering (차분진화 기반의 Support Vector Clustering)

  • Jun, Sung-Hae
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.17 no.5
    • /
    • pp.679-683
    • /
    • 2007
  • Statistical learning theory by Vapnik consists of the support vector machine (SVM), support vector regression (SVR), and support vector clustering (SVC) for classification, regression, and clustering, respectively. Among these algorithms, SVC is a good clustering algorithm that uses support vectors based on a Gaussian kernel function. But, like SVM and SVR, SVC needs its kernel parameters and regularization constant to be determined optimally. In general, these parameters have been determined by the researchers' judgment or by grid search, which demands heavy computing time. In this paper, we propose a differential evolution based SVC (DESVC), which combines differential evolution with SVC for efficient selection of the kernel parameters and regularization constant. To verify the improved performance of our DESVC, we run experiments using data sets from the UCI machine learning repository and simulation. A minimal parameter-search sketch follows this entry.
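
A minimal sketch of the parameter-search idea only: SciPy's differential evolution selects the Gaussian kernel width and the regularization parameter of the one-class (domain-description) model underlying support vector clustering. The objective below, which penalizes both a complex boundary (many support vectors) and many rejected points, is an illustrative assumption, not the paper's criterion, and the cluster-labeling step of SVC is omitted.

```python
# Differential evolution over (gamma, nu) for the support-vector domain description step of SVC.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.datasets import make_blobs
from sklearn.svm import OneClassSVM

X, _ = make_blobs(n_samples=300, centers=3, cluster_std=0.7, random_state=0)

def objective(params):
    gamma, nu = params
    model = OneClassSVM(kernel="rbf", gamma=gamma, nu=nu).fit(X)
    sv_frac = len(model.support_) / len(X)           # boundary-complexity proxy
    outlier_frac = np.mean(model.predict(X) == -1)   # points left outside the support
    return sv_frac + outlier_frac

result = differential_evolution(objective, bounds=[(0.01, 5.0), (0.01, 0.5)],
                                maxiter=20, seed=0, polish=False)
print("selected gamma, nu:", result.x, " objective:", result.fun)
```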

Hybrid Learning Algorithm for Improving Performance of Regression Support Vector Machine (회귀용 Support Vector Machine의 성능개선을 위한 조합형 학습알고리즘)

  • Jo, Yong-Hyeon;Park, Chang-Hwan;Park, Yong-Su
    • The KIPS Transactions:PartB
    • /
    • v.8B no.5
    • /
    • pp.477-484
    • /
    • 2001
  • This paper proposes a hybrid learning algorithm combining momentum and the kernel-adatron algorithm to improve the performance of the regression support vector machine. Momentum is utilized for high-speed convergence by restraining oscillation while converging to the optimal solution, and the kernel-adatron algorithm is utilized for its ability to work in nonlinear feature spaces and for its simple implementation. The proposed algorithm has been applied to one-dimensional and two-dimensional nonlinear function regression problems. The simulation results show that the proposed algorithm achieves better learning speed and regression performance than quadratic programming and the plain kernel-adatron algorithm. A brief sketch of the momentum-augmented kernel-adatron update follows this entry.
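
A brief numpy sketch of the combined update rule. The paper applies the scheme to regression SVMs; for compactness this illustrates the same momentum-plus-kernel-adatron idea on the simpler classification form of kernel adatron (bias term omitted). Data, learning rate, momentum coefficient, and box constraint are illustrative assumptions.

```python
# Kernel-adatron multiplier updates with an added momentum term.
import numpy as np

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.6, size=(40, 2)), rng.normal(1, 0.6, size=(40, 2))])
y = np.concatenate([-np.ones(40), np.ones(40)])

def rbf(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

K = rbf(X, X)
alpha = np.zeros(len(y))
prev_step = np.zeros(len(y))
eta, mu, C = 0.1, 0.5, 10.0        # learning rate, momentum coefficient, box constraint

for _ in range(200):               # epochs over the training set
    for i in range(len(y)):
        margin = y[i] * (K[i] @ (alpha * y))             # y_i * f(x_i)
        step = eta * (1.0 - margin) + mu * prev_step[i]  # adatron step plus momentum
        alpha[i] = np.clip(alpha[i] + step, 0.0, C)      # keep multipliers in [0, C]
        prev_step[i] = step

pred = np.sign(K @ (alpha * y))
print("training accuracy:", np.mean(pred == y))
```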
