• Title/Summary/Keyword: kernel function

Search Results: 620

Power Quality Disturbances Identification Method Based on Novel Hybrid Kernel Function

  • Zhao, Liquan; Gai, Meijiao
    • Journal of Information Processing Systems / v.15 no.2 / pp.422-432 / 2019
  • A hybrid kernel function for the support vector machine (SVM) is proposed to improve the classification of power quality disturbances. The kernel function directly affects SVM classification performance: different kernel types trade off generalization ability against learning ability, and no single kernel function excels at both. To overcome this problem, we propose a hybrid kernel composed of two single kernel functions, improving both generalization and learning ability. In simulations, we tested the classification performance of the SVM with the proposed hybrid kernel on both single and multiple power quality disturbances. Compared with other SVM algorithms, the improved algorithm classifies power quality signals with single and multiple disturbances more accurately.

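The abstract above does not state the paper's exact kernel, but a common way to build such a hybrid is a convex combination λ·K_rbf + (1−λ)·K_poly, which is itself a valid kernel. A minimal sketch with scikit-learn, where `hybrid_kernel`, `lam`, and the synthetic data are illustrative choices rather than the authors' setup:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def hybrid_kernel(X, Y, gamma=0.125, degree=2, coef0=1.0, lam=0.7):
    """Convex combination of an RBF kernel (local, strong learning ability)
    and a polynomial kernel (global, strong generalization ability)."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    rbf = np.exp(-gamma * sq)
    poly = (X @ Y.T + coef0) ** degree
    return lam * rbf + (1.0 - lam) * poly

X, y = make_classification(n_samples=300, n_features=8, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# SVC accepts a callable kernel that returns the Gram matrix.
clf = SVC(kernel=hybrid_kernel).fit(Xtr, ytr)
acc = clf.score(Xte, yte)
print(f"test accuracy = {acc:.3f}")
```

Any convex combination of positive-definite kernels is positive definite, so the mixing weight `lam` can be tuned freely in [0, 1] alongside the per-kernel parameters.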
Performance Analysis of Kernel Function for Support Vector Machine

  • Sim, Woo-Sung; Sung, Se-Young; Cheng, Cha-Keon
    • Proceedings of the IEEK Conference / 2009.05a / pp.405-407 / 2009
  • SVM (Support Vector Machine) is a classification method that has recently attracted attention in machine learning. Vapnik, Osuna, Platt and others proposed methods for solving the QP (Quadratic Programming) problem required to train an SVM, which has broadened its field of application. An SVM finds a hyperplane separating two classes after mapping input-space vectors into a feature space via a kernel function. This approach is more systematic and theoretically grounded than neural networks, which are empirical learning methods. Although the SVM has superior generalization ability, its performance depends on the kernel function. Kernel functions fall into three common categories: the polynomial kernel, the RBF (Radial Basis Function) kernel, and the sigmoid kernel. This paper analyzes SVM performance across these kernels using synthetic data.

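The three kernel categories named above can be compared directly in scikit-learn; the synthetic two-moons data set here stands in for the paper's own "virtual data", which is not specified in the abstract:

```python
from sklearn.datasets import make_moons
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# A nonlinearly separable synthetic problem.
X, y = make_moons(n_samples=400, noise=0.25, random_state=0)

results = {}
for kernel in ("poly", "rbf", "sigmoid"):
    scores = cross_val_score(SVC(kernel=kernel, gamma="scale"), X, y, cv=5)
    results[kernel] = scores.mean()
    print(f"{kernel:8s} mean CV accuracy = {results[kernel]:.3f}")
```

On data like this the local RBF kernel typically wins, which matches the usual finding that kernel choice, not just the SVM machinery, drives performance.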
On the Support Vector Machine with the kernel of the q-normal distribution

  • Joguchi, Hirofumi; Tanaka, Masaru
    • Proceedings of the IEEK Conference / 2002.07b / pp.983-986 / 2002
  • The Support Vector Machine (SVM) is a pattern recognition method that separates input data with a hyperplane. It achieves high recognition capability through the so-called kernel trick, in which the Radial Basis Function (RBF) kernel is usually used as the kernel function. In this paper we propose using the q-normal distribution as the kernel function instead of the conventional RBF, and we compare the two types of kernel function.

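The abstract gives no formula, but the q-normal (Tsallis q-Gaussian) density is built from the q-exponential e_q(u) = [1 + (1−q)u]^(1/(1−q)), which recovers exp(u) as q → 1. Assuming the kernel is formed the same way the RBF is formed from the Gaussian, a sketch (the parameter defaults are illustrative):

```python
import numpy as np

def q_gaussian_kernel(X, Y, gamma=0.5, q=1.5):
    """Kernel built from the q-normal (Tsallis q-Gaussian) density:
    K(x, y) = e_q(-gamma * ||x - y||^2) = [1 + (q-1)*gamma*||x - y||^2]^(-1/(q-1)).
    As q -> 1 this reduces to the ordinary RBF kernel exp(-gamma * ||x - y||^2);
    for q > 1 the tails are heavier than the Gaussian's."""
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return (1.0 + (q - 1.0) * gamma * d2) ** (-1.0 / (q - 1.0))

X = np.array([[0.0], [1.0], [2.0]])
print(np.round(q_gaussian_kernel(X, X), 3))
```

The heavier tails mean distant points retain more influence than under the RBF, which is the practical difference the paper's comparison probes.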
THE GREEN FUNCTION AND THE SZEGŐ KERNEL FUNCTION

  • Chung, Young-Bok
    • Honam Mathematical Journal / v.36 no.3 / pp.659-668 / 2014
  • In this paper, we express the Green function in terms of the classical kernel functions of potential theory. In particular, we obtain a formula, written entirely in terms of the Szegő kernel, relating the Green function and the Szegő kernel function on a $C^{\infty}$ smoothly bounded, finitely connected domain in the complex plane.

THE BERGMAN KERNEL FUNCTION AND THE SZEGŐ KERNEL FUNCTION

  • Chung, Young-Bok
    • Journal of the Korean Mathematical Society / v.43 no.1 / pp.199-213 / 2006
  • We compute the holomorphic derivative of the harmonic measure associated to a $C^{\infty}$ bounded domain in the plane and show that the exact Bergman kernel function associated to such a domain relates the derivatives of the Ahlfors map and the Szegő kernel in an explicit way. We find several formulas for the exact Bergman kernel, the Szegő kernel, and the harmonic measure. Finally, we survey some further properties of the holomorphic derivative of the harmonic measure.

Estimation of kernel function using the measured apparent earth resistivity

  • Kim, Ho-Chan; Boo, Chang-Jin; Kang, Min-Jae
    • International Journal of Advanced Smart Convergence / v.9 no.3 / pp.97-104 / 2020
  • In this paper, we propose a method to derive the kernel function directly from measured apparent earth resistivity. The kernel function is obtained by solving a nonlinear system, and nonlinear systems with many variables are difficult to solve, so we also introduce a method for converting the derived nonlinear system into a linear one. The kernel function depends on the depths and resistivities of the earth's layers, so being able to derive an accurate kernel function means we can estimate those earth parameters. We validate the proposed method in simulation on various earth models.

Function Approximation Based on a Network with Kernel Functions of Bounds and Locality : an Approach of Non-Parametric Estimation

  • Kil, Rhee-M.
    • ETRI Journal / v.15 no.2 / pp.35-51 / 1993
  • This paper presents function approximation based on nonparametric estimation. The estimation model is a three-layered network composed of input, hidden, and output layers. The input and output layers have linear activation units, while the hidden layer has nonlinear activation units, or kernel functions, characterized by boundedness and locality. Using this network, a many-to-one function is synthesized over the input space by a number of kernel functions. Both the necessary number of kernel functions and their associated parameters must be estimated. For this purpose, a new parameter estimation method is considered in which a linear learning rule is applied between the hidden and output layers and a nonlinear (piecewise-linear) rule between the input and hidden layers. The linear rule updates the output weights in the sense of the Linear Minimization of Mean Square Error (LMMSE) in the space of kernel functions, while the nonlinear rule updates the kernel function parameters using the gradient of the network output with respect to those parameters (especially their shape). This adaptation scheme yields near-optimal kernel parameters in the mean-square-error sense. As a result, the suggested nonparametric estimation provides efficient function approximation in terms of both the number of kernel functions and learning speed.

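The linear half of the scheme described above, the LMMSE update of the hidden-to-output weights, reduces to ordinary least squares once the kernel units are fixed. A minimal sketch with Gaussian bump units at fixed centers; the target function, center placement, and width are illustrative, and the paper's gradient-based adaptation of the kernel shapes is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples of a smooth one-dimensional target function.
x = rng.uniform(-3, 3, 200)
y = np.sin(x) + 0.05 * rng.standard_normal(200)

# Hidden layer: bounded, local kernel units (Gaussian bumps at fixed centers).
centers = np.linspace(-3, 3, 12)
width = 0.6
H = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * width**2))

# Linear rule between hidden and output layers: least squares = LMMSE solution.
w, *_ = np.linalg.lstsq(H, y, rcond=None)

rmse = np.sqrt(np.mean((H @ w - y) ** 2))
print(f"training RMSE = {rmse:.4f}")
```

Because the output layer is linear in the weights, this step is convex and fast; the hard part the paper addresses is choosing how many kernel units to use and adapting their shapes.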
Kernel method for autoregressive data

  • Shim, Joo-Yong; Lee, Jang-Taek
    • Journal of the Korean Data and Information Science Society / v.20 no.5 / pp.949-954 / 2009
  • In this paper, the autoregressive process is applied to kernel regression in order to infer nonlinear models for predicting responses. We propose a kernel method for autoregressive data that estimates the mean function by kernel machines, together with a model selection method that uses cross-validation to choose the hyperparameters affecting the performance of kernel regression. Artificial and real examples illustrate the usefulness of the proposed method for estimating the mean function in the presence of autocorrelation in the data.

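The paper estimates the mean function with kernel machines; a simpler relative of the same idea, a Nadaraya-Watson smoother applied to lagged values, shows what "kernel regression on autoregressive data" means concretely. The simulated series, mean function, and bandwidth here are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a nonlinear autoregressive series y_t = m(y_{t-1}) + noise,
# with true mean function m(u) = 0.8 * tanh(u).
n = 500
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.8 * np.tanh(y[t - 1]) + 0.3 * rng.standard_normal()

lagged, target = y[:-1], y[1:]

def nw_estimate(u, x, z, h=0.15):
    """Nadaraya-Watson estimate of the mean function m(u) = E[y_t | y_{t-1} = u],
    using a Gaussian kernel of bandwidth h."""
    w = np.exp(-0.5 * ((x - u) / h) ** 2)
    return np.sum(w * z) / np.sum(w)

grid = np.linspace(-0.5, 0.5, 5)
est = np.array([nw_estimate(u, lagged, target) for u in grid])
print(np.round(est, 3))
```

The bandwidth `h` plays the role of the hyperparameters the paper selects by cross-validation; autocorrelation in the data is exactly why naive cross-validation needs care in this setting.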
Kernel Hardening by Recovering the Kernel Stack Frame in the Linux Operating System

  • Jang, Seung-Ju
    • The KIPS Transactions: Part A / v.13A no.3 s.100 / pp.199-204 / 2006
  • A kernel hardening function is necessary for kernel stability, to reduce system errors or panics caused by kernel code errors introduced by developers. Traditional kernel hardening methods, however, are difficult to implement and costly. The suggested kernel hardening function builds a highly available system by changing the panic() function inside the kernel: it guarantees normal system operation by recovering the incorrect address in the kernel stack frame. We tested the function in the network module of Linux by injecting code that forces a panic, and the experiment confirmed that the proposed kernel hardening mechanism works as designed.

Choice of the Kernel Function in Smoothing Moment Restrictions for Dependent Processes

  • Lee, Jin
    • Communications for Statistical Applications and Methods / v.16 no.1 / pp.137-141 / 2009
  • We study the selection of the kernel weighting function used to smooth moment conditions for dependent processes. For hypothesis testing in the Generalized Method of Moments or Generalized Empirical Likelihood context, we find, based on empirical Edgeworth expansions of the long-run variance estimator, that smoothing the moment conditions with the Bartlett kernel yields the smallest size distortions.
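The GMM/GEL moment-smoothing setup itself is not reproduced here, but the Bartlett kernel's role is easiest to see in the closely related long-run variance estimator (the Newey-West form), where it weights the sample autocovariances. A sketch on a simulated AR(1) moment series; the series and bandwidth are illustrative:

```python
import numpy as np

def bartlett_lrv(u, S):
    """Long-run variance estimate for a scalar moment series u_t, smoothing
    the sample autocovariances with Bartlett kernel weights k(j/S) = 1 - j/S
    for j < S (the Newey-West estimator)."""
    u = u - u.mean()
    n = len(u)
    lrv = u @ u / n  # lag-0 autocovariance
    for j in range(1, S):
        lrv += 2.0 * (1.0 - j / S) * (u[j:] @ u[:-j]) / n
    return lrv

rng = np.random.default_rng(0)
# AR(1) with rho = 0.5: true long-run variance = sigma^2 / (1 - rho)^2 = 4.
n, rho = 20000, 0.5
e = rng.standard_normal(n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = rho * u[t - 1] + e[t]

lrv = bartlett_lrv(u, S=50)
print(f"estimated long-run variance = {lrv:.3f}")
```

The triangular Bartlett weights guarantee a nonnegative estimate, one reason this kernel is the default choice in this literature.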