• Title/Summary/Keyword: Kernel Space

Separation of Kernel Space and User Space in Zephyr Kernel (Zephyr 커널에서 커널 공간과 사용자 공간의 분리 구현)

  • Kim, Eunyoung;Shin, Dongha
    • IEMEK Journal of Embedded Systems and Applications / v.13 no.4 / pp.187-194 / 2018
  • An operating system for IoT should have a small memory footprint and provide low-power states, real-time operation, multitasking, various network protocols, and security. The Zephyr kernel, an operating system for IoT released by the Linux Foundation in February 2016, has these features, but errors in user code can cause fatal problems in the system because the Zephyr kernel adopts a single-space design in which user code and kernel code execute in the same space. In this research, we propose a space separation method, which separates kernel space and user space, to solve this problem. The proposed space separation consists of three modifications to the Zephyr kernel. The first is code separation, in which kernel code and user code execute in their own spaces and use different stacks. The second is kernel space protection, which generates an exception via the MPU (Memory Protection Unit) when user code accesses kernel space. The third is the SVC-based system call, which enters the kernel through the exception generated by the SVC instruction. We implemented this space separation in Zephyr v1.8.0 and evaluated its safety through abnormal execution of user code. As a result, the kernel was not crashed by errors generated in user code and continued to execute normally.
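
A minimal conceptual sketch of the call-gate idea described in this abstract follows. It is a Python analogy only (the class, system-call numbers, and handlers are invented for illustration): user code reaches kernel services solely through one validated entry point, which is the role the SVC exception handler and the MPU play in the actual implementation.

```python
# Conceptual analogy, not Zephyr code: user code can reach kernel services
# only through a single dispatch point, mirroring the SVC exception handler.

class Kernel:
    """Holds "kernel space" state that user code must never touch directly."""

    def __init__(self):
        self._uptime_ticks = 0          # kernel-private state
        self._syscalls = {              # hypothetical system-call table
            0: self._sys_uptime,
            1: self._sys_sleep,
        }

    def _sys_uptime(self):
        return self._uptime_ticks

    def _sys_sleep(self, ticks):
        # Arguments coming from user space are validated before use.
        if not isinstance(ticks, int) or ticks < 0:
            raise ValueError("invalid argument from user space")
        self._uptime_ticks += ticks
        return 0

    def svc(self, number, *args):
        """Single entry point: validate the request, then run the handler."""
        handler = self._syscalls.get(number)
        if handler is None:
            raise PermissionError(f"unknown system call {number}")
        return handler(*args)


def user_task(kernel):
    # User code never manipulates kernel state directly; it only issues
    # system calls, so a buggy task cannot corrupt that state.
    kernel.svc(1, 10)        # "sleep" for 10 ticks
    return kernel.svc(0)     # read the uptime


if __name__ == "__main__":
    print(user_task(Kernel()))   # -> 10
```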

KERNEL OPERATORS ON FOCK SPACE

  • Bahn, Chang-Soo;Ko, Chul-Ki;Park, Yong-Moon
    • Journal of the Korean Mathematical Society / v.35 no.3 / pp.527-538 / 1998
  • We study kernel operators (Wick monomials) on the symmetric Fock space. We give optimal conditions on the kernels so that the corresponding kernel operators are densely defined linear operators on the Fock space. We formulate our results in the framework of white noise analysis as much as possible. Most of the results in this paper can be extended to the anti-symmetric Fock space.
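
For orientation, the objects discussed in this abstract are usually written as integral kernel operators in white noise analysis; in the standard notation (which may differ from the paper's own), an operator with kernel $\kappa_{l,m}$ acts on the symmetric Fock space as shown below, and the question is which conditions on $\kappa_{l,m}$ make it densely defined.

```latex
% Standard white-noise form of an integral kernel operator (Wick monomial);
% a^*(s) and a(t) denote creation and annihilation operators at s and t.
\Xi_{l,m}(\kappa_{l,m})
  = \int \kappa_{l,m}(s_1,\dots,s_l,\,t_1,\dots,t_m)\,
         a^{*}(s_1)\cdots a^{*}(s_l)\,a(t_1)\cdots a(t_m)\,
         ds_1\cdots ds_l\,dt_1\cdots dt_m
```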

REPRODUCING KERNEL KREIN SPACES

  • Yang, Mee-Hyea
    • Journal of Applied Mathematics & Informatics / v.8 no.2 / pp.659-668 / 2001
  • Let S(z) be a power series with operator coefficients such that multiplication by S(z) is an everywhere defined transformation on the space C(z) of square summable power series. In this paper we show that there exists a reproducing kernel Krein space which is the state space of an extended canonical linear system with transfer function S(z). We also characterize the reproducing kernel function of the state space of such a linear system.
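
As background for this abstract, the classical reproducing kernel attached to the state space of a canonical linear system with transfer function S(z) (the de Branges-Rovnyak kernel) is recalled below; in the Krein-space setting the positivity requirement on this kernel is dropped. The exact kernel used in the paper may differ.

```latex
% de Branges-Rovnyak kernel of the state space H(S) for a transfer function S(z);
% in a Krein space this kernel is not required to be nonnegative.
K_S(w, z) = \frac{1 - S(z)\,S(w)^{*}}{1 - z\bar{w}}, \qquad |z|, |w| < 1
```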

Subtype classification of Human Breast Cancer via Kernel methods and Pattern Analysis of Clinical Outcome over the feature space (Kernel Methods를 이용한 Human Breast Cancer의 subtype의 분류 및 Feature space에서 Clinical Outcome의 pattern 분석)

  • Kim, Hey-Jin;Park, Seungjin;Bang, Sung-Uang
    • Proceedings of the Korean Information Science Society Conference / 2003.04c / pp.175-177 / 2003
  • This paper addresses the problem of classifying human breast cancer into its subtypes. A main ingredient in our approach is kernel machines such as the support vector machine (SVM), kernel principal component analysis (KPCA), and kernel partial least squares (KPLS). In the task of breast cancer classification, we employ both SVM and KPLS and compare their results. In addition to this classification, we also analyze the patterns of clinical outcomes in the feature space. In order to visualize the clinical outcomes in a low-dimensional space, both KPCA and KPLS are used. It turns out that these methods are useful for identifying correlations between clinical outcomes and the nonlinearly projected expression profiles in the low-dimensional feature space.
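
A hedged sketch of this kind of pipeline is given below. It uses synthetic data as a stand-in for the expression profiles (the data set, kernel choices, and parameter values are invented), classifies with an RBF-kernel SVM, and visualizes samples with kernel PCA; kernel PLS is omitted because scikit-learn does not provide it directly.

```python
# Hedged sketch (not the paper's data or code): kernel machines of the kind
# described above, applied to a synthetic stand-in for expression profiles.
from sklearn.datasets import make_classification
from sklearn.decomposition import KernelPCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic "expression profiles": 100 samples x 200 features, 2 subtypes.
X, y = make_classification(n_samples=100, n_features=200, n_informative=20,
                           n_classes=2, random_state=0)

# 1) Subtype classification with an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
print("SVM 5-fold accuracy:", cross_val_score(clf, X, y, cv=5).mean())

# 2) Visualization of samples in a low-dimensional feature space via kernel PCA.
Z = KernelPCA(n_components=2, kernel="rbf", gamma=1e-3).fit_transform(
    StandardScaler().fit_transform(X))
print("first two kernel principal components of the first samples:\n", Z[:5])
```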

Sparse Representation Learning of Kernel Space Using the Kernel Relaxation Procedure (커널 이완절차에 의한 커널 공간의 저밀도 표현 학습)

  • 류재홍;정종철
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 2001.12a / pp.60-64 / 2001
  • In this paper, a new learning methodology for kernel methods is suggested that results in a sparse representation of the kernel space from the training patterns for classification problems. Among the traditional algorithms for linear discriminant functions (perceptron, relaxation, LMS (least mean squares), pseudoinverse), this paper shows that the relaxation procedure can obtain the maximum-margin separating hyperplane of a linearly separable pattern classification problem, as the SVM (Support Vector Machine) classifier does. The original relaxation method gives only a necessary condition for SV patterns. We suggest a sufficient condition to identify the SV patterns during the learning epochs. Experimental results show that the new method has higher or equivalent performance compared to the conventional approach.
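
The sketch below shows how a relaxation-style margin update can be written in terms of kernel coefficients, so that patterns with nonzero coefficients play the role of the SV patterns mentioned above. It is a generic kernel-perceptron-with-margin rule under invented parameters, not the authors' exact procedure or their sufficient condition.

```python
# Hedged sketch: a generic kernelized relaxation-style update with a margin.
import numpy as np

def rbf(X, Y, gamma=0.5):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_relaxation(X, y, margin=1.0, lr=0.1, epochs=50, gamma=0.5):
    """y in {-1, +1}. Returns dual coefficients alpha for f(x) = sum_i alpha_i y_i K(x_i, x)."""
    n = len(y)
    K = rbf(X, X, gamma)
    alpha = np.zeros(n)
    for _ in range(epochs):
        for i in range(n):
            f_i = (alpha * y) @ K[:, i]          # current decision value
            if y[i] * f_i <= margin:             # margin violated: relax toward it
                alpha[i] += lr
    return alpha

# Toy data: the margin condition concentrates later updates on boundary patterns.
X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [1.2, 0.9]])
y = np.array([-1, -1, 1, 1])
alpha = kernel_relaxation(X, y)
print("dual coefficients:", alpha)
print("decision values  :", (alpha * y) @ rbf(X, X))
```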

Fuzzy Kernel K-Nearest Neighbor Algorithm for Image Segmentation (영상 분할을 위한 퍼지 커널 K-nearest neighbor 알고리즘)

  • Choi Byung-In;Rhee Chung-Hoon
    • Journal of the Korean Institute of Intelligent Systems / v.15 no.7 / pp.828-833 / 2005
  • Kernel methods have been shown to improve the performance of conventional linear classification algorithms for complex, distributed data sets by mapping data in the input space into a higher-dimensional feature space [7]. In this paper, we propose a fuzzy kernel K-nearest neighbor (fuzzy kernel K-NN) algorithm, which applies a kernel-based distance measure in the feature space to the fuzzy K-nearest neighbor (fuzzy K-NN) algorithm. In doing so, the proposed algorithm can enhance the performance of the conventional algorithm by choosing an appropriate kernel function. Results on several data sets and segmentation results for real images are given to show the validity of our proposed algorithm.
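
The core of the method described above is that distances can be evaluated in the kernel-induced feature space without computing the mapping explicitly, since ||phi(x) - phi(y)||^2 = K(x,x) - 2K(x,y) + K(y,y). The sketch below plugs this distance into the standard fuzzy K-NN membership formula; the kernel, its parameters, and the toy data are invented for illustration.

```python
# Hedged sketch of a fuzzy kernel K-NN step using kernel-induced distances.
import numpy as np

def rbf(x, y, gamma=1.0):
    return np.exp(-gamma * np.sum((x - y) ** 2))

def kernel_distance2(x, y, kernel=rbf):
    # Squared distance in feature space: K(x,x) - 2K(x,y) + K(y,y).
    return kernel(x, x) - 2.0 * kernel(x, y) + kernel(y, y)

def fuzzy_kernel_knn(x, X_train, U_train, k=3, m=2.0):
    """U_train[j, c] is the fuzzy membership of training sample j in class c."""
    d2 = np.array([kernel_distance2(x, xi) for xi in X_train])
    nn = np.argsort(d2)[:k]                            # k nearest in feature space
    w = 1.0 / (d2[nn] ** (1.0 / (m - 1.0)) + 1e-12)    # inverse-distance weights
    return (w[:, None] * U_train[nn]).sum(0) / w.sum() # class memberships of x

# Toy example: two classes with crisp training memberships.
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
U_train = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 1.0]])
print(fuzzy_kernel_knn(np.array([0.2, 0.2]), X_train, U_train))
```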

Sparse Representation Learning of Kernel Space Using the Kernel Relaxation Procedure (커널 이완 절차에 의한 커널 공간의 저밀도 표현 학습)

  • 류재홍;정종철
    • Journal of the Korean Institute of Intelligent Systems / v.11 no.9 / pp.817-821 / 2001
  • In this paper, a new learning methodology for kernel methods that results in a sparse representation of the kernel space from the training patterns for classification problems is suggested. Among the traditional algorithms for linear discriminant functions, this paper shows that the relaxation procedure can obtain the maximum-margin separating hyperplane of a linearly separable pattern classification problem, as the SVM (Support Vector Machine) classifier does. The original relaxation method gives only a necessary condition for SV patterns. We suggest a sufficient condition to identify the SV patterns during the learning epochs. For sequential learning of kernel methods, an extended SVM and a kernel discriminant function are defined. A systematic derivation of the learning algorithm is introduced. Experimental results show that the new methods have higher or equivalent performance compared to the conventional approach.

Arrow Diagrams for Kernel Principal Component Analysis

  • Huh, Myung-Hoe
    • Communications for Statistical Applications and Methods / v.20 no.3 / pp.175-184 / 2013
  • Kernel principal component analysis (PCA) maps observations in a nonlinear feature space to a reduced-dimensional plane of principal components. We do not need to specify the feature space explicitly because the procedure uses the kernel trick. In this paper, we propose a graphical scheme to represent variables in kernel principal component analysis. In addition, we propose an index for individual variables to measure their importance in the principal component plane.
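
One simple way to realize such a diagram is sketched below: compute kernel principal component scores and draw, for each original variable, an arrow given by its correlations with the first two components, using the arrow length as a rough importance measure. This is an illustrative proxy on a standard data set, not necessarily the index proposed in the paper.

```python
# Hedged sketch: variable "arrows" on a kernel PCA plane via correlations
# between each original variable and the component scores.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import KernelPCA
from sklearn.preprocessing import StandardScaler

X = StandardScaler().fit_transform(load_iris().data)
Z = KernelPCA(n_components=2, kernel="rbf", gamma=0.1).fit_transform(X)

# Arrow for variable j: (corr(x_j, PC1), corr(x_j, PC2)).
arrows = np.array([[np.corrcoef(X[:, j], Z[:, k])[0, 1] for k in range(2)]
                   for j in range(X.shape[1])])
for name, (a1, a2) in zip(load_iris().feature_names, arrows):
    print(f"{name:25s} arrow = ({a1:+.2f}, {a2:+.2f}), length = {np.hypot(a1, a2):.2f}")
```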

New Fuzzy Inference System Using a Kernel-based Method

  • Kim, Jong-Cheol;Won, Sang-Chul;Suga, Yasuo
    • Proceedings of the Institute of Control, Robotics and Systems (ICROS) Conference / 2003.10a / pp.2393-2398 / 2003
  • In this paper, we propose a new fuzzy inference system for modeling nonlinear systems given input and output data. In the suggested fuzzy inference system, the number of fuzzy rules and the parameter values of the membership functions are automatically decided by a kernel-based method. The kernel-based method performs a linear transformation followed by a kernel mapping: the linear transformation projects the input space into a linearly transformed input space, and the kernel mapping projects the linearly transformed input space into a high-dimensional feature space. The structure of the proposed fuzzy inference system is equal to a Takagi-Sugeno fuzzy model whose input variables are weighted linear combinations of the input variables. In addition, the number of fuzzy rules can be reduced, subject to optimizing a given criterion, by adjusting the linear transformation matrix and the parameter values of the kernel functions using the gradient descent method. Once a structure is selected, the coefficients in the consequent part are determined by the least squares method. Simulation results illustrate the effectiveness of the proposed technique.
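
A reduced sketch of this kind of model is given below: Gaussian-kernel memberships are placed over a linearly transformed input, and the consequent coefficients of a Takagi-Sugeno model are fit by least squares. The transformation matrix, rule centers, and kernel width are fixed by hand here; the paper's gradient-descent adjustment of these quantities is omitted.

```python
# Hedged sketch: Takagi-Sugeno model with Gaussian-kernel memberships over a
# linearly transformed input; consequents are fit by least squares.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1]          # unknown nonlinear system

A = np.array([[1.0, 0.3], [0.0, 1.0]])           # linear transformation (assumed fixed)
Z = X @ A.T                                      # linearly transformed input
centers = Z[rng.choice(len(Z), size=5, replace=False)]   # 5 rules
gamma = 2.0

def memberships(Z):
    d2 = ((Z[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    W = np.exp(-gamma * d2)                      # Gaussian-kernel firing strengths
    return W / W.sum(1, keepdims=True)           # normalized rule weights

# Consequent of rule r: y_r = b_r0 + b_r1*z1 + b_r2*z2; stack weighted regressors.
W = memberships(Z)
Phi = np.hstack([W[:, [r]] * np.hstack([np.ones((len(Z), 1)), Z]) for r in range(5)])
beta, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # least squares for consequents

pred = Phi @ beta
print("training RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
```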

Performance Analysis of Kernel Function for Support Vector Machine (Support Vector Machine에 대한 커널 함수의 성능 분석)

  • Sim, Woo-Sung;Sung, Se-Young;Cheng, Cha-Keon
    • Proceedings of the IEEK Conference / 2009.05a / pp.405-407 / 2009
  • SVM (Support Vector Machine) is a classification method that has recently attracted attention in machine learning. Vapnik, Osuna, Platt, and others suggested methodologies for solving the QP (Quadratic Programming) problem needed to realize SVM, which has extended its field of application. SVM finds a hyperplane that separates two classes by converting input-space vectors into feature-space vectors using a kernel function. This is more systematic and theoretical than a neural network, which is an empirical learning method. Although SVM has a superior generalization characteristic, its performance depends on the kernel function. Kernel functions fall into three categories: the polynomial kernel, the RBF (Radial Basis Function) kernel, and the sigmoid kernel. This paper analyzes the performance of SVM with each kernel using synthetic data.
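
A hedged sketch of such a comparison follows: the three kernel families are evaluated by cross-validation on synthetic ("virtual") data. The data set and parameter values are invented, so the scores only illustrate the procedure, not the paper's results.

```python
# Hedged sketch: compare polynomial, RBF, and sigmoid SVM kernels on synthetic data.
from sklearn.datasets import make_moons
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_moons(n_samples=400, noise=0.25, random_state=0)

kernels = {
    "polynomial": SVC(kernel="poly", degree=3, C=1.0),
    "RBF":        SVC(kernel="rbf", gamma="scale", C=1.0),
    "sigmoid":    SVC(kernel="sigmoid", gamma="scale", coef0=0.0, C=1.0),
}
for name, clf in kernels.items():
    model = make_pipeline(StandardScaler(), clf)
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name:10s} kernel: mean CV accuracy = {score:.3f}")
```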
