• Title/Abstract/Keywords: Learning Support Function


An Analysis of University Students' Needs for Learning Support Functions of Learning Management System Augmented with Artificial Intelligence Technology

  • Jeonghyun, Yun;Taejung, Park
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 17, No. 1 / pp. 1-15 / 2023
  • The aim of this study is to identify intelligent learning support functions in the Learning Management System (LMS) that support university students' learning activities during the transition from face-to-face classes to online learning. To accomplish this, we investigated students' perceptions of the importance and urgency of LMS learning support functions powered by Artificial Intelligence (AI) technology and analyzed differences in perception according to student characteristics. The function that students considered most important and most urgently needed was automated grading and feedback for their writing assignments. The functions with the next highest importance and urgency scores concerned customized feedback and help on both the process and the results of task performance. In addition, students rated a function that provides customized feedback according to their own learning plan and progress, and that suggests improvements by diagnosing their strengths and weaknesses, as both vitally important and urgently needed. On the other hand, the function ranked lowest in importance and urgency was one that analyzes the interaction between professors and students and between fellow students. The results of this needs analysis are expected to help derive the learning support functions that should be developed and to provide basic information for setting priorities when applying AI technology to implement a learner-centered LMS in the future.
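
For readers who want to replicate this kind of needs analysis, the sketch below ranks hypothetical LMS functions by mean importance and urgency ratings; the function names, rating scale, and data are illustrative assumptions, not the study's instrument or results.

```python
# Hypothetical sketch: ranking LMS functions by mean importance and urgency
# from Likert-style survey responses. Column names and data are illustrative,
# not taken from the study's questionnaire.
import pandas as pd

responses = pd.DataFrame({
    "function":   ["auto_grading", "auto_grading", "custom_feedback",
                   "custom_feedback", "interaction_analysis", "interaction_analysis"],
    "importance": [5, 4, 4, 4, 2, 3],   # 1-5 Likert scale (assumed)
    "urgency":    [5, 5, 4, 3, 2, 2],
})

# Mean importance/urgency per function, sorted to surface the top priorities.
ranking = (responses.groupby("function")[["importance", "urgency"]]
           .mean()
           .sort_values(["importance", "urgency"], ascending=False))
print(ranking)
```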

Power Quality Disturbances Identification Method Based on Novel Hybrid Kernel Function

  • Zhao, Liquan;Gai, Meijiao
    • Journal of Information Processing Systems / Vol. 15, No. 2 / pp. 422-432 / 2019
  • A hybrid kernel function for the support vector machine is proposed to improve the classification of power quality disturbances. The kernel function of a support vector machine directly affects its classification performance, and different types of kernel functions have different generalization and learning abilities; a single kernel function cannot excel at both. To overcome this problem, we propose a hybrid kernel function composed of two single kernel functions to improve both generalization and learning ability. In simulations, we used single and multiple power quality disturbances to test the classification performance of the support vector machine algorithm with the proposed hybrid kernel function. Compared with other support vector machine algorithms, the improved algorithm performs better in classifying power quality signals with single and multiple disturbances.
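
As a rough illustration of the hybrid-kernel idea (not the authors' exact formulation), the sketch below combines an RBF and a polynomial kernel as a weighted sum and plugs it into scikit-learn's SVC as a callable kernel; the weight, gamma, and degree values are assumptions.

```python
# Illustrative hybrid kernel: a convex combination of an RBF kernel (local) and
# a polynomial kernel (global), passed to SVC as a callable Gram-matrix kernel.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def hybrid_kernel(X, Y, weight=0.7, gamma=0.5, degree=3):
    # weight * RBF kernel + (1 - weight) * polynomial kernel
    return weight * rbf_kernel(X, Y, gamma=gamma) + \
           (1.0 - weight) * polynomial_kernel(X, Y, degree=degree)

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
clf = SVC(kernel=hybrid_kernel, C=1.0)
print(cross_val_score(clf, X, y, cv=5).mean())
```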

함수 근사를 위한 점증적 서포트 벡터 학습 방법 (Incremental Support Vector Learning Method for Function Approximation)

  • 임채환;박주영
    • The Institute of Electronics Engineers of Korea: Conference Proceedings / Proceedings of the 2002 IEEK Summer Conference (3) / pp. 135-138 / 2002
  • This paper addresses an incremental learning method for regression. The SVM (support vector machine) is a recently proposed learning method. In general, training a support vector machine requires solving a QP (quadratic programming) problem. For very large or incremental datasets, solving QP problems may be impractical. This paper therefore presents an incremental support vector learning method for function approximation problems.
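
A minimal sketch of the incremental idea is given below. It is not the paper's QP-based incremental SVM; it uses SGDRegressor with an epsilon-insensitive loss over random Fourier features as a stand-in, only to show the model being updated chunk by chunk rather than re-solving one large QP.

```python
# Incremental, epsilon-insensitive regression for function approximation,
# trained chunk by chunk with partial_fit (a surrogate for incremental SVR).
import numpy as np
from sklearn.kernel_approximation import RBFSampler
from sklearn.linear_model import SGDRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(2000, 1))
y = np.sin(X).ravel() + 0.1 * rng.normal(size=2000)

features = RBFSampler(gamma=1.0, n_components=200, random_state=0)
Z = features.fit_transform(X)

model = SGDRegressor(loss="epsilon_insensitive", epsilon=0.05, random_state=0)
for start in range(0, len(Z), 200):          # feed the data in small chunks
    model.partial_fit(Z[start:start + 200], y[start:start + 200])

print("approx. sin(1.0):", model.predict(features.transform([[1.0]]))[0])
```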


Improvement of Support Vector Clustering using Evolutionary Programming and Bootstrap

  • Jun, Sung-Hae
    • International Journal of Fuzzy Logic and Intelligent Systems / Vol. 8, No. 3 / pp. 196-201 / 2008
  • Statistical learning theory provides three analytical tools: the support vector machine, support vector regression, and support vector clustering, for classification, regression, and clustering respectively. In general their performance is good because they are formulated as convex optimization problems. However, these methods share a problem: the kernel function parameters and the regularization constant are determined subjectively, by the judgment of the researcher, and the results of the learning machines depend on the selected parameters. In this paper, we propose an efficient method for objectively determining the parameters of support vector clustering, the clustering method of statistical learning theory. Using an evolutionary algorithm and the bootstrap method, we select the kernel function parameters and the regularization constant objectively. To verify the improved performance of the proposed approach, we compare our method with established learning algorithms on data sets from the UCR machine learning repository and on synthetic data.
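
The sketch below illustrates the general recipe of evolutionary parameter selection scored by a bootstrap criterion. OneClassSVM stands in for the kernel-based domain description underlying support vector clustering, and the stability-based fitness and mutation scheme are assumptions, not the authors' exact procedure.

```python
# Evolutionary-programming-style search over (gamma, nu) with a bootstrap
# stability score as fitness; OneClassSVM is used as a simple stand-in.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X, _ = make_blobs(n_samples=200, centers=3, random_state=0)

def bootstrap_stability(gamma, nu, n_boot=10):
    """Average pairwise agreement of inlier/outlier labels across bootstrap fits."""
    labels = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(X), len(X))            # bootstrap resample
        model = OneClassSVM(kernel="rbf", gamma=gamma, nu=nu).fit(X[idx])
        labels.append(model.predict(X))                  # labels on the full data
    labels = np.array(labels)
    agree = [np.mean(labels[i] == labels[j])
             for i in range(n_boot) for j in range(i + 1, n_boot)]
    return np.mean(agree)

# Mutate a small population of (gamma, nu) pairs and keep the fittest candidates.
population = [(10 ** rng.uniform(-2, 1), rng.uniform(0.05, 0.5)) for _ in range(6)]
for _ in range(10):
    offspring = [(g * np.exp(0.3 * rng.normal()),
                  float(np.clip(n + 0.05 * rng.normal(), 0.01, 0.9)))
                 for g, n in population]
    scored = sorted(population + offspring,
                    key=lambda p: bootstrap_stability(*p), reverse=True)
    population = scored[:6]

print("selected (gamma, nu):", population[0])
```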

e-Learning 콘텐츠의 남북한 표준언어 지원시스템 연구 (A study on Support System for Standard Korean Language of e-Learning Contents)

  • 최성;정지문;유갑상
    • Journal of Digital Convergence / Vol. 5, No. 2 / pp. 25-36 / 2007
  • In this paper, we study the effective structure of an e-Learning Korean-language support system for foreigners, built on computer systems that comply with the IMS/AICC international standards based on LCMS and SCORM. The most important task in this study is to support a self-study module derived from an analysis of Korean learning results and learning customs. We studied effective PMS detail modules as well as the Standard Competency Module Management System related to the LMS/LCMS, an Individual Competency Management System for learning, a Competency Registry/Repository System, a Knowledge Management System based on Community Competency Modules, an Education e-Survey System, and a Module Learning Support Service System. We suggest an effective standard model of a Korean-language learning support system that adopts various techniques for foreigners.


Estimating Regression Function with $\varepsilon$-Insensitive Supervised Learning Algorithm

  • Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society / Vol. 15, No. 2 / pp. 477-483 / 2004
  • One of the major paradigms for supervised learning in the neural network community is back-propagation learning. Standard implementations of back-propagation learning are optimal under the assumption of independent and identically distributed Gaussian noise. In this paper, for regression function estimation, we introduce an $\varepsilon$-insensitive back-propagation learning algorithm, which corresponds to minimizing the least absolute error. We compare this algorithm with the support vector machine (SVM), another $\varepsilon$-insensitive supervised learning algorithm that has been very successful in pattern recognition and function estimation problems. For the comparison, we consider a more realistic model that allows the noise variance itself to depend on the input variables.
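
A minimal numpy sketch of epsilon-insensitive back-propagation follows: a one-hidden-layer network trained with the subgradient of the epsilon-insensitive loss (zero inside the tube, the sign of the residual outside). The network size, epsilon, and learning rate are illustrative choices, not the paper's settings.

```python
# One-hidden-layer regression network trained by back-propagating the
# subgradient of the epsilon-insensitive loss.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X).ravel() + 0.1 * rng.normal(size=500)

n_hidden, eps, lr = 20, 0.05, 0.05
W1 = rng.normal(scale=0.5, size=(1, n_hidden)); b1 = np.zeros(n_hidden)
w2 = rng.normal(scale=0.5, size=n_hidden);      b2 = 0.0

for _ in range(5000):
    H = np.tanh(X @ W1 + b1)                          # hidden activations
    r = H @ w2 + b2 - y                               # residuals
    g = np.where(np.abs(r) > eps, np.sign(r), 0.0)    # epsilon-insensitive subgradient
    # Back-propagate the subgradient through the network.
    grad_w2 = H.T @ g / len(X)
    grad_b2 = g.mean()
    dH = np.outer(g, w2) * (1 - H ** 2)
    grad_W1 = X.T @ dH / len(X)
    grad_b1 = dH.mean(axis=0)
    w2 -= lr * grad_w2; b2 -= lr * grad_b2
    W1 -= lr * grad_W1; b1 -= lr * grad_b1

print("prediction at x = 1.0:",
      (np.tanh(np.array([[1.0]]) @ W1 + b1) @ w2 + b2)[0])
```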


COMPARATIVE STUDY OF THE PERFORMANCE OF SUPPORT VECTOR MACHINES WITH VARIOUS KERNELS

  • Nam, Seong-Uk;Kim, Sangil;Kim, HyunMin;Yu, YongBin
    • East Asian Mathematical Journal / Vol. 37, No. 3 / pp. 333-354 / 2021
  • A support vector machine (SVM) is a state-of-the-art machine learning model rooted in structural risk minimization. SVM is underestimated with regard to its application to real-world problems because of the difficulties associated with its use. We aim to show that the performance of SVM depends strongly on which kernel function is used. To achieve this, after providing a summary of support vector machines and kernel functions, we constructed experiments with various benchmark datasets to compare the performance of various kernel functions. To evaluate the performance of SVM, the F1-score and its standard deviation under 10-fold cross-validation were used. Furthermore, we used Taylor diagrams to reveal the differences between kernels. Finally, we provide Python code for all our experiments to enable re-implementation.
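
The comparison protocol described above can be reproduced roughly as follows; the dataset and hyperparameters here are illustrative stand-ins for the paper's benchmarks.

```python
# Compare SVC kernels by mean and standard deviation of the F1-score
# under 10-fold cross-validation.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
for kernel in ["linear", "poly", "rbf", "sigmoid"]:
    clf = make_pipeline(StandardScaler(), SVC(kernel=kernel, C=1.0))
    scores = cross_val_score(clf, X, y, cv=10, scoring="f1")
    print(f"{kernel:8s}  F1 = {scores.mean():.3f} +/- {scores.std():.3f}")
```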

Simple Graphs for Complex Prediction Functions

  • Huh, Myung-Hoe;Lee, Yong-Goo
    • Communications for Statistical Applications and Methods / Vol. 15, No. 3 / pp. 343-351 / 2008
  • By supervised learning with p predictors, we frequently obtain a prediction function of the form $y = f(x_1, \ldots, x_p)$. When $p \geq 3$, it is not easy to understand the inner structure of f, except when the function is additive. In this study, we propose using p simple graphs for visual understanding of complex prediction functions produced by supervised learning engines such as LOESS, neural networks, support vector machines, and random forests.
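
One way to draw the p simple graphs described above is to vary a single predictor over its observed range while holding the other predictors at their medians. The sketch below does this for a random forest; the model and data are illustrative, not the engines or data used in the paper.

```python
# Draw p one-dimensional profile graphs of a black-box prediction function:
# vary one predictor over its range, hold the others at their medians.
import matplotlib.pyplot as plt
import numpy as np
from sklearn.datasets import make_friedman1
from sklearn.ensemble import RandomForestRegressor

X, y = make_friedman1(n_samples=500, n_features=5, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X, y)

fig, axes = plt.subplots(1, X.shape[1], figsize=(15, 3), sharey=True)
medians = np.median(X, axis=0)
for j, ax in enumerate(axes):
    grid = np.linspace(X[:, j].min(), X[:, j].max(), 50)
    X_tmp = np.tile(medians, (len(grid), 1))   # hold other predictors at medians
    X_tmp[:, j] = grid
    ax.plot(grid, model.predict(X_tmp))
    ax.set_xlabel(f"x{j + 1}")
axes[0].set_ylabel("f(x)")
plt.tight_layout()
plt.show()
```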

정해진 기저함수가 포함되는 Nu-SVR 학습방법 (Nu-SVR Learning with Predetermined Basis Functions Included)

  • 김영일;조원희;박주영
    • Journal of the Korean Institute of Intelligent Systems / Vol. 13, No. 3 / pp. 316-321 / 2003
  • Recently, support vector learning has attracted considerable attention in fields such as pattern classification, function approximation, and novelty detection. Among the various support vector learning methods, the so-called nu-versions are known to be particularly useful when the number of support vectors needs to be controlled. In this paper, we consider function approximation problems that use both the nu-version support vector learning method known as $\nu$-SVR and predetermined basis functions. After reviewing $\varepsilon$-SVR, $\nu$-SVR, and semi-parametric function approximation methodologies, we present a way to extend the existing $\nu$-SVR method so that predetermined basis functions can be used. The applicability of the proposed method is illustrated with examples.
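
A hedged two-stage approximation of the semi-parametric idea is sketched below: fit the predetermined basis functions by least squares, then fit $\nu$-SVR on the residuals. The paper extends the $\nu$-SVR optimization itself to include the basis terms; this staged version only illustrates combining a fixed basis with NuSVR.

```python
# Two-stage semi-parametric sketch: parametric fit on predetermined basis
# functions, then NuSVR on the residuals (not the paper's joint formulation).
import numpy as np
from sklearn.svm import NuSVR

rng = np.random.default_rng(0)
x = rng.uniform(0, 2 * np.pi, size=300)
y = 0.5 * x + np.sin(x) + 0.1 * rng.normal(size=300)   # linear trend + nonlinear part

B = np.column_stack([np.ones_like(x), x])      # predetermined basis: 1 and x
coef, *_ = np.linalg.lstsq(B, y, rcond=None)   # parametric part
residual = y - B @ coef

svr = NuSVR(nu=0.5, C=10.0, kernel="rbf", gamma=1.0).fit(x[:, None], residual)

x_new = np.array([[1.0], [4.0]])
B_new = np.column_stack([np.ones(len(x_new)), x_new.ravel()])
print(B_new @ coef + svr.predict(x_new))       # semi-parametric prediction
```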

Deep LS-SVM for regression

  • Hwang, Changha;Shim, Jooyong
    • Journal of the Korean Data and Information Science Society / Vol. 27, No. 3 / pp. 827-833 / 2016
  • In this paper, we propose a deep least squares support vector machine (LS-SVM) for regression problems, which consists of an input layer and a hidden layer. In the hidden layer, LS-SVMs are trained with the original input variables and perturbed responses. For the final output, the main LS-SVM is trained with the outputs from the hidden-layer LS-SVMs as input variables and the original responses. In contrast to the multilayer neural network (MNN), the LS-SVMs in the deep LS-SVM are trained to minimize a penalized objective function; thus the learning dynamics of the deep LS-SVM are entirely different from those of an MNN, in which all weights and biases are trained to minimize one final error function. Compared with MNN approaches, the deep LS-SVM does not use any combination weights but trains all LS-SVMs in the architecture. Experimental results on real datasets illustrate that the deep LS-SVM significantly outperforms state-of-the-art machine learning methods on regression problems.
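
The architecture described above can be sketched as follows, using KernelRidge as a stand-in for LS-SVM regression (both solve a regularized least-squares problem in a kernel feature space, though the bias handling differs). The number of hidden models and the noise scale are assumptions.

```python
# Deep LS-SVM-style architecture: a hidden layer of kernel regressors trained
# on perturbed responses, and a main kernel regressor trained on their outputs.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(400, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.normal(size=400)

# Hidden layer: LS-SVM surrogates trained on the inputs with perturbed responses.
hidden = []
for _ in range(5):
    y_perturbed = y + 0.2 * rng.normal(size=len(y))
    hidden.append(KernelRidge(kernel="rbf", alpha=0.1, gamma=0.5).fit(X, y_perturbed))

# Output layer: the main surrogate trained on hidden outputs and the original y.
H = np.column_stack([m.predict(X) for m in hidden])
main = KernelRidge(kernel="rbf", alpha=0.1, gamma=0.5).fit(H, y)

X_new = np.array([[1.0, -1.0]])
H_new = np.column_stack([m.predict(X_new) for m in hidden])
print("deep LS-SVM-style prediction:", main.predict(H_new)[0])
```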