• Title/Summary/Keyword: support vector regression.


Recognition of rolling bearing fault patterns and sizes based on two-layer support vector regression machines

  • Shen, Changqing;Wang, Dong;Liu, Yongbin;Kong, Fanrang;Tse, Peter W.
    • Smart Structures and Systems
    • /
    • v.13 no.3
    • /
    • pp.453-471
    • /
    • 2014
  • The fault diagnosis of rolling element bearings has drawn considerable research attention in recent years because these fundamental elements frequently suffer failures that can result in unexpected machine breakdowns. Artificial intelligence algorithms such as artificial neural networks (ANNs) and support vector machines (SVMs) have been widely investigated to identify various faults. However, as a bearing's useful life deteriorates, identifying early faults and evaluating how far they have developed is necessary for timely maintenance actions that prevent accidents. This study proposes a new two-layer structure consisting of support vector regression machines (SVRMs) to recognize bearing fault patterns and track fault sizes. Statistical parameters that track the fault evolution are first extracted to condense the original vibration signals into a few compact features. The extracted features are then used to train the proposed two-layer SVRM structure. Once the parameters of the two-layer SVRM structure are determined, features extracted from other vibration signals can be used to predict unknown bearing health conditions. The effectiveness of the proposed method is validated on experimental datasets collected from a test rig. The results demonstrate that the proposed method is highly accurate in differentiating between fault patterns and determining fault severities. Comparisons further show that the proposed method outperforms some existing methods.
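The following is a minimal, hedged sketch of the two-layer idea described above, not the authors' exact architecture: synthetic vibration signals are condensed into a few statistical features, a first SVR regresses a numeric fault-pattern code, and a second, per-pattern SVR tracks the fault size. The signal model, feature choices, and hyperparameters are illustrative assumptions.

```python
# Hedged sketch of a two-layer SVR arrangement on synthetic "vibration features".
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

def statistical_features(signal):
    """Condense a raw vibration signal into a few compact statistics."""
    return np.array([signal.mean(), signal.std(),
                     np.abs(signal).max(),                                     # peak
                     ((signal - signal.mean()) ** 4).mean() / signal.var() ** 2])  # kurtosis

# Synthetic training data: 3 fault patterns (coded 0, 1, 2), each with a fault size.
X, y_pattern, y_size = [], [], []
for pattern in range(3):
    for size in np.linspace(0.1, 1.0, 20):
        sig = size * np.sin(np.linspace(0, 20 * (pattern + 1), 2048)) \
              + 0.1 * rng.standard_normal(2048)
        X.append(statistical_features(sig)); y_pattern.append(pattern); y_size.append(size)
X, y_pattern, y_size = map(np.array, (X, y_pattern, y_size))

# Layer 1: one SVR regresses a numeric fault-pattern code from the features.
layer1 = SVR(kernel="rbf", C=10.0).fit(X, y_pattern)

# Layer 2: one SVR per pattern tracks the fault size.
layer2 = {p: SVR(kernel="rbf", C=10.0).fit(X[y_pattern == p], y_size[y_pattern == p])
          for p in range(3)}

def predict(features):
    """Predict (fault pattern, fault size) for one feature vector."""
    pattern = int(round(float(layer1.predict(features[None, :])[0])))
    pattern = min(max(pattern, 0), 2)
    return pattern, float(layer2[pattern].predict(features[None, :])[0])
```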

Support vector expectile regression using IRWLS procedure

  • Choi, Kook-Lyeol;Shim, Jooyong;Seok, Kyungha
    • Journal of the Korean Data and Information Science Society
    • /
    • v.25 no.4
    • /
    • pp.931-939
    • /
    • 2014
  • In this paper we propose an iteratively reweighted least squares (IRWLS) procedure to solve the quadratic programming problem of support vector expectile regression with an asymmetrically weighted squared loss function. The proposed procedure enables the appropriate hyperparameters to be selected easily using the generalized cross validation function. Through numerical studies on artificial and real data sets, we show the effectiveness of the proposed method in terms of estimation performance.
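Below is a minimal sketch of the IRWLS idea for kernel expectile regression with an asymmetrically weighted squared loss, assuming an RBF kernel and illustrative hyperparameters (tau, lam, gamma); it is not the authors' exact formulation and omits their GCV-based tuning.

```python
# Hedged sketch: kernel expectile regression fitted by iteratively
# reweighted least squares with asymmetric squared-error weights.
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_expectile_irwls(X, y, tau=0.7, lam=1e-2, gamma=1.0, n_iter=50):
    """Return dual coefficients alpha for f(x) = sum_i alpha_i k(x_i, x)."""
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(n_iter):
        residual = y - K @ alpha
        w = np.where(residual >= 0, tau, 1.0 - tau)   # asymmetric weights
        W = np.diag(w)
        # Weighted penalized least squares step: (K W K + lam K) alpha = K W y
        alpha_new = np.linalg.solve(K @ W @ K + lam * K + 1e-8 * np.eye(n), K @ W @ y)
        if np.allclose(alpha_new, alpha, atol=1e-6):
            break
        alpha = alpha_new
    return alpha

# Toy usage on synthetic data.
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sin(X[:, 0]) + 0.3 * rng.standard_normal(60)
alpha = fit_expectile_irwls(X, y, tau=0.8)
y_hat = rbf_kernel(X, X) @ alpha
```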

Integration of Customer Response Prediction Models in Direct Marketing Using Case-Based Reasoning (사례기반추론을 이용한 다이렉트 마케팅의 고객반응예측모형의 통합)

  • Hong, Taeho;Park, Jiyoung
    • The Journal of Information Systems
    • /
    • v.18 no.3
    • /
    • pp.375-399
    • /
    • 2009
  • In this study, we propose an integrated model that combines logistic regression, artificial neural networks, and support vector machines (SVM) with case-based reasoning (CBR). Predicting respondents in direct marketing is a binary classification problem, like bankruptcy prediction, intrusion detection, or churn management. To solve this binary problem, we employed logistic regression, artificial neural networks, SVM, and CBR. CBR is a problem-solving technique that shows significant promise for improving the effectiveness of complex and unstructured decision making, and it produced excellent results in this study. Experimental results show that the classification accuracy of the integrated model using CBR is superior to logistic regression, artificial neural networks, and SVM alone. When applying the customer response model to predict respondents in direct marketing, misclassification must also be considered from a profit/cost point of view. A hedged sketch of this kind of integration appears after this entry.

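Below is a hedged sketch of one way to integrate logistic regression, a neural network, and an SVM with a CBR-style retrieval step: similar past cases are retrieved with a k-nearest-neighbour search and each base classifier is weighted by its accuracy on those cases. The retrieval rule and weighting scheme are assumptions for illustration, not the paper's exact integration rule.

```python
# Hedged sketch: CBR-style integration of three base classifiers.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.neighbors import NearestNeighbors
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "logit": LogisticRegression(max_iter=1000).fit(X_tr, y_tr),
    "ann": MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X_tr, y_tr),
    "svm": SVC(probability=True, random_state=0).fit(X_tr, y_tr),
}

# "Case base" retrieval: nearest neighbours in the training set.
retriever = NearestNeighbors(n_neighbors=15).fit(X_tr)

def cbr_integrated_predict(x):
    """Weight each base model by its accuracy on the retrieved similar cases."""
    _, idx = retriever.kneighbors(x[None, :])
    idx = idx[0]
    weights, probs = [], []
    for m in models.values():
        weights.append((m.predict(X_tr[idx]) == y_tr[idx]).mean())
        probs.append(m.predict_proba(x[None, :])[0, 1])
    weights = np.array(weights) / (np.sum(weights) + 1e-12)
    return int(np.dot(weights, probs) >= 0.5)

pred = np.array([cbr_integrated_predict(x) for x in X_te])
print("integrated accuracy:", (pred == y_te).mean())
```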

Deep LS-SVM for regression

  • Hwang, Changha;Shim, Jooyong
    • Journal of the Korean Data and Information Science Society
    • /
    • v.27 no.3
    • /
    • pp.827-833
    • /
    • 2016
  • In this paper, we propose a deep least squares support vector machine (LS-SVM) for regression problems, which consists of an input layer and a hidden layer. In the hidden layer, LS-SVMs are trained with the original input variables and perturbed responses. For the final output, the main LS-SVM is trained with the outputs of the hidden-layer LS-SVMs as input variables and the original responses. In contrast to a multilayer neural network (MNN), the LS-SVMs in the deep LS-SVM are trained to minimize a penalized objective function. Thus, the learning dynamics of the deep LS-SVM are entirely different from those of an MNN, in which all weights and biases are trained to minimize one final error function. Compared to MNN approaches, the deep LS-SVM does not use any combination weights but trains all LS-SVMs in the architecture. Experimental results on real datasets illustrate that the deep LS-SVM significantly outperforms state-of-the-art machine learning methods on regression problems.
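A minimal sketch of the deep LS-SVM idea under stated assumptions: a small LS-SVM regressor solved via its standard dual linear system, several hidden-layer copies fitted to perturbed responses, and a final LS-SVM fitted to their outputs. The kernel width, regularization, perturbation size, and number of hidden LS-SVMs are illustrative, not the authors' settings.

```python
# Hedged sketch of a deep LS-SVM for regression on synthetic data.
import numpy as np

def rbf(A, B, gamma=1.0):
    return np.exp(-gamma * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))

class LSSVR:
    """Least squares SVM regression via its standard dual linear system."""
    def __init__(self, gamma=1.0, C=10.0):
        self.gamma, self.C = gamma, C
    def fit(self, X, y):
        n = len(y)
        K = rbf(X, X, self.gamma)
        A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                      [np.ones((n, 1)), K + np.eye(n) / self.C]])
        sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
        self.b, self.alpha, self.X = sol[0], sol[1:], X
        return self
    def predict(self, Xnew):
        return rbf(Xnew, self.X, self.gamma) @ self.alpha + self.b

rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sinc(X[:, 0]) + 0.1 * rng.standard_normal(100)

# Hidden layer: LS-SVMs trained on the original inputs and perturbed responses.
hidden = [LSSVR(gamma=0.5).fit(X, y + 0.05 * rng.standard_normal(100)) for _ in range(5)]
H = np.column_stack([m.predict(X) for m in hidden])

# Output layer: the main LS-SVM uses the hidden outputs as its input variables.
top = LSSVR(gamma=0.5).fit(H, y)
y_hat = top.predict(H)
```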

Estimating Regression Function with $\varepsilon$-Insensitive Supervised Learning Algorithm

  • Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society
    • /
    • v.15 no.2
    • /
    • pp.477-483
    • /
    • 2004
  • One of the major paradigms for supervised learning in the neural network community is back-propagation learning. Standard implementations of back-propagation learning are optimal under the assumption of independent and identically distributed Gaussian noise. In this paper, for regression function estimation, we introduce an $\varepsilon$-insensitive back-propagation learning algorithm, which corresponds to minimizing the least absolute error. We compare this algorithm with the support vector machine (SVM), another $\varepsilon$-insensitive supervised learning algorithm that has been very successful in pattern recognition and function estimation problems. For comparison, we consider a more realistic model in which the noise variance itself depends on the input variables. A minimal sketch of the $\varepsilon$-insensitive back-propagation idea follows this entry.

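The sketch below trains a one-hidden-layer network by back-propagation using the $\varepsilon$-insensitive loss $\max(0, |e| - \varepsilon)$, whose subgradient is zero inside the tube. The architecture, learning rate, $\varepsilon$, and data are illustrative assumptions.

```python
# Hedged sketch: back-propagation with an epsilon-insensitive loss on synthetic data.
import numpy as np

rng = np.random.default_rng(3)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.2 * rng.standard_normal(200)

n_hidden, eps, lr = 20, 0.1, 0.01
W1 = 0.5 * rng.standard_normal((1, n_hidden)); b1 = np.zeros(n_hidden)
W2 = 0.5 * rng.standard_normal(n_hidden);      b2 = 0.0

for epoch in range(2000):
    H = np.tanh(X @ W1 + b1)                 # hidden activations
    y_hat = H @ W2 + b2
    r = y_hat - y
    # Subgradient of max(0, |r| - eps) w.r.t. the prediction: zero inside the tube.
    g = np.where(np.abs(r) > eps, np.sign(r), 0.0)
    dW2 = H.T @ g / len(y); db2 = g.mean()
    dH = np.outer(g, W2) * (1 - H ** 2)      # back-propagate through tanh
    dW1 = X.T @ dH / len(y); db1 = dH.mean(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1
```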

A Study on the Short-term Load Forecasting using Support Vector Machine (지원벡터머신을 이용한 단기전력 수요예측에 관한 연구)

  • Jo, Nam-Hoon;Song, Kyung-Bin;Roh, Young-Su;Kang, Dae-Seung
    • The Transactions of the Korean Institute of Electrical Engineers A
    • /
    • v.55 no.7
    • /
    • pp.306-312
    • /
    • 2006
  • The Support Vector Machine (SVM), whose foundations were developed by Vapnik (1995), is gaining popularity thanks to many attractive features and promising empirical performance. In this paper, we propose a new short-term load forecasting technique based on SVM. We discuss the selection of the SVM input vector for load forecasting and analyze the prediction performance for various SVM parameters such as the kernel function, the cost coefficient C, and $\varepsilon$ (the width of the $\varepsilon$-tube). Computer simulations show that the prediction performance of the proposed method is superior to that of conventional neural networks.
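A minimal sketch of SVR-based short-term load forecasting, assuming lagged hourly loads as the input vector; the synthetic load series and the kernel, C, and $\varepsilon$ settings are illustrative only, not the paper's configuration.

```python
# Hedged sketch: SVR short-term load forecasting with lagged loads as inputs.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(4)
hours = np.arange(24 * 60)                       # 60 days of hourly "load"
load = 100 + 20 * np.sin(2 * np.pi * hours / 24) + 5 * rng.standard_normal(len(hours))

# Input vector: the previous 24 hourly loads; target: the next hour's load.
lags = 24
X = np.array([load[t - lags:t] for t in range(lags, len(load))])
y = load[lags:]

model = make_pipeline(StandardScaler(),
                      SVR(kernel="rbf", C=100.0, epsilon=0.5))
model.fit(X[:-24], y[:-24])                      # hold out the last day
forecast = model.predict(X[-24:])
print("MAPE(%):", 100 * np.mean(np.abs((forecast - y[-24:]) / y[-24:])))
```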

City Gas Pipeline Pressure Prediction Model (도시가스 배관압력 예측모델)

  • Chung, Won Hee;Park, Giljoo;Gu, Yeong Hyeon;Kim, Sunghyun;Yoo, Seong Joon;Jo, Young-do
    • The Journal of Society for e-Business Studies
    • /
    • v.23 no.2
    • /
    • pp.33-47
    • /
    • 2018
  • City gas pipelines are buried underground, which makes them hard to manage and easy to damage. This research proposes a real-time prediction system that helps experts make decisions about pressure anomalies. Gas pipeline pressure data from Jungbu City Gas Company, one of the domestic city gas suppliers, are analysed together with time and environmental variables. In this research, regression models that predict pipeline pressure at the minute level are proposed. Random forest, support vector regression (SVR), and long short-term memory (LSTM) algorithms are used to build the pressure prediction models. A comparison of the models' performances shows that the LSTM model was the best: the LSTM model for Asan-si achieves a root mean square error (RMSE) of 0.011 and a mean absolute percentage error (MAPE) of 0.494, and the LSTM model for Cheonan-si achieves an RMSE of 0.015 and a MAPE of 0.668.
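A hedged sketch of minute-level pressure prediction comparing a random forest and SVR on synthetic data (an LSTM would additionally require a deep learning library, so it is omitted here); the lag-window features and hyperparameters are assumptions, not the study's setup.

```python
# Hedged sketch: comparing random forest and SVR for minute-level pressure prediction.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error, mean_absolute_percentage_error

rng = np.random.default_rng(5)
minutes = np.arange(10_000)
pressure = 2.0 + 0.3 * np.sin(2 * np.pi * minutes / 1440) + 0.02 * rng.standard_normal(len(minutes))

lags = 30                                        # previous 30 minutes as features
X = np.array([pressure[t - lags:t] for t in range(lags, len(pressure))])
y = pressure[lags:]
split = int(0.8 * len(y))

models = {"random forest": RandomForestRegressor(n_estimators=100, random_state=0),
          "SVR": SVR(kernel="rbf", C=10.0, epsilon=0.01)}
for name, m in models.items():
    m.fit(X[:split], y[:split])
    pred = m.predict(X[split:])
    rmse = mean_squared_error(y[split:], pred) ** 0.5
    mape = 100 * mean_absolute_percentage_error(y[split:], pred)
    print(f"{name}: RMSE={rmse:.4f}, MAPE={mape:.3f}%")
```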

Comparison of MLR and SVR Based Linear and Nonlinear Regressions - Compensation for Wind Speed Prediction (MLR 및 SVR 기반 선형과 비선형회귀분석의 비교 - 풍속 예측 보정)

  • Kim, Junbong;Oh, Seungchul;Seo, Kisung
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.65 no.5
    • /
    • pp.851-856
    • /
    • 2016
  • Wind speed fluctuates heavily and is more local than other weather elements, so it is difficult to improve prediction accuracy with a numerical prediction model alone. A Model Output Statistics (MOS) technique is used to correct the systematic errors of the model through statistical data analysis. Most previous MOS work has used a linear regression model for weather prediction, but such models struggle with the irregular nature of wind speed. To address this problem, a nonlinear regression method using Support Vector Regression (SVR) is introduced to develop an MOS for wind speed prediction. Experiments are performed on KLAPS (Korea Local Analysis and Prediction System) re-analysis data from 2007 to 2013 for Jeju Island and the Busan area in South Korea. The MLR- and SVR-based linear and nonlinear methods are compared in terms of wind speed prediction accuracy, and additional comparison experiments examine how performance varies with the number of UM elements.
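A minimal sketch of MOS-style correction comparing multiple linear regression and SVR on synthetic forecasts; the predictor set (model wind, temperature, pressure) and the nonlinear systematic error are illustrative assumptions, not the KLAPS/UM configuration.

```python
# Hedged sketch: correcting numerical-model wind speed with MLR vs. SVR.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(6)
n = 2000
true_wind = np.abs(4 + 2 * rng.standard_normal(n))
# Numerical-model output with a nonlinear systematic error, plus extra predictors.
model_wind = true_wind + 0.5 * np.sin(true_wind) + 0.8 * rng.standard_normal(n)
temp = 15 + 8 * rng.standard_normal(n)
press = 1013 + 5 * rng.standard_normal(n)
X = np.column_stack([model_wind, temp, press])

split = int(0.7 * n)
for name, reg in [("MLR", LinearRegression()),
                  ("SVR", SVR(kernel="rbf", C=10.0, epsilon=0.1))]:
    reg.fit(X[:split], true_wind[:split])
    corrected = reg.predict(X[split:])
    rmse = mean_squared_error(true_wind[split:], corrected) ** 0.5
    print(f"{name} corrected RMSE: {rmse:.3f}")
```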

Two Machine Learning Models for Mobile Phone Battery Discharge Rate Prediction Based on Usage Patterns

  • Chantrapornchai, Chantana;Nusawat, Paingruthai
    • Journal of Information Processing Systems
    • /
    • v.12 no.3
    • /
    • pp.436-454
    • /
    • 2016
  • This research presents battery discharge rate models for the energy consumption of mobile phone batteries based on machine learning, taking into account three usage patterns of the phone: the standby state, video playing, and web browsing. We present the experimental design methodology for data collection, preprocessing, model construction, and parameter selection. The data are collected on the HTC One X hardware platform. We consider various setting factors, such as Bluetooth, brightness, 3G, GPS, Wi-Fi, and Sync. The battery levels for each possible state vector were measured, and the battery prediction models were then constructed using different regression functions based on the collected data. The accuracy of models constructed with the multi-layer perceptron (MLP) and the support vector machine (SVM) was compared across varying kernel functions and various MLP and SVM parameters. Prediction efficiency was measured by the mean absolute error (MAE) and the root mean squared error (RMSE). The experiments showed that the MLP with linear regression performs well overall, while the SVM with the polynomial kernel function based on linear regression gives a low MAE and RMSE. As a result, we were able to demonstrate how to apply the derived model to predict the remaining battery charge.
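A hedged sketch comparing an MLP regressor and SVR kernels for predicting the remaining battery level from a usage-state vector; the state encoding, drain model, and hyperparameters are made-up assumptions rather than the paper's measured HTC One X data.

```python
# Hedged sketch: MLP vs. SVR kernels for battery-level prediction on synthetic usage states.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_absolute_error, mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 1000
# State vector: bluetooth, brightness, 3G, GPS, Wi-Fi, sync, elapsed minutes.
X = np.column_stack([rng.integers(0, 2, n), rng.uniform(0, 1, n),
                     rng.integers(0, 2, n), rng.integers(0, 2, n),
                     rng.integers(0, 2, n), rng.integers(0, 2, n),
                     rng.uniform(0, 180, n)])
drain = 0.2 + 0.1 * X[:, 1] + 0.05 * X[:, 2] + 0.05 * X[:, 4]
battery = np.clip(100 - drain * X[:, 6] + rng.standard_normal(n), 0, 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, battery, random_state=0)
models = {
    "MLP": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(32,), max_iter=3000, random_state=0)),
    "SVR (rbf)": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0)),
    "SVR (poly)": make_pipeline(StandardScaler(), SVR(kernel="poly", degree=2, C=10.0)),
}
for name, m in models.items():
    pred = m.fit(X_tr, y_tr).predict(X_te)
    mae = mean_absolute_error(y_te, pred)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name}: MAE={mae:.2f}, RMSE={rmse:.2f}")
```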