Title/Summary/Keyword: GBDT algorithm


In-situ stresses ring hole measurement of concrete optimized based on finite element and GBDT algorithm

  • Chen Guo; Zheng Yang; Yanchao Yue; Wenxiao Li; Hantao Wu
    • Computers and Concrete / v.34 no.4 / pp.477-487 / 2024
  • The in-situ stresses of concrete are essential indices for assessing the safety performance of concrete structures. Conventional pore pressure release methods often suffer from difficult selection of drilling-ring parameters, uncontrollable stress release, and unstable detection accuracy. In this paper, the parameters affecting the results of the concrete ring-hole stress release method are cross-combined, and finite element simulations of the combined parameters are used to extract stress release values and establish a training set. The GridSearchCV function is utilized to determine the optimal hyperparameters. The mean absolute error (MAE), root mean square error (RMSE), and coefficient of determination (R2) are used as evaluation indices to train the gradient boosting decision tree (GBDT) algorithm, and three other common algorithms are used for comparison. On the test set, the GBDT algorithm achieves an RMSE of 4.499 and an R2 of 0.962, the latter 9.66% higher than the R2 of the best-performing comparison algorithm. The model generated by the GBDT algorithm can accurately calculate concrete in-situ stresses from the drilling-ring parameters and the corresponding stress release values, and it generalizes well.
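
As a rough illustration of the tuning-and-evaluation loop this abstract describes, the sketch below fits a scikit-learn GradientBoostingRegressor with GridSearchCV and scores it with MAE, RMSE, and R2. The synthetic data and parameter grid are placeholders, not the paper's finite-element training set or settings.

```python
# Minimal sketch: tune a GBDT regressor with GridSearchCV, then evaluate
# with MAE, RMSE, and R2. Data and grid values are illustrative only.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

# Stand-in for the finite-element training set: in the paper the features
# would be the cross-combined drilling-ring parameters and stress release values.
rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 4))
y = X @ np.array([30.0, -12.0, 8.0, 5.0]) + rng.normal(scale=2.0, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

search = GridSearchCV(
    GradientBoostingRegressor(random_state=0),
    param_grid={"n_estimators": [100, 300],
                "learning_rate": [0.05, 0.1],
                "max_depth": [2, 3]},
    scoring="neg_root_mean_squared_error",
    cv=5,
)
search.fit(X_train, y_train)

pred = search.best_estimator_.predict(X_test)
print("MAE :", mean_absolute_error(y_test, pred))
print("RMSE:", mean_squared_error(y_test, pred) ** 0.5)
print("R2  :", r2_score(y_test, pred))
```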

Comparative characteristic of ensemble machine learning and deep learning models for turbidity prediction in a river

  • Park, Jungsu
    • Journal of Korean Society of Water and Wastewater / v.35 no.1 / pp.83-91 / 2021
  • Increased turbidity in rivers during flood events affects many aspects of water environmental management, including drinking water supply systems, so the prediction of turbid water is essential. Recently, various advanced machine learning algorithms have been increasingly used in water environmental management. Ensemble machine learning algorithms such as random forest (RF) and the gradient boosting decision tree (GBDT) are among the most popular, along with deep learning algorithms such as recurrent neural networks. In this study, GBDT, an ensemble machine learning algorithm, and the gated recurrent unit (GRU), a recurrent neural network algorithm, are used to develop models that predict turbidity in a river. The observation frequencies of the input data were 2, 4, 8, 24, 48, 120, and 168 h. The root-mean-square error-observations standard deviation ratio (RSR) ranges from 0.182 to 0.766 for GRU and from 0.400 to 0.683 for GBDT. Both models show similar overall prediction accuracy, with an RSR of 0.682 for GRU and 0.683 for GBDT. GRU shows better prediction accuracy when the observation frequency is relatively short (i.e., 2, 4, and 8 h), whereas GBDT shows better prediction accuracy when the observation frequency is relatively long (i.e., 48, 120, and 168 h). The results suggest that the characteristics of the input data should be considered when developing an appropriate turbidity prediction model.
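
The RSR used above is the root-mean-square error of the predictions divided by the standard deviation of the observations (lower is better). A minimal implementation, with made-up numbers rather than the study's turbidity data, might look like this:

```python
# RSR = RMSE of predictions / standard deviation of observations.
import numpy as np

def rsr(observed, predicted):
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    rmse = np.sqrt(np.mean((observed - predicted) ** 2))
    return rmse / np.std(observed)

# Illustrative turbidity-like values only, not the study's data.
obs = np.array([3.1, 4.0, 12.5, 30.2, 8.4])
pred = np.array([2.8, 4.4, 11.0, 28.9, 9.1])
print(round(rsr(obs, pred), 3))
```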

A Model Stacking Algorithm for Indoor Positioning System using WiFi Fingerprinting

  • JinQuan Wang; YiJun Wang; GuangWen Liu; GuiFen Chen
    • KSII Transactions on Internet and Information Systems (TIIS) / v.17 no.4 / pp.1200-1215 / 2023
  • With the development of the IoT and artificial intelligence, location-based services are attracting increasing attention. To address the large errors and poor generalization of current indoor positioning, this paper proposes a model stacking algorithm for an indoor positioning system using WiFi fingerprinting. First, a model stacking method based on Bayesian optimization is adopted to predict the locations of indoor targets, improving indoor localization accuracy and model generalization. Second, taking the position predicted by model stacking as the observation value of a particle filter, collaborative particle filter localization based on the model stacking algorithm is realized. The experimental results show that the algorithm keeps the position error within 2 m, outperforming KNN, GBDT, XGBoost, LightGBM, and RF. The localization accuracy of the fused particle filter algorithm is improved by 31%, and the predicted trajectory is close to the real trajectory. The algorithm can also adapt to application scenarios with fewer wireless access points.
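
The stacking step described above can be sketched with scikit-learn's StackingRegressor. The RSSI features, base learners, and meta-learner below are illustrative assumptions; the paper's Bayesian optimization of the stack and its particle filter fusion are omitted.

```python
# Minimal stacking sketch for fingerprint-based positioning: base learners'
# out-of-fold predictions feed a Ridge meta-learner; MultiOutputRegressor
# handles the two-dimensional (x, y) target. All data are synthetic.
import numpy as np
from sklearn.ensemble import (GradientBoostingRegressor, RandomForestRegressor,
                              StackingRegressor)
from sklearn.linear_model import Ridge
from sklearn.multioutput import MultiOutputRegressor
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(1)
rssi = rng.uniform(-90, -30, size=(400, 6))  # fingerprints from 6 access points
xy = rng.uniform(0, 20, size=(400, 2))       # ground-truth (x, y) positions

stack = MultiOutputRegressor(StackingRegressor(
    estimators=[("knn", KNeighborsRegressor(n_neighbors=5)),
                ("rf", RandomForestRegressor(n_estimators=100, random_state=1)),
                ("gbdt", GradientBoostingRegressor(random_state=1))],
    final_estimator=Ridge(),
))
stack.fit(rssi, xy)
print(stack.predict(rssi[:1]))  # predicted (x, y) for one fingerprint
```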

Real-time prediction on the slurry concentration of cutter suction dredgers using an ensemble learning algorithm

  • Han, Shuai; Li, Mingchao; Li, Heng; Tian, Huijing; Qin, Liang; Li, Jinfeng
    • International conference on construction engineering and project management / 2020.12a / pp.463-481 / 2020
  • Cutter suction dredgers (CSDs) are widely used in dredging constructions such as channel excavation, wharf construction, and reef construction. During CSD construction, the main operation is to control the swing speed of the cutter to keep the slurry concentration in a proper range. However, the slurry concentration cannot be monitored in real time, i.e., there is a "time-lag effect" in the log of slurry concentration, making it difficult for operators to make optimal control decisions. To address this issue, a scheme that uses real-time monitored indicators to predict the current slurry concentration is proposed in this research. The characteristics of the CSD monitoring data are first studied, and a set of preprocessing methods is presented. The concept of an "index class" is then put forward to select the important indices. Finally, an ensemble learning algorithm is set up to fit the relationship between the slurry concentration and the indices of the index classes. In the experiment, log data from seven days of a practical dredging construction are collected. For comparison, the Deep Neural Network (DNN), Long Short-Term Memory (LSTM), Support Vector Machine (SVM), Random Forest (RF), Gradient Boosting Decision Tree (GBDT), and Bayesian Ridge algorithms are tried. The results show that our method has the best performance, with an R2 of 0.886 and a mean square error (MSE) of 5.538. This research provides an effective way to predict the slurry concentration of CSDs in real time and can help improve the stability and production efficiency of dredging construction.
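
A hedged sketch of the final fitting step might look like the following: a GBDT regressor maps monitored indices to slurry concentration and is scored with R2 and MSE, as in the abstract. The synthetic columns stand in for the paper's "index class" features, and its preprocessing pipeline is omitted.

```python
# Minimal sketch: fit a GBDT regressor from monitored indices to slurry
# concentration and report R2 and MSE. All values are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
# Placeholder indices, e.g. cutter swing speed, ladder depth, flow rate.
X = rng.uniform(size=(1000, 5))
y = 20 * X[:, 0] + 10 * X[:, 1] - 5 * X[:, 2] + rng.normal(scale=1.0, size=1000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=2)
model = GradientBoostingRegressor(random_state=2).fit(X_tr, y_tr)
pred = model.predict(X_te)
print("R2 :", r2_score(y_te, pred))
print("MSE:", mean_squared_error(y_te, pred))
```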

Effect of input variable characteristics on the performance of an ensemble machine learning model for algal bloom prediction

  • Kang, Byeong-Koo; Park, Jungsu
    • Journal of Korean Society of Water and Wastewater / v.35 no.6 / pp.417-424 / 2021
  • Algal bloom is an ongoing issue in the management of freshwater systems for drinking water supply, and the chlorophyll-a concentration is commonly used to represent the status of algal bloom. Thus, the prediction of chlorophyll-a concentration is essential for the proper management of water quality. However, chlorophyll-a concentration is affected by various water quality and environmental factors, so its prediction is not an easy task. In recent years, many advanced machine learning algorithms have increasingly been used to develop surrogate models that predict the chlorophyll-a concentration in freshwater systems such as rivers and reservoirs. This study used the light gradient boosting machine (LightGBM), a gradient boosting decision tree algorithm, to develop an ensemble machine learning model for predicting chlorophyll-a concentration. Field water quality data observed at Daecheong Lake, obtained from the real-time water information system in Korea, were used for model development. The data include temperature, pH, electric conductivity, dissolved oxygen, total organic carbon, total nitrogen, total phosphorus, and chlorophyll-a. First, a LightGBM model was developed to predict the chlorophyll-a concentration using the other seven items as independent input variables. Second, the time-lagged values of all the input variables were added as input variables to understand the effect of the time lag of input variables on model performance; the time lag (i) ranged from 1 to 50 days. The model performance was evaluated using three indices: the root mean squared error-observation standard deviation ratio (RSR), the Nash-Sutcliffe coefficient of efficiency (NSE), and the mean absolute error (MAE). The model performed best when a dataset with a one-day time lag (i=1) was added, with RSR, NSE, and MAE of 0.359, 0.871, and 1.510, respectively. Improved model performance was also observed when datasets with time lags of up to about 15 days (i=15) were added.
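
The lag-feature setup this abstract describes might be sketched as below: each input variable is shifted by i days and appended before training a LightGBM regressor on chlorophyll-a. The column names and synthetic series are placeholders, not the Daecheong Lake data.

```python
# Minimal sketch of time-lagged inputs for a LightGBM chlorophyll-a model.
import numpy as np
import pandas as pd
from lightgbm import LGBMRegressor

def add_lagged_features(df, columns, lag_days):
    """Append copies of `columns` shifted by `lag_days` rows (one row = one day)."""
    out = df.copy()
    for col in columns:
        out[f"{col}_lag{lag_days}"] = out[col].shift(lag_days)
    return out.dropna()

# Synthetic daily series standing in for the observed water quality items.
rng = np.random.default_rng(3)
n = 365
df = pd.DataFrame({
    "temp": 15 + 10 * np.sin(np.linspace(0, 2 * np.pi, n)),
    "TN": rng.uniform(1, 3, n),
    "TP": rng.uniform(0.01, 0.1, n),
    "chl_a": rng.uniform(2, 40, n),  # target variable
})

lagged = add_lagged_features(df, ["temp", "TN", "TP"], lag_days=1)  # i = 1
X = lagged.drop(columns=["chl_a"])
y = lagged["chl_a"]
model = LGBMRegressor(random_state=0).fit(X, y)
print(model.predict(X.head(3)))
```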