• Title/Summary/Keyword: Prediction Algorithms

Performance Analysis of Deep Reinforcement Learning for Crop Yield Prediction (작물 생산량 예측을 위한 심층강화학습 성능 분석)

  • Ohnmar Khin;Sung-Keun Lee
    • The Journal of the Korea Institute of Electronic Communication Sciences
    • /
    • v.18 no.1
    • /
    • pp.99-106
    • /
    • 2023
  • Recently, many studies on crop yield prediction using deep learning technology have been conducted. These algorithms have difficulty constructing a linear map between input data sets and crop prediction results, and their implementation depends heavily on the acquired attributes. Deep reinforcement learning can overcome these limitations. This paper analyzes the performance of DQN, Double DQN, and Dueling DQN for improving crop yield prediction. The DQN algorithm suffers from the overestimation problem, whereas Double DQN reduces overestimation and yields better results. The proposed models achieve this by reducing estimation error and increasing prediction accuracy.
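The overestimation the abstract refers to comes from how the bootstrap target is formed. A minimal sketch of the DQN versus Double DQN target computation, using toy NumPy arrays rather than the paper's actual networks or crop data:

```python
import numpy as np

def dqn_target(q_next_target, rewards, gamma=0.99):
    # Standard DQN: one network both selects and evaluates the next
    # action via max(), which tends to overestimate Q-values.
    return rewards + gamma * q_next_target.max(axis=1)

def double_dqn_target(q_next_online, q_next_target, rewards, gamma=0.99):
    # Double DQN: the online network selects the action and the target
    # network evaluates it, which reduces the overestimation bias.
    best = q_next_online.argmax(axis=1)
    return rewards + gamma * q_next_target[np.arange(len(best)), best]

rng = np.random.default_rng(0)
q_online = rng.normal(size=(3, 4))  # toy batch: 3 states, 4 actions
q_target = rng.normal(size=(3, 4))
rewards = np.array([1.0, 0.0, 0.5])
print(dqn_target(q_target, rewards))
print(double_dqn_target(q_online, q_target, rewards))
```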

Optimization of Device Process Parameters for GaAs-AlGaAs Multiple Quantum Well Avalanche Photodiodes Using Genetic Algorithms (유전 알고리즘을 이용한 다중 양자 우물 구조의 갈륨비소 광수신소자 공정변수의 최적화)

  • 김의승;오창훈;이서구;이봉용;이상렬;명재민;윤일구
    • Journal of the Korean Institute of Electrical and Electronic Material Engineers
    • /
    • v.14 no.3
    • /
    • pp.241-245
    • /
    • 2001
  • In this paper, we present a parameter optimization technique for GaAs/AlGaAs multiple quantum well avalanche photodiodes used as the image capture mechanism in high-definition systems. Even in a flawless semiconductor manufacturing environment, random variation in process parameters can cause fluctuations in device performance, so precise modeling of this variation is required for accurate prediction of device performance. This paper first uses experimental design and neural networks to model the nonlinear relationship between device process parameters and device performance parameters. The derived model is then fed into a genetic algorithm to obtain optimized device process parameters. With this optimization technique, we can predict device performance before high-volume manufacturing and increase production efficiency.
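The workflow the abstract describes, a neural-network surrogate model searched by a genetic algorithm, can be sketched as below; the quadratic objective is a hypothetical stand-in for the trained device model, not the paper's actual network:

```python
import numpy as np

rng = np.random.default_rng(42)

def surrogate(params):
    # Hypothetical stand-in for the neural-network device model
    # (smaller is better; optimum at 0.3 for every parameter).
    return np.sum((params - 0.3) ** 2, axis=1)

def genetic_optimize(n_params=4, pop_size=40, generations=100,
                     mutation_scale=0.05):
    pop = rng.uniform(0, 1, size=(pop_size, n_params))
    for _ in range(generations):
        fitness = surrogate(pop)
        # Keep the better half of the population as parents
        parents = pop[np.argsort(fitness)[: pop_size // 2]]
        # Uniform crossover between randomly paired parents
        pairs = rng.integers(0, len(parents), size=(pop_size, 2))
        mask = rng.random((pop_size, n_params)) < 0.5
        children = np.where(mask, parents[pairs[:, 0]], parents[pairs[:, 1]])
        # Gaussian mutation preserves diversity
        pop = children + rng.normal(0, mutation_scale, children.shape)
    return pop[np.argmin(surrogate(pop))]

print(genetic_optimize())  # should approach [0.3, 0.3, 0.3, 0.3]
```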

Ensemble techniques and hybrid intelligence algorithms for shear strength prediction of squat reinforced concrete walls

  • Mohammad Sadegh Barkhordari;Leonardo M. Massone
    • Advances in Computational Design
    • /
    • v.8 no.1
    • /
    • pp.37-59
    • /
    • 2023
  • Squat reinforced concrete (SRC) shear walls are a critical part of both office/residential buildings and nuclear structures due to their significant role in withstanding seismic loads. Despite this, empirical formulae in current design standards and published studies show considerable disparity in predicting SRC wall shear strength. The goal of this research is to develop and evaluate hybrid and ensemble artificial neural network (ANN) models. State-of-the-art population-based algorithms are used for the hybrid intelligence algorithms. Six models are developed: the Honey Badger Algorithm with ANN (HBA-ANN), Hunger Games Search with ANN (HGS-ANN), the fitness-distance balance coyote optimization algorithm with ANN (FDB-COA-ANN), an Averaging Ensemble (AE) neural network, a Snapshot Ensemble (SE) neural network, and a Stacked Generalization (SG) ensemble neural network. A total of 434 SRC wall test results are used to train and assess the models. The results reveal that the SG model not only minimizes prediction variance but also produces predictions (R² = 0.99) superior to those of the other models.
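Of the six models, stacked generalization performed best. As a rough illustration of the idea (the actual 434-wall data set and the paper's architectures are not reproduced here), scikit-learn's StackingRegressor trains a meta-learner on cross-validated predictions of several base networks:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in for the SRC wall data set
X, y = make_regression(n_samples=434, n_features=10, noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Base learners: differently sized neural networks
base = [(f"ann{h}", MLPRegressor(hidden_layer_sizes=(h,), max_iter=3000,
                                 random_state=h))
        for h in (16, 32, 64)]

# The meta-learner (here a ridge regression) combines base predictions
stack = StackingRegressor(estimators=base, final_estimator=Ridge())
stack.fit(X_tr, y_tr)
print("R2:", stack.score(X_te, y_te))
```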

Performance analysis and comparison of various machine learning algorithms for early stroke prediction

  • Vinay Padimi;Venkata Sravan Telu;Devarani Devi Ningombam
    • ETRI Journal
    • /
    • v.45 no.6
    • /
    • pp.1007-1021
    • /
    • 2023
  • Stroke is the leading cause of permanent disability in adults, and it can cause permanent brain damage. According to the World Health Organization, 795,000 Americans experience a new or recurrent stroke each year. Early detection of medical disorders such as strokes can minimize their disabling effects. Thus, in this paper, we apply machine learning algorithms, for example, the decision tree, random forest, and naive Bayes algorithms, to patient characteristics survey data covering various risk factors that contribute to the occurrence of stroke, in order to achieve high prediction accuracy. We also consider the semisupervised self-training technique to predict the risk of stroke, as well as the near-miss undersampling technique, which balances the classes by keeping only the instances of the larger class that are closest to the smaller class. Experimental results demonstrate that the proposed method obtains an accuracy of approximately 98.83% at low cost, which is significantly higher and more reliable than the compared techniques.
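The near-miss idea, balancing classes by keeping only the majority-class instances closest to the minority class, is available in the imbalanced-learn package. A minimal sketch on synthetic data (not the paper's survey data or exact pipeline):

```python
from collections import Counter

from imblearn.under_sampling import NearMiss
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Synthetic imbalanced data standing in for the patient survey data
X, y = make_classification(n_samples=2000, weights=[0.95, 0.05], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

print("before:", Counter(y_tr))
X_res, y_res = NearMiss(version=1).fit_resample(X_tr, y_tr)
print("after:", Counter(y_res))  # classes are now balanced

clf = GaussianNB().fit(X_res, y_res)
print("accuracy:", clf.score(X_te, y_te))
```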

A study on applying random forest and gradient boosting algorithm for Chl-a prediction of Daecheong lake (대청호 Chl-a 예측을 위한 random forest와 gradient boosting 알고리즘 적용 연구)

  • Lee, Sang-Min;Kim, Il-Kyu
    • Journal of Korean Society of Water and Wastewater
    • /
    • v.35 no.6
    • /
    • pp.507-516
    • /
    • 2021
  • In this study, machine learning, which has recently been widely used in prediction algorithms, was applied. The study site was the CD (Chudong) point, a representative point of Daecheong Lake, and chlorophyll-a (Chl-a) concentration was used as the target variable for algae prediction. To predict the Chl-a concentration, a data set of water quality and quantity factors was constructed, and random forest and gradient boosting algorithms were implemented in Python. Before running the algorithms, a correlation analysis between Chl-a and the water quality and quantity data was performed, and the ten factors of highest importance were extracted. On the performance indices, gradient boosting achieved an RMSE of 2.72 mg/m³, an MSE of 7.40, and an R² of 0.66, and the residual analysis also favored gradient boosting. The gradient boosting algorithm remained the best after hyperparameter tuning, with an RMSE of 2.44 mg/m³.
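A minimal sketch of the gradient boosting step with scikit-learn, on synthetic data standing in for the ten selected water quality and quantity factors (the Daecheong Lake data are not reproduced here):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the ten selected factors and the Chl-a target
X, y = make_regression(n_samples=500, n_features=10, noise=3.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05,
                                  max_depth=3, random_state=0)
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
print("RMSE:", np.sqrt(mean_squared_error(y_te, pred)))
print("R2:", r2_score(y_te, pred))
# Impurity-based importances can rank the input factors
print(model.feature_importances_.round(3))
```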

Comparative Analysis of Machine Learning Algorithms for Healthy Management of Collaborative Robots (협동로봇의 건전성 관리를 위한 머신러닝 알고리즘의 비교 분석)

  • Kim, Jae-Eun;Jang, Gil-Sang;Lim, KuK-Hwa
    • Journal of the Korea Safety Management & Science
    • /
    • v.23 no.4
    • /
    • pp.93-104
    • /
    • 2021
  • In this paper, we propose a method for diagnosing the overload and working load of collaborative robots through a performance analysis of machine learning algorithms. To this end, a pick-and-place experiment was conducted while varying the payload weight of a collaborative robot with a 10 kg payload capacity. Motor torque, position, and speed data generated by the robot controller were collected, and t-tests and F-tests revealed different characteristics for each weight relative to the 10 kg payload. To predict overload and working load from the collected data, machine learning algorithms including neural network, decision tree, random forest, and gradient boosting models were tested. The neural network showed the best prediction and classification performance, with explanatory power above 99.6%. The practical contribution of this study is that it suggests a way to collect the data required for analysis from the robot without attaching additional sensors, and demonstrates the usefulness of machine learning algorithms for diagnosing robot overload and working load.
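A sketch of the four-model comparison the abstract describes, with synthetic features standing in for the collected torque, position, and speed data:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for torque/position/speed features per payload class
X, y = make_classification(n_samples=600, n_features=9, n_informative=6,
                           n_classes=3, random_state=0)

models = {
    "neural_network": MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000,
                                    random_state=0),
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "random_forest": RandomForestClassifier(random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f}")
```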

Early Diagnosis of Anxiety Disorder Using Artificial Intelligence

  • Choi DongOun;Huan-Meng;Yun-Jeong, Kang
    • International Journal of Advanced Culture Technology
    • /
    • v.12 no.1
    • /
    • pp.242-248
    • /
    • 2024
  • Contemporary societal and environmental transformations coincide with the emergence of novel mental health challenges. Anxiety disorder, a chronic and highly debilitating illness, presents with diverse clinical manifestations. Epidemiological investigations indicate a global prevalence of 5%, with an additional 10% exhibiting subclinical symptoms; notably, 9% of adolescents demonstrate clinical features. Untreated, anxiety disorder exerts profound detrimental effects on individuals, families, and the broader community. It is therefore very meaningful to predict anxiety disorder with machine learning analysis models, and the main content of this paper is the analysis of anxiety disorder prediction models built with machine learning algorithms. Machine learning aims to use computers to simulate human learning activities: locating existing knowledge, acquiring new knowledge, and continuously improving performance through self-improvement. This article analyzes the relevant theories and characteristics of machine learning algorithms and integrates them into anxiety disorder prediction analysis. The final results of the study show that the artificial neural network model has the largest AUC, reaching 0.8255, indicating that it outperforms the other two models in prediction accuracy. In terms of running time, all three models run in less than 1 second, which is within the acceptable range.
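The paper's headline comparison is by AUC. A minimal sketch of computing AUC for several classifiers on synthetic data; the abstract names only the neural network, so the other two models here are assumed stand-ins:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=12, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "neural_network": MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                    random_state=0),
    "logistic_regression": LogisticRegression(max_iter=1000),  # assumed stand-in
    "decision_tree": DecisionTreeClassifier(max_depth=4, random_state=0),  # assumed
}
for name, model in models.items():
    proba = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    print(name, round(roc_auc_score(y_te, proba), 4))
```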

Software Vulnerability Prediction System Using Machine Learning Algorithm (기계학습 알고리즘을 이용한 소프트웨어 취약 여부 예측 시스템)

  • Choi, Minjun;Kim, Juhwan;Yun, Joobeom
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.28 no.3
    • /
    • pp.635-642
    • /
    • 2018
  • In the era of the Fourth Industrial Revolution, we are surrounded by huge amounts of software. However, as software increases, software vulnerabilities also increase, so it is important to detect and remove them. Many studies have attempted to predict and detect software security problems, but detection takes a long time and prediction accuracy is not high. Therefore, in this paper, we describe a method for efficiently predicting software vulnerabilities using machine learning algorithms, and we compare various machine learning algorithms through experiments. Experimental results show that the k-nearest neighbors prediction model achieves the highest prediction rate.
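A minimal sketch of the winning k-nearest neighbors approach; the features here are synthetic, since the paper's extraction of features from software is not reproduced:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for vulnerability features extracted from software
X, y = make_classification(n_samples=800, n_features=20, random_state=0)

# Sweep the neighborhood size and report cross-validated accuracy
for k in (1, 3, 5, 7):
    scores = cross_val_score(KNeighborsClassifier(n_neighbors=k), X, y, cv=5)
    print(f"k={k}: {scores.mean():.3f}")
```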

GA-optimized Support Vector Regression for an Improved Emotional State Estimation Model

  • Ahn, Hyunchul;Kim, Seongjin;Kim, Jae Kyeong
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.8 no.6
    • /
    • pp.2056-2069
    • /
    • 2014
  • In order to implement interactive and personalized Web services properly, it is necessary to understand the tangible and intangible responses of the users and to recognize their emotional states. Recently, some studies have attempted to build emotional state estimation models based on facial expressions. Most of these studies have applied multiple regression analysis (MRA), artificial neural networks (ANN), and support vector regression (SVR) as the prediction algorithm, but the prediction accuracies have been relatively low. In order to improve the prediction performance of the emotion prediction model, we propose a novel SVR model that is optimized using a genetic algorithm (GA). Our proposed algorithm, GASVR, is designed to optimize the kernel parameters and the feature subsets of SVRs in order to predict the levels of two aspects of the users' emotions: valence and arousal. To validate the usefulness of GASVR, we collected a real-world data set of facial responses and emotional states via a survey, and applied GASVR and other algorithms, including MRA, ANN, and conventional SVR, to the data set. We found that GASVR outperformed all of the comparative algorithms in predicting the valence and arousal levels.
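A compact sketch of the GASVR idea, a genetic algorithm searching SVR kernel parameters and a binary feature mask at once; the data and fitness details are stand-ins, not the paper's facial-response data set:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X, y = make_regression(n_samples=200, n_features=8, noise=10.0, random_state=0)

def fitness(genome):
    # genome = [log10(C), log10(gamma), 8 feature-mask genes]
    C, gamma = 10 ** genome[0], 10 ** genome[1]
    feats = genome[2:] > 0.5
    if not feats.any():
        return -np.inf
    return cross_val_score(SVR(C=C, gamma=gamma), X[:, feats], y, cv=3).mean()

pop = np.column_stack([rng.uniform(-1, 3, 30), rng.uniform(-4, 0, 30),
                       rng.random((30, X.shape[1]))])
for _ in range(15):
    scores = np.array([fitness(g) for g in pop])
    parents = pop[np.argsort(scores)[-10:]]          # keep the 10 fittest
    pairs = rng.integers(0, 10, size=(30, 2))
    mask = rng.random((30, pop.shape[1])) < 0.5      # uniform crossover
    pop = np.where(mask, parents[pairs[:, 0]], parents[pairs[:, 1]])
    pop[:, :2] += rng.normal(0, 0.1, (30, 2))        # mutate kernel genes
best = pop[np.argmax([fitness(g) for g in pop])]
print("best log10(C), log10(gamma):", best[:2].round(2))
print("selected features:", np.flatnonzero(best[2:] > 0.5))
```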

A Tunnel Ventilation Control Algorithm by Using CO Density Prediction Algorithm (일산화탄소 농도 예측 기능을 사용한 터널 환기 제어 알고리즘)

  • Han Doyoung;Yoon Jinwon
    • Korean Journal of Air-Conditioning and Refrigeration Engineering
    • /
    • v.16 no.11
    • /
    • pp.1035-1043
    • /
    • 2004
  • For a long road tunnel, a ventilation system may be used to keep the pollution level below the required limit. To control the tunnel pollution level, a closed-loop control algorithm may be used. A feedforward prediction algorithm and a cascade control algorithm were developed to regulate the CO level in a tunnel. The feedforward prediction algorithm was composed of a traffic estimation algorithm and a CO density prediction algorithm, and the cascade control algorithm was composed of a jet fan control algorithm and an air velocity setpoint algorithm. The control algorithms were verified with dynamic models developed from actual tunnel data. The simulation results showed that the control algorithms developed in this study were effective for controlling the tunnel ventilation system.
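The structure the abstract describes, a feedforward CO prediction feeding a cascaded setpoint and fan loop, might be sketched as follows; every model and constant here is hypothetical, for illustration only:

```python
def predict_co(traffic_count, emission_per_vehicle=0.4,
               ventilation_factor=0.1, air_velocity=2.0):
    """Feedforward step: estimate CO density (ppm) from estimated
    traffic (hypothetical model, not the paper's)."""
    return traffic_count * emission_per_vehicle / (
        1.0 + ventilation_factor * air_velocity)

def velocity_setpoint(co_predicted, co_limit=25.0, base_velocity=2.0, gain=0.3):
    """Outer cascade loop: raise the air-velocity setpoint when the
    predicted CO density exceeds the limit."""
    return base_velocity + gain * max(0.0, co_predicted - co_limit)

def jet_fans_required(v_setpoint, v_measured, fans_per_mps=2):
    """Inner cascade loop: choose how many jet fans to run to close
    the air-velocity error."""
    return round(max(0.0, v_setpoint - v_measured) * fans_per_mps)

co = predict_co(traffic_count=120)
v_sp = velocity_setpoint(co)
print(co, v_sp, jet_fans_required(v_sp, v_measured=1.5))
```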