• Title/Summary/Keyword: MachineLearning

Prediction of compressive strength of sustainable concrete using machine learning tools

  • Lokesh Choudhary;Vaishali Sahu;Archanaa Dongre;Aman Garg
    • Computers and Concrete
    • /
    • v.33 no.2
    • /
    • pp.137-145
    • /
    • 2024
  • Experimentally determining concrete's compressive strength for a given mix design is time-consuming and difficult. The goal of the current work is to propose the best-performing predictive model, based on machine learning algorithms such as Gradient Boosting Machine (GBM), Stacked Ensemble (SE), Distributed Random Forest (DRF), Extremely Randomized Trees (XRT), Generalized Linear Model (GLM), and Deep Learning (DL), that can forecast the compressive strength of a ternary geopolymer concrete mix without carrying out any experimental procedure. A geopolymer mix uses supplementary cementitious materials obtained as industrial by-products instead of cement. The input variables used for assessing the best machine learning algorithm include not only individual ingredient quantities but also the molarity of the alkali activator and the age at testing. Based on a range of statistical parameters used to measure how well the models forecast the compressive strength of the ternary geopolymer concrete mix, GBM was found to perform better than all the other algorithms. A sensitivity analysis carried out towards the end of the study suggests that the GBM model predicts results close to the experimental values, with an accuracy between 95.6% and 98.2% for the testing and training datasets.
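A minimal Python sketch of the kind of gradient-boosting regression the abstract describes, with scikit-learn's GradientBoostingRegressor standing in for the paper's GBM; the file name and feature layout are hypothetical, not the paper's dataset:

```python
# Illustrative sketch only: scikit-learn's GradientBoostingRegressor stands in for
# the paper's GBM; the file name and columns below are hypothetical.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_absolute_error

# Hypothetical columns: ingredient quantities plus activator molarity and age at testing.
df = pd.read_csv("ternary_geopolymer_mixes.csv")  # placeholder file name
X = df.drop(columns=["compressive_strength"])
y = df["compressive_strength"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

gbm = GradientBoostingRegressor(n_estimators=500, learning_rate=0.05, max_depth=3)
gbm.fit(X_train, y_train)

pred = gbm.predict(X_test)
print("R2:", r2_score(y_test, pred), "MAE:", mean_absolute_error(y_test, pred))
```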

A Study on Machine Learning Compiler and Modulo Scheduler (머신러닝 컴파일러와 모듈로 스케쥴러에 관한 연구)

  • Doosan Cho
    • Journal of the Korean Society of Industry Convergence
    • /
    • v.27 no.1
    • /
    • pp.87-95
    • /
    • 2024
  • This study concerns modulo scheduling algorithms for multicore processors in machine learning applications. Machine learning algorithms perform a large number of vector and matrix operations in order to quickly process large data streams. To support this amount of computation, processor architectures for applications such as artificial intelligence, neural networks, and machine learning are designed as parallel-processing architectures such as multicore. To effectively utilize these multicore hardware resources, various compiler techniques are being used and studied. Among these compiler techniques, this study analyzes the modulo scheduler, which is especially important for a single core's computation pipeline. The paper examines and compares the iterative modulo scheduler and the swing modulo scheduler, which are the most widely used and studied. Both schedulers provided similar performance results; when register pressure was measured as an indicator, the swing modulo scheduler provided slightly better performance. A technique that splits recurrence edges is also proposed to improve the minimum initiation interval of the modulo schedulers.
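For readers unfamiliar with modulo scheduling, the sketch below illustrates the two standard lower bounds on the initiation interval (the resource-constrained and recurrence-constrained MII) that both the iterative and swing schedulers start from; the loop and machine model are invented for illustration and are not taken from the paper:

```python
# Toy illustration of the two lower bounds on the initiation interval (II) that
# modulo schedulers (iterative or swing) start from; not the paper's implementation.
import math

def res_mii(op_counts, num_units):
    # Resource-constrained MII: for each functional-unit class, ops needed / units available.
    return max(math.ceil(op_counts[u] / num_units[u]) for u in op_counts)

def rec_mii(cycles):
    # Recurrence-constrained MII: for each dependence cycle, total latency / total distance.
    return max(math.ceil(latency / distance) for latency, distance in cycles)

# Hypothetical loop: 6 ALU ops on 2 ALUs, 4 loads on 1 load/store unit.
ops   = {"alu": 6, "mem": 4}
units = {"alu": 2, "mem": 1}
# One recurrence cycle with total latency 7 carried over a loop distance of 2.
cycles = [(7, 2)]

mii = max(res_mii(ops, units), rec_mii(cycles))
print("MII =", mii)  # the scheduler tries II = MII, MII+1, ... until a schedule fits
```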

A Prediction Triage System for Emergency Department During Hajj Period using Machine Learning Models

  • Huda N. Alhazmi
    • International Journal of Computer Science & Network Security
    • /
    • v.24 no.7
    • /
    • pp.11-23
    • /
    • 2024
  • Triage is the practice of accurately prioritizing patients in the emergency department (ED) based on their medical condition so that they receive the appropriate treatment. Variation in triage assessment among medical staff can cause mis-triage, which affects patients negatively. Developing an ED triage system based on machine learning (ML) techniques can lead to accurate and efficient triage outcomes. This study aims to develop a triage system using machine learning techniques to predict ED triage levels from patient information. We conducted a retrospective study using Security Forces Hospital ED data from 2021 through 2023 during the Hajj period in Saudi Arabia. Using demographics, vital signs, and chief complaints as predictors, two machine learning models were investigated, namely a gradient boosted decision tree (XGB) and a deep neural network (DNN). The models were trained to predict ED triage levels, and their predictive performance was evaluated using the area under the receiver operating characteristic curve (AUC) and the confusion matrix. A total of 11,584 ED visits were collected and used in this study. The XGB and DNN models exhibited strong predictive performance, with AUC-ROC scores of 0.85 and 0.82, respectively. Compared to the traditional approach, the proposed system demonstrated better performance and can be implemented in real-world clinical settings. Utilizing ML applications can support triage decision-making, clinical care, and resource utilization.
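A hedged sketch of the gradient-boosted side of such a triage predictor, using scikit-learn in place of the paper's XGB/DNN pipeline; the file name, column names, and preprocessing are assumptions:

```python
# Illustrative sketch of a gradient-boosted triage classifier and AUC evaluation;
# the data file, columns, and preprocessing are assumptions, not the paper's pipeline.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, confusion_matrix

df = pd.read_csv("ed_visits.csv")  # hypothetical: demographics, vital signs, chief complaint
X = pd.get_dummies(df.drop(columns=["triage_level"]))  # one-hot encode categorical fields
y = df["triage_level"]

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=42)

clf = GradientBoostingClassifier().fit(X_train, y_train)

proba = clf.predict_proba(X_test)
print("AUC (one-vs-rest):", roc_auc_score(y_test, proba, multi_class="ovr"))
print(confusion_matrix(y_test, clf.predict(X_test)))
```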

Analysis of Female Lower Body Shapes for the Development of Slacks Patterns: Exploring Body Clusters Using Machine Learning

  • Ji Min Kim
    • International Journal of Advanced Culture Technology
    • /
    • v.12 no.3
    • /
    • pp.434-440
    • /
    • 2024
  • SIZE KOREA updates body measurement data every five years, providing essential information for the fashion industry. This anthropometric data is widely used to diagnose consumer body shapes and develop optimal clothing sizes. Artificial intelligence, particularly machine learning, excels at predicting such body shape classifications. This study seeks to enhance the suitability of clothing design by applying machine learning techniques as a new analytical methodology to better capture and classify the unique body shapes of Korean women. Machine learning techniques such as K-means clustering, silhouette analysis, and decision tree analysis were used to classify the lower body shapes of Korean women in their twenties and to identify standard body shapes useful for slacks design. The results showed that the lower bodies of this age group could be classified into three categories: 'small stature' (the majority), 'tall with average lower body volume,' and 'medium height with a fuller lower body' (the smallest share). The three-cluster solution was validated through silhouette analysis, which minimizes misclassification. Decision tree analysis then defined the criteria for these clusters, highlighting waist height and hip depth as the most significant factors and achieving a classification accuracy of 90.6%. While this study is not directly related to Robotic Process Automation, its detailed analysis of body shapes for slacks patterns can aid RPA in clothing production. Future research should continue integrating machine learning into human body and fashion design studies.
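A minimal sketch of the three-step analysis described above (K-means clustering, silhouette check, decision tree); the measurement file and column names are hypothetical, not the SIZE KOREA data:

```python
# Minimal sketch of the three-step workflow (K-means, silhouette, decision tree);
# the data file and measurement columns are hypothetical.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import silhouette_score
from sklearn.tree import DecisionTreeClassifier

df = pd.read_csv("lower_body_measurements.csv")  # e.g. waist height, hip depth, ...
X = StandardScaler().fit_transform(df)

# Cluster into three body-shape groups and check the split with the silhouette score.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("silhouette:", silhouette_score(X, kmeans.labels_))

# A decision tree then exposes which measurements best separate the clusters.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, kmeans.labels_)
print(dict(zip(df.columns, tree.feature_importances_.round(3))))
```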

Estimation of Software Reliability with Immune Algorithm and Support Vector Regression (면역 알고리즘 기반의 서포트 벡터 회귀를 이용한 소프트웨어 신뢰도 추정)

  • Kwon, Ki-Tae;Lee, Joon-Kil
    • Journal of Information Technology Services
    • /
    • v.8 no.4
    • /
    • pp.129-140
    • /
    • 2009
  • Accurate estimation of software reliability is important for successful development in software engineering. To date, models using regression analysis based on statistical algorithms and machine learning methods have been used. This paper estimates software reliability using support vector regression, a machine learning technique. It also finds the best set of optimized parameters by applying an immune algorithm, varying the number of generations, memory cells, and alleles. The proposed IA-SVR model outperforms some recent results reported in the literature.
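A rough sketch of SVR-based reliability estimation; note that the paper's immune-algorithm parameter search is replaced here by a plain randomized search purely for illustration, and the failure data are synthetic:

```python
# Sketch of SVR-based reliability estimation; the paper's immune-algorithm parameter
# search is swapped for a plain randomized search, and the failure data are synthetic.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import RandomizedSearchCV
from scipy.stats import loguniform

# Synthetic failure data: cumulative failures observed at successive test times.
t = np.arange(1, 51).reshape(-1, 1).astype(float)
failures = np.cumsum(np.random.default_rng(0).poisson(3, size=50)).astype(float)

search = RandomizedSearchCV(
    SVR(kernel="rbf"),
    {"C": loguniform(1e-1, 1e3), "gamma": loguniform(1e-3, 1e1), "epsilon": loguniform(1e-3, 1e0)},
    n_iter=30, cv=5, random_state=0,
)
search.fit(t, failures)
print("best params:", search.best_params_)
print("predicted cumulative failures at t=55:", search.best_estimator_.predict([[55.0]]))
```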

BEGINNER'S GUIDE TO NEURAL NETWORKS FOR THE MNIST DATASET USING MATLAB

  • Kim, Bitna;Park, Young Ho
    • Korean Journal of Mathematics
    • /
    • v.26 no.2
    • /
    • pp.337-348
    • /
    • 2018
  • The MNIST dataset is a database of images of handwritten digits, with each image labeled by an integer from 0 to 9. It is used to benchmark the performance of machine learning algorithms. Neural networks for MNIST are regarded as the starting point for studying machine learning algorithms. However, it is not easy to get started with the actual programming. In this expository article, we give step-by-step instructions for building neural networks for the MNIST dataset using MATLAB.
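The article's step-by-step build is in MATLAB; as a rough NumPy analogue (trained on random stand-in data rather than MNIST itself), a one-hidden-layer network with softmax output might look like:

```python
# NumPy analogue of a one-hidden-layer network for 28x28 digit images; random
# stand-in data is used here instead of the actual MNIST files.
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((64, 784))                       # stand-in for a batch of flattened images
y = rng.integers(0, 10, size=64)                # stand-in labels 0..9

W1, b1 = rng.normal(0, 0.01, (784, 128)), np.zeros(128)
W2, b2 = rng.normal(0, 0.01, (128, 10)), np.zeros(10)

for step in range(100):                         # plain batch gradient descent
    h = np.maximum(X @ W1 + b1, 0)              # ReLU hidden layer
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)           # softmax probabilities
    loss = -np.log(p[np.arange(64), y]).mean()  # cross-entropy loss

    d_logits = (p - np.eye(10)[y]) / 64         # gradient of loss w.r.t. logits
    dW2, db2 = h.T @ d_logits, d_logits.sum(0)
    dh = (d_logits @ W2.T) * (h > 0)
    dW1, db1 = X.T @ dh, dh.sum(0)
    for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        param -= 0.5 * grad                     # in-place SGD update

print("final training loss:", round(loss, 3))
```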

A Kernel Approach to Discriminant Analysis for Binary Classification

  • Shin, Yang-Kyu
    • Journal of the Korean Data and Information Science Society
    • /
    • v.12 no.2
    • /
    • pp.83-93
    • /
    • 2001
  • We investigate a kernel approach to discriminant analysis for binary classification from a machine learning point of view. Our view of the kernel approach follows the support vector method, one of the most promising techniques in the area of machine learning. As with ordinary discriminant analysis, the kernel method can determine the class to which an object most likely belongs. Moreover, it has some advantages over ordinary discriminant analysis, such as data compression and reduced computing time.
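As an illustration of the general idea (not the paper's exact formulation), a compact kernel Fisher discriminant on toy two-class data might look like:

```python
# Compact NumPy sketch of a kernel (RBF) Fisher discriminant for binary classification,
# on toy Gaussian data; illustrative only, not the paper's formulation.
import numpy as np

def rbf(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X0 = rng.normal(-1.0, 1.0, (40, 2))      # class 0 samples (toy data)
X1 = rng.normal(+1.0, 1.0, (40, 2))      # class 1 samples
X = np.vstack([X0, X1])

K = rbf(X, X)
K0, K1 = K[:, :40], K[:, 40:]            # kernel columns split by class
M0, M1 = K0.mean(1), K1.mean(1)          # kernelized class means

# Within-class scatter in the kernel-induced space, plus a small ridge for stability.
N = sum(Ki @ (np.eye(40) - np.full((40, 40), 1 / 40)) @ Ki.T for Ki in (K0, K1))
N += 1e-3 * np.eye(80)

alpha = np.linalg.solve(N, M1 - M0)      # discriminant coefficients

def score(Z):                            # projection of new points onto the discriminant
    return rbf(Z, X) @ alpha

threshold = 0.5 * (score(X0).mean() + score(X1).mean())
pred = (score(X) > threshold).astype(int)
labels = np.array([0] * 40 + [1] * 40)
print("training accuracy:", (pred == labels).mean())
```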

Machine Learning in FET-based Chemical and Biological Sensors: A Mini Review

  • Ahn, Jae-Hyuk
    • Journal of Sensor Science and Technology
    • /
    • v.30 no.1
    • /
    • pp.1-9
    • /
    • 2021
  • This mini review summarizes some of the recent advances in machine-learning (ML)-driven chemical and biological sensors. Specific focus is on field-effect-transistor (FET)-based sensors with a description of their structures and detection mechanisms. Key ML techniques are briefly reviewed for an audience not familiar with the basic principles. We mainly discuss two aspects: (1) data analysis based on ML and (2) ML applied to sensor design. In conclusion, the challenges and opportunities for the advancement of ML-based sensors are briefly considered.