• Title/Summary/Keyword: Machine Learning Technology

Search Result 1,763

Review on Applications of Machine Learning in Coastal and Ocean Engineering

  • Kim, Taeyoon;Lee, Woo-Dong
    • Journal of Ocean Engineering and Technology / v.36 no.3 / pp.194-210 / 2022
  • Recently, analysis methods using machine learning to solve problems in coastal and ocean engineering have attracted attention. Machine learning models are effective modeling tools for predicting specific parameters by learning complex relationships from a given dataset. In coastal and ocean engineering, various studies have predicted dependent variables such as wave parameters, tides, storm surges, design parameters, and shoreline fluctuations. Here, we introduce and describe trends in the application of machine learning models in coastal and ocean engineering. Across many studies, machine learning models prove an effective alternative to data-intensive, time-consuming fluid-dynamics and numerical models, and they can be successfully applied to a wide range of problems in the field. However, to achieve accurate predictions, model development must be accompanied by data preprocessing and cost calculation. Furthermore, applicability to various systems and quantifiable evaluation of uncertainty should be considered.
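The kind of data-driven surrogate the review describes can be sketched in a few lines. The following is a minimal illustration only, not the review's method: the predictors (wind speed, wind duration, fetch length) and the synthetic "significant wave height" target are hypothetical stand-ins for real coastal observations.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical predictors: wind speed (m/s), wind duration (h), fetch length (km)
X = rng.uniform([2, 1, 10], [25, 48, 500], size=(500, 3))
# Synthetic "significant wave height" with a nonlinear dependence plus noise
y = 0.02 * X[:, 0] ** 1.5 * np.sqrt(X[:, 2] / 100) + 0.01 * X[:, 1] + rng.normal(0, 0.05, 500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)
mae = mean_absolute_error(y_test, model.predict(X_test))
```

Once trained, such a surrogate evaluates in milliseconds, which is the appeal over rerunning a hydrodynamic simulation for every input combination.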

Selecting the Optimal Hidden Layer of Extreme Learning Machine Using Multiple Kernel Learning

  • Zhao, Wentao;Li, Pan;Liu, Qiang;Liu, Dan;Liu, Xinwang
    • KSII Transactions on Internet and Information Systems (TIIS) / v.12 no.12 / pp.5765-5781 / 2018
  • Extreme learning machine (ELM) is emerging as a powerful machine learning method in a variety of application scenarios due to its promising advantages of high accuracy, fast learning speed, and ease of implementation. However, how to select the optimal hidden layer of ELM is still an open question in the ELM community. In particular, the number of hidden-layer nodes is a sensitive hyperparameter that significantly affects the performance of ELM. To address this challenging problem, we propose to adopt multiple kernel learning (MKL) to design a multi-hidden-layer-kernel ELM (MHLK-ELM). Specifically, we first integrate kernel functions with the random feature mapping of ELM to design a hidden-layer-kernel ELM (HLK-ELM), which serves as the base of MHLK-ELM. Then, we utilize the MKL method to propose two versions of MHLK-ELM, called sparse and non-sparse MHLK-ELMs. Both types of MHLK-ELM can effectively identify the optimal linear combination of multiple HLK-ELMs for different classification and regression problems. Experimental results on seven data sets, of which three concern classification and four concern regression, demonstrate that the proposed MHLK-ELM achieves superior performance compared with conventional ELM and the basic HLK-ELM.
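The basic ELM that the MHLK-ELM builds on is simple enough to sketch directly: input weights are drawn at random and never trained, and only the output layer is solved in closed form. This is a minimal NumPy illustration of plain ELM (not the paper's kernelized variant), with a toy sine-fitting task as the data.

```python
import numpy as np

rng = np.random.default_rng(42)

def elm_fit(X, y, n_hidden=50, reg=1e-3):
    """Fit a basic ELM: random hidden weights, least-squares output weights."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # random, never trained
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                        # random feature mapping
    # Ridge-regularized least squares for the output layer only
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy regression problem: recover y = sin(x)
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X).ravel()
W, b, beta = elm_fit(X, y)
rmse = np.sqrt(np.mean((elm_predict(X, W, b, beta) - y) ** 2))
```

The sensitivity the abstract mentions is visible here: changing `n_hidden` changes the random feature basis, which is exactly the hyperparameter the MKL combination is designed to sidestep.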

Android Malware Detection using Machine Learning Techniques KNN-SVM, DBN and GRU

  • Sk Heena Kauser;V.Maria Anu
    • International Journal of Computer Science & Network Security / v.23 no.7 / pp.202-209 / 2023
  • Android malware is now on the rise because of the growing popularity of the Android operating system. Machine learning models can classify unknown Android malware using characteristics gathered from the dynamic and static analysis of Android applications. Anti-virus software, by contrast, simply scans a program for known virus signatures: vendors maintain large databases of existing virus and malware signatures and check each file against them. The proposed model aims to provide a machine learning method that addresses the inability of signature-based Android malware detection to catch previously unseen malicious apps, improving phone users' security and privacy. The system tracks numerous permission-based characteristics and events collected from Android apps and analyzes them with a classifier model to determine whether a program is goodware or malware. The machine learning techniques KNN-SVM, DBN, and GRU were applied, yielding accuracies of 87.20% for KNN, 91.40% for SVM, 85.10% for Naive Bayes, and 97.90% for DBN-GRU. Furthermore, this paper employs only standard machine learning techniques; in future work, we will attempt to improve these algorithms in order to develop a better detection method.
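The permission-based classification idea can be illustrated with a small sketch. Everything below is synthetic: the binary "permission" matrix and the labeling rule are invented stand-ins, not the paper's dataset, and only the KNN and SVM baselines are shown.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Hypothetical binary permission features (e.g. SEND_SMS, READ_CONTACTS, ...)
n_apps, n_perms = 400, 20
X = rng.integers(0, 2, size=(n_apps, n_perms))
# Synthetic label: "malware" tends to request more of the first 5 (sensitive) permissions
y = (X[:, :5].sum(axis=1) + rng.normal(0, 0.5, n_apps) > 2.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
knn_acc = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train).score(X_test, y_test)
svm_acc = SVC(kernel="rbf").fit(X_train, y_train).score(X_test, y_test)
```

In practice the feature vector would come from static analysis of the APK manifest plus dynamic event logs, but the classifier interface is the same.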

Limiting conditions prediction using machine learning for loss of condenser vacuum event

  • Dong-Hun Shin;Moon-Ghu Park;Hae-Yong Jeong;Jae-Yong Lee;Jung-Uk Sohn;Do-Yeon Kim
    • Nuclear Engineering and Technology / v.55 no.12 / pp.4607-4616 / 2023
  • We implement machine learning regression models to predict the peak pressures of the primary and secondary systems, a major safety concern in the Loss Of Condenser Vacuum (LOCV) accident. We selected the Multi-dimensional Analysis of Reactor Safety-KINS standard (MARS-KS) code to analyze the LOCV accident; the reference plant is the Korean Optimized Power Reactor 1000 MWe (OPR1000). eXtreme Gradient Boosting (XGBoost) is chosen as the machine learning tool. The MARS-KS code is used to generate LOCV accident data, which is then applied to train the machine learning model, and hyperparameter optimization is performed using simulated annealing. Randomly generated combinations of initial conditions within the operating range are fed to the XGBoost model to predict the peak pressure, and the initial conditions that lead to peak pressure are then run through MARS-KS for comparison. After this process, the error between the predicted values and the code output is calculated, and the uncertainty of the machine learning model is also quantified to verify its accuracy. The machine learning model presented in this paper successfully identifies combinations of initial conditions that produce a more conservative peak pressure than the values calculated with existing methodologies.
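The surrogate-plus-simulated-annealing loop can be sketched compactly. This is an illustration only: the "peak pressure" data below is a synthetic stand-in for MARS-KS output, and scikit-learn's `GradientBoostingRegressor` stands in for XGBoost so the sketch is self-contained.

```python
import math
import random

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

random.seed(0)
rng = np.random.default_rng(0)

# Synthetic stand-in for code-generated data: "peak pressure" vs. 3 initial conditions
X = rng.uniform(size=(200, 3))
y = 150 + 30 * X[:, 0] - 10 * X[:, 1] * X[:, 2] + rng.normal(0, 1, 200)

def score(params):
    """Cross-validated R^2 for one hyperparameter setting."""
    model = GradientBoostingRegressor(
        max_depth=params["max_depth"], learning_rate=params["lr"], random_state=0
    )
    return cross_val_score(model, X, y, cv=3).mean()

# Simulated annealing over two hyperparameters
cur = {"max_depth": 3, "lr": 0.1}
cur_score = score(cur)
best, best_score, temp = dict(cur), cur_score, 1.0
for _ in range(15):
    cand = {
        "max_depth": max(1, cur["max_depth"] + random.choice([-1, 0, 1])),
        "lr": min(0.5, max(0.01, cur["lr"] * random.choice([0.7, 1.0, 1.4]))),
    }
    s = score(cand)
    # Always accept improvements; accept worse moves with temperature-dependent probability
    if s > cur_score or random.random() < math.exp((s - cur_score) / temp):
        cur, cur_score = cand, s
        if s > best_score:
            best, best_score = dict(cand), s
    temp *= 0.9
```

The annealing schedule (`temp *= 0.9`) and move set here are arbitrary choices for the sketch; the paper's actual search space and cooling schedule are not specified in the abstract.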

A Study on Comparison of Lung Cancer Prediction Using Ensemble Machine Learning

  • NAM, Yu-Jin;SHIN, Won-Ji
    • Korean Journal of Artificial Intelligence / v.7 no.2 / pp.19-24 / 2019
  • Lung cancer is a chronic disease that ranks fourth in cancer incidence, accounting for 11 percent of all cancer cases in Korea. To address this, there is active research on the usefulness and utilization of Clinical Decision Support Systems (CDSS) that employ machine learning. This study therefore reviews existing work on artificial intelligence techniques applicable to lung cancer diagnosis, and examines the applicability of machine learning to lung cancer determination through comparison and analysis using Azure ML, provided by Microsoft. The results show different predictions from three algorithms: Support Vector Machine (SVM), Two-Class Decision Jungle, and Multiclass Decision Jungle. The study is limited by the size of the data used for machine learning: although the Kaggle dataset employed is the most suitable available, its small absolute size likely prevents the models from learning fully. It is therefore claimed that, with institutional cooperation in follow-up research, comparing and analyzing a wider range of algorithms than those used here could yield a more accurate lung cancer screening model.
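The comparison the study performs in Azure ML can be approximated locally. This sketch uses a synthetic classification dataset and scikit-learn models as stand-ins (decision jungles are an Azure ML-specific ensemble of decision DAGs; a random forest is used here as a rough analogue).

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a lung-cancer screening dataset
X, y = make_classification(n_samples=300, n_features=10, n_informative=5, random_state=0)

models = {
    "SVM": SVC(),
    "Decision tree": DecisionTreeClassifier(random_state=0),
    "Random forest": RandomForestClassifier(random_state=0),  # rough decision-jungle analogue
}
# 5-fold cross-validated accuracy for each candidate algorithm
scores = {name: cross_val_score(m, X, y, cv=5).mean() for name, m in models.items()}
```

Cross-validation matters especially here: with a small dataset like the one the study describes, a single train/test split gives a noisy estimate of each algorithm's accuracy.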

Trend of Utilization of Machine Learning Technology for Digital Healthcare Data Analysis (디지털 헬스케어 데이터 분석을 위한 머신 러닝 기술 활용 동향)

  • Woo, Y.C.;Lee, S.Y.;Choi, W.;Ahn, C.W.;Baek, O.K.
    • Electronics and Telecommunications Trends / v.34 no.1 / pp.98-110 / 2019
  • Machine learning has been applied to medical imaging and has shown excellent recognition rates, and recently there has been much interest in preventive medicine. If data are accessible, machine learning packages can be used easily in digital healthcare fields. However, the data must be prepared in advance, and model evaluation and tuning are required to construct a reliable model; on average, these processes take more than 80% of the total effort. In this study, we describe the basic concepts of machine learning, preprocessing and visualization of datasets, feature engineering for reliable models, model evaluation and tuning, and the latest trends in popular machine learning frameworks. Finally, we survey an explainable machine learning analysis tool and discuss the future direction of machine learning.
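The preparation-heavy workflow described above (imputation, scaling, model fit, evaluation) is commonly expressed as a single pipeline so that preprocessing is refit on each training fold. A minimal sketch with hypothetical health-record features:

```python
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical numeric health-record features with ~10% missing entries
X = rng.normal(size=(200, 6))
X[rng.random(X.shape) < 0.1] = np.nan
y = (rng.normal(size=200) + np.nan_to_num(X[:, 0]) > 0).astype(int)  # synthetic label

pipe = Pipeline([
    ("impute", SimpleImputer(strategy="median")),  # data preparation...
    ("scale", StandardScaler()),                   # ...happens inside each CV fold
    ("model", LogisticRegression()),
])
acc = cross_val_score(pipe, X, y, cv=5).mean()
```

Bundling the steps this way is what prevents information from the held-out fold leaking into the imputer and scaler statistics.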

Analysis of Open-Source Hyperparameter Optimization Software Trends

  • Lee, Yo-Seob;Moon, Phil-Joo
    • International Journal of Advanced Culture Technology / v.7 no.4 / pp.56-62 / 2019
  • Recently, research using artificial neural networks has expanded beyond improving inference accuracy into neural network optimization and automatic architecture construction. The performance of a machine learning algorithm depends on how its hyperparameters are configured, so open-source hyperparameter optimization software can be an important step toward improving that performance. In this paper, we review open-source hyperparameter optimization software.
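As a baseline for what such software automates, randomized hyperparameter search is built into scikit-learn; dedicated optimizers add smarter samplers and pruning on top of the same loop. A self-contained sketch on a synthetic regression task:

```python
from scipy.stats import randint
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import RandomizedSearchCV

X, y = make_regression(n_samples=200, n_features=8, noise=5.0, random_state=0)

# Sample 10 random configurations from the given distributions, score each by 3-fold CV
search = RandomizedSearchCV(
    RandomForestRegressor(random_state=0),
    param_distributions={
        "n_estimators": randint(20, 200),
        "max_depth": randint(2, 10),
    },
    n_iter=10, cv=3, random_state=0,
)
search.fit(X, y)
best_params, best_score = search.best_params_, search.best_score_
```

The packages surveyed in the paper differ mainly in how the next configuration is chosen (random, Bayesian, evolutionary) and in how unpromising trials are cut short.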

Single Antenna Based GPS Signal Reception Condition Classification Using Machine Learning Approaches

  • Sanghyun Kim;Seunghyeon Park;Jiwon Seo
    • Journal of Positioning, Navigation, and Timing / v.12 no.2 / pp.149-155 / 2023
  • In urban areas, it can be difficult to utilize global navigation satellite systems (GNSS) due to signal reflections and blockages. It is thus crucial to detect reflected or blocked signals, because they lead to significant degradation of GNSS positioning accuracy. In a previous study, a classifier for global positioning system (GPS) signal reception conditions was developed using three features and the support vector machine (SVM) algorithm, but its classification performance was limited. Therefore, in this study, we developed an improved machine learning based method of classifying GPS signal reception conditions by adding a feature to the existing three and applying various machine learning classification algorithms. As a result, when tested on datasets collected in environments different from the training environment, classification accuracy improved by nine percentage points over the existing method, reaching up to 58%.
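A per-signal classifier of this kind can be sketched as follows. The four features and the class-dependent statistics below are hypothetical illustrations (the abstract does not name the features), chosen to resemble quantities commonly used for reception-condition classification.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(2)

# Classes: 0 = line-of-sight, 1 = reflected (multipath), 2 = blocked/NLOS
n = 300
labels = rng.integers(0, 3, n)
# Hypothetical features whose distributions shift with reception condition
cn0 = 45 - 8 * labels + rng.normal(0, 3, n)      # carrier-to-noise density (dB-Hz)
elev = 50 - 10 * labels + rng.normal(0, 15, n)   # satellite elevation (deg)
resid = 2 + 5 * labels + rng.normal(0, 2, n)     # pseudorange residual (m)
extra = 1 + 3 * labels + rng.normal(0, 1.5, n)   # hypothetical fourth feature
X = np.column_stack([cn0, elev, resid, extra])

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=2)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

The study's key evaluation detail carries over directly: the honest accuracy figure comes from testing in environments unlike the training one, which this synthetic split cannot reproduce.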

Guideline on Security Measures and Implementation of Power System Utilizing AI Technology (인공지능을 적용한 전력 시스템을 위한 보안 가이드라인)

  • Choi, Inji;Jang, Minhae;Choi, Moonsuk
    • KEPCO Journal on Electric Power and Energy / v.6 no.4 / pp.399-404 / 2020
  • There are many attempts to apply AI technology to diagnose facilities or improve work efficiency in the power industry, and the emergence of new machine learning technologies such as deep learning is accelerating the digital transformation of the power sector. The problem is that traditional power systems face security risks when adopting state-of-the-art AI systems: the convergence of the two exposes the power system to new cybersecurity threats and vulnerabilities. This paper deals with security measures and their implementation for power systems that use machine learning. Through building a commercial facility operations forecasting system based on machine learning over power big data, it identifies and addresses the security vulnerabilities that must be compensated for to protect customer information and power system safety. Furthermore, it provides security guidelines by generalizing the security measures to be considered when applying AI.

Compact Modeling for Nanosheet FET Based on TCAD-Machine Learning (TCAD-머신러닝 기반 나노시트 FETs 컴팩트 모델링)

  • Junhyeok Song;Wonbok Lee;Jonghwan Lee
    • Journal of the Semiconductor & Display Technology / v.22 no.4 / pp.136-141 / 2023
  • The continuous shrinking of transistors in integrated circuits makes further performance improvement difficult, motivating emerging devices such as nanosheet field-effect transistors. In this paper, we propose a TCAD-machine learning framework for modeling the current-voltage characteristics of nanosheet FETs. Sentaurus TCAD simulations of nanosheet FETs are performed to obtain a large amount of device data, and a machine learning model of the I-V characteristics is trained on these TCAD data using a multi-layer perceptron. The weights and biases obtained from the multi-layer perceptron are implemented in a PSPICE netlist to verify the accuracy of the I-V curves and the DC transfer characteristics of a CMOS inverter. The proposed machine learning model is found to be applicable to predicting nanosheet FET device and circuit performance.
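The TCAD-to-MLP step can be sketched as a small regression problem. The I-V surface below is a crude square-law-like toy function standing in for Sentaurus TCAD output, not a calibrated nanosheet device model; exporting the fitted weights and biases to a PSPICE netlist, as the paper does, is omitted.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)

# Synthetic stand-in for TCAD samples: drain current vs. (Vgs, Vds)
vgs = rng.uniform(0.0, 1.0, 800)
vds = rng.uniform(0.0, 1.0, 800)
# Toy I-V surface: threshold ~0.3 V, saturation via tanh; not a real device model
ids = np.maximum(vgs - 0.3, 0) ** 2 * np.tanh(5 * vds)
X = np.column_stack([vgs, vds])

# Small MLP fits the smooth 2-D surface; weights/biases could then be exported
mlp = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=3),
).fit(X, ids)
r2 = mlp.score(X, ids)
```

Because the trained network is just matrix multiplies and activation functions, its parameters can be written out as controlled sources in a netlist, which is what lets the compact model run inside a circuit simulator.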
