• Title/Summary/Keyword: artificial intelligence-based model


Systemic Analysis of Research Activities and Trends Related to Artificial Intelligence(A.I.) Technology Based on Latent Dirichlet Allocation (LDA) Model (Latent Dirichlet Allocation (LDA) 모델 기반의 인공지능(A.I.) 기술 관련 연구 활동 및 동향 분석)

  • Chung, Myoung Sug;Lee, Joo Yeoun
    • Journal of Korea Society of Industrial Information Systems / v.23 no.3 / pp.87-95 / 2018
  • Recently, with the technological development of artificial intelligence, the related market has been expanding rapidly. In the artificial intelligence technology field, which is still at an early stage yet growing quickly, it is important to reduce uncertainty about research directions and investment areas. Therefore, this study examined technology trends using text mining and topic modeling, two big data analysis methods, and identified core technologies and their future growth potential. We hope that the results of this study will help researchers understand artificial intelligence technology trends and offer new implications for future research directions.
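The topic-modeling step this abstract relies on is Latent Dirichlet Allocation. As an illustration only (the paper does not publish code), the sketch below implements a minimal collapsed Gibbs sampler for LDA in pure Python on a toy four-document corpus; the corpus, hyperparameters, and all names are hypothetical.

```python
import random
from collections import defaultdict

def lda_gibbs(docs, n_topics=2, alpha=0.1, beta=0.01, iters=200, seed=0):
    """Collapsed Gibbs sampler for LDA over tokenized documents."""
    rng = random.Random(seed)
    V = len({w for d in docs for w in d})              # vocabulary size
    ndk = [[0] * n_topics for _ in docs]               # doc-topic counts
    nkw = [defaultdict(int) for _ in range(n_topics)]  # topic-word counts
    nk = [0] * n_topics                                # tokens per topic
    z = []                                             # topic of each token
    for d, doc in enumerate(docs):                     # random initialization
        zs = []
        for w in doc:
            t = rng.randrange(n_topics)
            zs.append(t); ndk[d][t] += 1; nkw[t][w] += 1; nk[t] += 1
        z.append(zs)
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                t = z[d][i]                            # remove token's counts
                ndk[d][t] -= 1; nkw[t][w] -= 1; nk[t] -= 1
                # full conditional p(z_i = t | all other assignments)
                weights = [(ndk[d][t2] + alpha) * (nkw[t2][w] + beta) / (nk[t2] + V * beta)
                           for t2 in range(n_topics)]
                r = rng.random() * sum(weights)
                new_t = n_topics - 1
                for t2, wt in enumerate(weights):
                    r -= wt
                    if r <= 0:
                        new_t = t2
                        break
                z[d][i] = new_t
                ndk[d][new_t] += 1; nkw[new_t][w] += 1; nk[new_t] += 1
    return ndk, nkw

# Toy corpus: two documents about AI research, two about the AI market
docs = [["ai", "model", "learning"], ["ai", "learning", "data"],
        ["market", "invest", "growth"], ["invest", "market", "trend"]]
doc_topics, topic_words = lda_gibbs(docs, n_topics=2)
```

In practice a library such as gensim would be used on a real paper corpus; this sketch only shows the mechanics of sampling topic assignments from the count-based full conditional.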

Study on Machine Learning Techniques for Malware Classification and Detection

  • Moon, Jaewoong;Kim, Subin;Song, Jaeseung;Kim, Kyungshin
    • KSII Transactions on Internet and Information Systems (TIIS) / v.15 no.12 / pp.4308-4325 / 2021
  • The importance and necessity of artificial intelligence, particularly machine learning, has recently been emphasized. In fact, artificial intelligence, as in intelligent surveillance cameras and other security systems, is used to solve various problems or provide convenience, offering solutions to problems that humans traditionally had to deal with manually, one at a time. Among these domains, information security particularly needs artificial intelligence because the frequency and volume of malicious code exceed human processing capacity. Therefore, this study examines the definitions of artificial intelligence and machine learning, their execution methods, processes, and learning algorithms, and cases of their use in various domains, particularly in information security. Based on this, the study proposes a method for applying machine learning to the classification and detection of malware, which has increased rapidly in recent years. The proposed methodology converts software programs containing malicious code into images and creates training data suitable for machine learning by preparing and augmenting the dataset. A model trained on the images created in this manner is expected to be effective in classifying and detecting malware.
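The binary-to-image conversion the abstract describes can be sketched simply: each byte becomes one grayscale pixel and the byte stream is wrapped at a fixed width, with a flip-based augmentation standing in for dataset enlargement. This is a minimal illustration of the general technique, not the authors' pipeline; the width, zero-padding, and augmentation choices are assumptions.

```python
def bytes_to_grayscale(payload: bytes, width: int = 16):
    """Map a binary into a fixed-width grayscale matrix (one byte = one pixel, 0-255)."""
    rows = []
    for i in range(0, len(payload), width):
        chunk = list(payload[i:i + width])
        chunk += [0] * (width - len(chunk))   # zero-pad the final row
        rows.append(chunk)
    return rows

def augment(image):
    """Trivial augmentation: a horizontal flip doubles the training set."""
    return [image, [row[::-1] for row in image]]

sample = bytes(range(40))                     # stand-in for a malware binary
image = bytes_to_grayscale(sample, width=16)
train_set = augment(image)
```

A real pipeline would read actual PE/ELF files, resize the matrices to a fixed shape, and feed them to a convolutional classifier; the point here is only that byte layout becomes visual texture a model can learn from.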

A study on Improving the Performance of Anti - Drone Systems using AI (인공지능(AI)을 활용한 드론방어체계 성능향상 방안에 관한 연구)

  • Hae Chul Ma;Jong Chan Moon;Jae Yong Park;Su Han Lee;Hyuk Jin Kwon
    • Journal of the Korean Society of Systems Engineering / v.19 no.2 / pp.126-134 / 2023
  • Drones are emerging as a new security threat, and countries around the world are working to counter them. Detection and identification are the most difficult and important parts of an anti-drone system. Existing detection and identification methods each have strengths and weaknesses, so they must be operated in a complementary manner. Detection and identification performance in anti-drone systems can be improved by using artificial intelligence, because artificial intelligence can quickly analyze differences too subtle for humans to perceive. There are three ways to utilize artificial intelligence. First, reinforcement learning-based physical control can reduce the noise and blur generated when an optical camera tracks a drone, improving tracking stability. Second, the latest NeRF algorithms can be used to address the shortage of enemy drone data. Third, a data network should be built so that data can be collected and managed efficiently; in addition, model performance can be improved by regularly generating artificial intelligence training data.

Development of Artificial Intelligence Constitutive Equation Model Using Deep Learning (딥 러닝을 이용한 인공지능 구성방정식 모델의 개발)

  • Moon, H.B.;Kang, G.P.;Lee, K.;Kim, Y.H.
    • Transactions of Materials Processing / v.30 no.4 / pp.186-194 / 2021
  • Finite element simulation is widely applied for practical purposes in various metal forming processes. However, simulating the elasto-plastic behavior of porous materials, or running crystal-plasticity-coupled multi-scale simulations, requires long calculation times, which limits practical application. A machine learning model that directly outputs the constitutive response without iterative calculations would greatly reduce simulation time. In this study, we examined the feasibility of an artificial intelligence-based constitutive equation that takes the existing state variables and the current velocity field as input. To introduce the methodology, we described, as a preliminary study via rigid-plastic finite element simulation, the process of obtaining the training data, the machine learning process, and the coupling of the machine learning model with the commercial software DEFORM™.
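As a toy stand-in for the learned constitutive model described above (the paper couples its model to DEFORM; this sketch does not), the following pure-Python one-hidden-layer network learns a normalized Hollomon-type flow stress curve σ/K = ε^n from sampled points. The hidden size, learning rate, and the σ = K·ε^0.2 target curve are illustrative assumptions, not the paper's data.

```python
import math, random

def train_flow_stress_net(samples, hidden=8, lr=0.05, epochs=3000, seed=0):
    """One-hidden-layer tanh network: strain -> normalized flow stress, plain SGD."""
    rng = random.Random(seed)
    w1 = [rng.uniform(-1.0, 1.0) for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-1.0, 1.0) for _ in range(hidden)]
    b2 = 0.0

    def forward(x):
        h = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
        return h, sum(w2[j] * h[j] for j in range(hidden)) + b2

    for _ in range(epochs):
        for x, y in samples:
            h, out = forward(x)
            err = out - y                      # gradient of 0.5 * err**2 w.r.t. out
            b2 -= lr * err
            for j in range(hidden):
                grad_h = err * w2[j] * (1.0 - h[j] ** 2)   # backprop through tanh
                w2[j] -= lr * err * h[j]
                w1[j] -= lr * grad_h * x
                b1[j] -= lr * grad_h
    return lambda x: forward(x)[1]

# Hypothetical Hollomon hardening sigma = K * eps**0.2, targets normalized by K
samples = [(eps / 100.0, (eps / 100.0) ** 0.2) for eps in range(5, 51, 5)]
model = train_flow_stress_net(samples)
```

Once trained, such a surrogate is a single forward pass per material point, which is why replacing an iterative constitutive update with a network can cut simulation time.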

Development of Predictive Model for Length of Stay(LOS) in Acute Stroke Patients using Artificial Intelligence (인공지능을 이용한 급성 뇌졸중 환자의 재원일수 예측모형 개발)

  • Choi, Byung Kwan;Ham, Seung Woo;Kim, Chok Hwan;Seo, Jung Sook;Park, Myung Hwa;Kang, Sung-Hong
    • Journal of Digital Convergence / v.16 no.1 / pp.231-242 / 2018
  • Efficient management of length of stay (LOS) is important for hospitals: it reduces medical costs for patients and increases profitability for hospitals. To manage LOS efficiently, it is necessary to develop an artificial intelligence-based prediction model that supports hospitals in benchmarking and in finding ways to reduce LOS. To develop a predictive model of LOS for acute stroke patients, acute stroke cases were extracted from the 2013 and 2014 discharged-patient injury data. The data were split 60% for training and 40% for evaluation. For model development, we used a traditional regression technique (multiple regression analysis), artificial intelligence techniques (interactive decision tree and neural network), and an ensemble technique that integrates them all. Models were evaluated with the Root ASE (root average squared error) index: 23.7 for multiple regression, 23.7 for the interactive decision tree, 22.7 for the neural network, and 22.7 for the ensemble technique. The neural network, an artificial intelligence technique, was found to be superior. This demonstrates the utility of artificial intelligence in developing LOS prediction models. In the future, research should continue on how to use artificial intelligence techniques more effectively in LOS prediction.
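The evaluation protocol in this abstract (a 60/40 split and comparison of models by a root-squared-error index) can be sketched on synthetic data. The age→LOS relationship below is invented purely for illustration and has no connection to the study's discharge data; it only shows the split-and-compare mechanics, with a one-predictor regression standing in for the paper's models.

```python
import random

def simple_linear_fit(xs, ys):
    """Closed-form least-squares fit of y = a + b*x (one predictor)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def root_ase(pred, ys):
    """Root average squared error over paired predictions and observations."""
    return (sum((p - y) ** 2 for p, y in zip(pred, ys)) / len(ys)) ** 0.5

rng = random.Random(42)
# Synthetic records (age, LOS): LOS loosely increases with age, plus noise
data = [(a, 5 + 0.2 * a + rng.gauss(0, 2))
        for a in (rng.randint(40, 90) for _ in range(200))]
rng.shuffle(data)
cut = int(len(data) * 0.6)                    # 60% training / 40% evaluation
train, test = data[:cut], data[cut:]
a, b = simple_linear_fit([x for x, _ in train], [y for _, y in train])
baseline = sum(y for _, y in train) / len(train)
rmse_reg = root_ase([a + b * x for x, _ in test], [y for _, y in test])
rmse_mean = root_ase([baseline] * len(test), [y for _, y in test])
```

On held-out data, the fitted regression should beat the constant-mean baseline, which is the same comparison logic the study applies across regression, decision tree, neural network, and ensemble models.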

Development of Artificial Intelligence Model for Outlet Temperature of Vaporizer (기화 설비의 토출 온도 예측을 위한 인공지능 모델 개발)

  • Lee, Sang-Hyun;Cho, Gi-Jung;Shin, Jong-Ho
    • Journal of Korean Society of Industrial and Systems Engineering / v.44 no.2 / pp.85-92 / 2021
  • An ambient air vaporizer (AAV) is an essential facility in the natural gas production process; it uses atmospheric air as the heat-exchange medium to vaporize liquefied natural gas into its gaseous state. An AAV is more economical and eco-friendly than the previously used submerged vaporizer (SMV) and open-rack vaporizer (ORV) because it consumes less energy. However, AAVs are rarely applied in actual processes because they are heavily affected by external conditions such as atmospheric temperature and humidity. With insufficient operational experience, facility operation relies on operator intuition and is very inefficient. To address these challenges, this paper proposes an artificial intelligence-based model that enables intelligent AAV operation based on operational big data. The proposed model uses deep neural networks, and its superiority is verified by comparison with multiple regression analysis. Simulations based on data collected from a real-world process show a 48.8% decrease in power usage compared with previous operation. The techniques proposed in this paper can improve the energy efficiency of the current natural gas production process and can be applied to other processes in the future.

Primary Study for dialogue based on Ordering Chatbot

  • Kim, Ji-Ho;Park, JongWon;Moon, Ji-Bum;Lee, Yulim;Yoon, Andy Kyung-yong
    • Journal of Multimedia Information System / v.5 no.3 / pp.209-214 / 2018
  • Today is the era of artificial intelligence. With its development, machines have begun to take on various human characteristics. A chatbot is one instance of such interactive artificial intelligence: a computer program that can conduct natural conversations with people. While chatbots have traditionally conversed in text, the chatbot in this study evolves to execute commands based on speech recognition. For a chatbot to emulate human dialogue well, it must analyze each sentence correctly and extract an appropriate response. To accomplish this, sentences are classified into three types: objects, actions, and preferences. This study shows how objects are analyzed and processed, and demonstrates the possibility of evolving from an elementary model to an advanced intelligent system. The study also evaluates whether a speech-recognition-based chatbot improves order-processing time compared with a text-based chatbot. Speech-recognition-based chatbots have the potential to automate customer service and reduce human effort.
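The three-way sentence decomposition into objects, actions, and preferences can be illustrated with a minimal rule-based parser. The coffee-ordering vocabulary below is hypothetical; a real system would sit behind a speech recognizer and use much richer language analysis than keyword lookup.

```python
# Hypothetical vocabularies for a coffee-ordering domain
OBJECTS = {"americano", "latte", "espresso"}
ACTIONS = {"order", "cancel", "add"}
PREFERENCES = {"iced", "hot", "large", "small", "decaf"}

def parse_order(utterance: str):
    """Sort the tokens of a recognized utterance into the three sentence types."""
    slots = {"objects": [], "actions": [], "preferences": []}
    for token in utterance.lower().split():
        if token in OBJECTS:
            slots["objects"].append(token)
        elif token in ACTIONS:
            slots["actions"].append(token)
        elif token in PREFERENCES:
            slots["preferences"].append(token)
    return slots

result = parse_order("Order one iced americano")
```

Tokens outside all three vocabularies ("one" above) are simply ignored, which mirrors how a slot-filling chatbot extracts only the fields it needs to act on a command.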

CRFNet: Context ReFinement Network used for semantic segmentation

  • Taeghyun An;Jungyu Kang;Dooseop Choi;Kyoung-Wook Min
    • ETRI Journal / v.45 no.5 / pp.822-835 / 2023
  • Recent semantic segmentation frameworks usually combine low-level and high-level context information to achieve improved performance; post-level context information is also considered. In this study, we present a Context ReFinement Network (CRFNet) and its training method to improve the semantic predictions of segmentation models with an encoder-decoder structure. Our study is based on postprocessing that directly considers the relationship between spatially neighboring pixels of a label map, as in Markov and conditional random fields. CRFNet comprises two modules, a refiner and a combiner, which respectively refine the context information from the output features of a conventional semantic segmentation model and combine the refined features with intermediate features from the decoding process to produce the final output. To train CRFNet to refine the semantic predictions more accurately, we propose a sequential training scheme. Using various backbone networks (ENet, ERFNet, and HyperSeg), we extensively evaluated our model on three large-scale real-world datasets to demonstrate the effectiveness of our approach.
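CRFNet itself is a trained network, but the underlying idea, refining a label map by exploiting agreement between spatially neighboring pixels as Markov/conditional random fields do, can be illustrated with a simple majority-vote smoothing pass. This toy sketch is not the authors' refiner/combiner architecture; it only shows the neighborhood-consistency principle the abstract appeals to.

```python
def refine_label_map(labels, iters=1):
    """One CRF-flavoured refinement pass: each pixel adopts the majority label
    of its 4-neighbourhood plus itself; ties keep the current label."""
    h, w = len(labels), len(labels[0])
    for _ in range(iters):
        out = [row[:] for row in labels]
        for y in range(h):
            for x in range(w):
                votes = {}
                for dy, dx in ((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        lab = labels[ny][nx]
                        votes[lab] = votes.get(lab, 0) + 1
                best = max(votes.values())
                if votes.get(labels[y][x], 0) < best:
                    out[y][x] = max(votes, key=votes.get)
        labels = out
    return labels

# An isolated mislabeled pixel in a uniform region gets smoothed away
noisy = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]
refined = refine_label_map(noisy)
```

Real CRF postprocessing (and CRFNet's learned variant) operates on class scores rather than hard labels, but the effect is the same: isolated inconsistent predictions are pulled toward their spatial context.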

A Study on the Development of a Chatbot Using Generative AI to Provide Diets for Diabetic Patients

  • Ha-eun LEE;Jun Woo CHOI;Sung Lyul PARK;Min Soo KANG
    • Korean Journal of Artificial Intelligence / v.12 no.3 / pp.25-31 / 2024
  • The purpose of this study is to develop a sophisticated web-based artificial intelligence chatbot system designed to provide personalized dietary service for diabetic patients. According to a 2022 study, the prevalence of diabetes among individuals over 30 years old was 15.6% in 2020, identifying it as a significant societal issue with an increasing patient population. This study uses generative AI algorithms to tailor dietary recommendations for the elderly and various social classes, contributing to the maintenance of healthy eating habits and disease prevention. Through meticulous fine-tuning, the learning loss of the AI model was significantly reduced, nearing zero, demonstrating the chatbot's potential to offer precise dietary suggestions based on calorie intake and seasonal variations. As this technology adapts to diverse health conditions, ongoing research is crucial to enhance the accessibility of dietary information for the elderly, thereby promoting healthy eating practices and supporting disease prevention.

Time-Series Estimation based AI Algorithm for Energy Management in a Virtual Power Plant System

  • Yeonwoo LEE
    • Korean Journal of Artificial Intelligence / v.12 no.1 / pp.17-24 / 2024
  • This paper introduces a novel approach to time-series estimation for energy load forecasting within virtual power plant (VPP) systems, leveraging advanced artificial intelligence (AI) algorithms, namely Long Short-Term Memory (LSTM) and the Seasonal Autoregressive Integrated Moving Average (SARIMA) model. Virtual power plants, which integrate diverse microgrids managed by energy management systems (EMS), require precise forecasting techniques to balance energy supply and demand efficiently. The paper introduces a hybrid forecasting model that combines a parametric statistical technique with an AI algorithm. The LSTM algorithm is employed to discern pattern correlations over fixed intervals, which is crucial for predicting future energy loads accurately. SARIMA is applied to generate time-series forecasts, accounting for non-stationary and seasonal variations. The forecasting model incorporates a broad spectrum of distributed energy resources, including renewable energy sources and conventional power plants. Data spanning a decade, sourced from the Korea Power Exchange (KPX) Electrical Power Statistical Information System (EPSIS), were used to validate the model. The proposed hybrid LSTM-SARIMA model with parameter sets (1, 1, 1, 12) and (2, 1, 1, 12) demonstrated high fidelity to the observed data. It is therefore concluded that the optimized system notably surpasses traditional forecasting methods, and that this model offers a viable solution for EMS to enhance short-term load forecasting.
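The seasonal structure that the SARIMA component (orders such as (1, 1, 1, 12)) captures can be illustrated without any libraries: a seasonal-naive baseline repeats the last season and adds the average season-over-season change, which is what one seasonal difference (D = 1 at period s = 12) models. This is a baseline sketch, not the paper's hybrid LSTM-SARIMA model, and the series below is synthetic.

```python
def seasonal_naive_forecast(series, season=12, steps=12):
    """Seasonal-naive baseline: repeat the last observed season, shifted by
    the mean season-over-season drift (the D=1 seasonal difference)."""
    n = len(series)
    drift = sum(series[i] - series[i - season] for i in range(season, n)) / (n - season)
    last_season = series[-season:]
    return [last_season[i % season] + drift * (i // season + 1) for i in range(steps)]

# Synthetic monthly-style load: strong 12-step seasonality plus a linear trend
history = [10 * (i % 12) + i for i in range(48)]
forecast = seasonal_naive_forecast(history)
```

A full SARIMA fit (e.g. via statsmodels' SARIMAX) adds autoregressive and moving-average terms on top of this differencing; baselines like the one above are what such models must beat to justify their complexity.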