• Title/Summary/Keyword: machine learning applications

Search results: 538

A Study on the Development of Electronic Mail-based Customer Relationship Management System (전자메일 기반의 고객관계관리(CRM) 시스템 개발에 관한 연구)

  • 김승욱;양광민
    • Journal of Information Technology Applications and Management
    • /
    • v.10 no.4
    • /
    • pp.51-63
    • /
    • 2003
  • This study designs and implements a new approach to classifying e-mail requests from customers based on machine learning techniques. Building an e-mail classifier can be cast into the framework of text classification, since an e-mail is viewed as a document and the judgement of interest is viewed as a class label assigned to the e-mail document. We also implemented an e-mail-based automated response system that integrates with a call center for practical use.

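
The classification setup described above (an e-mail as a document, the request category as its class label) can be sketched with a tiny multinomial Naive Bayes router in plain Python. The classifier choice, the categories, and the sample e-mails are illustrative assumptions, since the abstract does not name a specific learner:

```python
# Minimal sketch: route customer e-mails to request categories with a
# multinomial Naive Bayes text classifier (a stand-in for the paper's
# unspecified learner). Categories and training e-mails are made up.
import math
from collections import Counter, defaultdict

class NaiveBayesRouter:
    def __init__(self):
        self.class_counts = Counter()            # e-mails seen per class
        self.word_counts = defaultdict(Counter)  # word frequencies per class
        self.vocab = set()

    def train(self, emails):
        for text, label in emails:
            self.class_counts[label] += 1
            for word in text.lower().split():
                self.word_counts[label][word] += 1
                self.vocab.add(word)

    def classify(self, text):
        total = sum(self.class_counts.values())
        best_label, best_score = None, float("-inf")
        for label in self.class_counts:
            # log prior + sum of log likelihoods with add-one smoothing
            score = math.log(self.class_counts[label] / total)
            denom = sum(self.word_counts[label].values()) + len(self.vocab)
            for word in text.lower().split():
                score += math.log((self.word_counts[label][word] + 1) / denom)
            if score > best_score:
                best_label, best_score = label, score
        return best_label

router = NaiveBayesRouter()
router.train([
    ("my order has not arrived yet", "delivery"),
    ("where is my package", "delivery"),
    ("i was charged twice on my card", "billing"),
    ("refund the duplicate payment", "billing"),
])
print(router.classify("package still not arrived"))  # delivery
```

A production version would add tokenization, stop-word handling, and a rejection threshold before routing to the call center.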

Analysis of Cloud Service Providers

  • Lee, Yo-Seob
    • International Journal of Advanced Culture Technology
    • /
    • v.9 no.3
    • /
    • pp.315-320
    • /
    • 2021
  • Currently, cloud computing is a technology that is greatly changing the IT field. For many businesses, numerous cloud services are available in the form of customizable, reliable, and cost-effective web applications. Most cloud service providers offer functions such as IoT, machine learning, AI services, blockchain, AR & VR, mobile services, and containers in addition to basic cloud services that support the scalability of processors, memory, and storage. In this paper, we look at the most widely used cloud service providers and compare the services they provide.

Feature Selection for Anomaly Detection Based on Genetic Algorithm (유전 알고리즘 기반의 비정상 행위 탐지를 위한 특징선택)

  • Seo, Jae-Hyun
    • Journal of the Korea Convergence Society
    • /
    • v.9 no.7
    • /
    • pp.1-7
    • /
    • 2018
  • Feature selection, a data preprocessing technique, is a major research area in applications dealing with large datasets. It has been used in pattern recognition, machine learning, and data mining, and is now widely applied in fields such as text classification, image retrieval, intrusion detection, and genome analysis. The proposed method is based on a genetic algorithm, a meta-heuristic algorithm. There are two approaches to finding feature subsets: filter methods and wrapper methods. In this study, we use a wrapper method, which evaluates feature subsets using a real classifier, to find an optimal feature subset. The training dataset used in the experiment has a severe class imbalance, making it difficult to improve classification performance for rare classes. After preprocessing the training dataset with SMOTE, we select features and evaluate them with various machine learning algorithms.
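
A minimal sketch of the wrapper approach: a genetic algorithm searches over feature subsets (bit masks), and each subset is scored by actually evaluating a classifier on it. A leave-one-out 1-nearest-neighbour classifier and a made-up four-feature toy dataset stand in for the paper's classifier and intrusion-detection data, and the SMOTE step is omitted:

```python
import random

# Toy dataset: features 0 and 1 separate the classes; features 2 and 3 are noise.
random.seed(0)
DATA = [([i, i, random.random(), random.random()], 0) for i in range(10)] + \
       [([i + 20, i + 20, random.random(), random.random()], 1) for i in range(10)]

def fitness(mask):
    """Wrapper score: leave-one-out 1-NN accuracy on the selected features."""
    if not any(mask):
        return 0.0
    correct = 0
    for i, (x, y) in enumerate(DATA):
        best_d, best_y = float("inf"), None
        for j, (x2, y2) in enumerate(DATA):
            if i == j:
                continue
            d = sum((a - b) ** 2 for a, b, m in zip(x, x2, mask) if m)
            if d < best_d:
                best_d, best_y = d, y2
        correct += best_y == y
    return correct / len(DATA)

def evolve(n_features=4, pop_size=8, generations=10):
    # Seed the population with the all-features mask plus random masks.
    pop = [[1] * n_features] + [[random.randint(0, 1) for _ in range(n_features)]
                                for _ in range(pop_size - 1)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, n_features)  # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:              # bit-flip mutation
                k = random.randrange(n_features)
                child[k] ^= 1
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

Because the top half of each generation survives unchanged, the best subset found is never worse than the all-features baseline.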

An Experimental Evaluation of Box office Revenue Prediction through Social Bigdata Analysis and Machine Learning (소셜 빅데이터 분석과 기계학습을 이용한 영화흥행예측 기법의 실험적 평가)

  • Chang, Jae-Young
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.17 no.3
    • /
    • pp.167-173
    • /
    • 2017
  • With increased interest in the fourth industrial revolution, represented by artificial intelligence, big data and machine learning techniques have been utilized very actively in almost every area of society, as realized by the development of forecasting systems in various applications. In the movie industry especially, there have been numerous attempts to predict whether a film will succeed. In the past, most studies considered only static factors in the prediction process, but recently several efforts have tried to utilize real-time social big data produced on SNS. In this paper, we propose a prediction technique utilizing feedback information such as news articles, blogs, and reviews as well as the static factors of movies. Additionally, we experimentally evaluate whether the proposed technique can precisely forecast revenue, targeting relatively successful movies.

Applications of Machine Learning Models for the Estimation of Reservoir CO2 Emissions (저수지 CO2 배출량 산정을 위한 기계학습 모델의 적용)

  • Yoo, Jisu;Chung, Se-Woong;Park, Hyung-Seok
    • Journal of Korean Society on Water Environment
    • /
    • v.33 no.3
    • /
    • pp.326-333
    • /
    • 2017
  • Lakes and reservoirs have been reported as important sources of carbon emissions to the atmosphere in many countries. Although field experiments and theoretical investigations based on fundamental gas exchange theory have proposed quantitative estimates of the Net Atmospheric Flux (NAF) in various climate regions, large uncertainties remain in global-scale estimation. Mechanistic models can be used for understanding and estimating the temporal and spatial variations of NAFs, considering the complicated hydrodynamic and biogeochemical processes in a reservoir, but these models require extensive and expensive datasets and model parameters. On the other hand, data-driven machine learning (ML) algorithms are likely alternative tools for estimating NAFs in response to independent environmental variables. The objective of this study was to develop random forest (RF) and multi-layer artificial neural network (ANN) models for the estimation of daily $CO_2$ NAFs in Daecheong Reservoir, located on the Geum River of Korea, and to compare the models' performance against the multiple linear regression (MLR) model proposed in a previous study (Chung et al., 2016). As a result, the RF and ANN models showed much better performance in estimating high NAF values, which the MLR model significantly underestimated. Cross-validation with 10-fold random sampling was applied to evaluate the performance of the three models, and indicated that the ANN model performs best, followed by the RF and MLR models.
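
The 10-fold cross-validation used to compare the three models can be sketched in plain Python. The trivial mean-predictor below stands in for the paper's RF/ANN/MLR models, and the data is synthetic:

```python
import random

def k_fold_cv(data, k, fit, predict):
    """Shuffle, split into k folds, and return the per-fold RMSE."""
    random.seed(42)                        # fixed seed for a reproducible split
    data = data[:]
    random.shuffle(data)
    folds = [data[i::k] for i in range(k)]
    errors = []
    for i in range(k):
        test = folds[i]
        train = [row for j, fold in enumerate(folds) if j != i for row in fold]
        model = fit(train)
        se = [(predict(model, x) - y) ** 2 for x, y in test]
        errors.append((sum(se) / len(se)) ** 0.5)
    return errors

def fit_mean(train):        # trivial baseline "model": predict the mean flux
    return sum(y for _, y in train) / len(train)

def predict_mean(model, x):
    return model

data = [(x, 2.0 * x) for x in range(100)]  # synthetic (predictor, flux) pairs
rmse = k_fold_cv(data, k=10, fit=fit_mean, predict=predict_mean)
print(sum(rmse) / len(rmse))
```

Swapping `fit_mean`/`predict_mean` for real RF, ANN, and MLR fit/predict functions reproduces the comparison protocol.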

Reconstruction of High-Resolution Facial Image Based on Recursive Error Back-Projection of Top-Down Machine Learning (하향식 기계학습의 반복적 오차 역투영에 기반한 고해상도 얼굴 영상의 복원)

  • Park, Jeong-Seon;Lee, Seong-Whan
    • Journal of KIISE:Software and Applications
    • /
    • v.34 no.3
    • /
    • pp.266-274
    • /
    • 2007
  • This paper proposes a new method for reconstructing a high-resolution facial image from a low-resolution one based on top-down machine learning and recursive error back-projection. A face is represented by a linear combination of shape prototypes and texture prototypes. Given the shape and texture information of each pixel in a low-resolution facial image, we can estimate the optimal coefficients for linear combinations of the shape and texture prototypes by solving least-squares minimizations. A high-resolution facial image is then obtained by applying the optimal coefficients to a linear combination of the high-resolution prototypes. In addition, a recursive error back-projection procedure is applied to improve the reconstruction accuracy. The encouraging results show that the proposed method can improve face recognition performance by reconstructing high-resolution facial images from low-resolution images captured at a distance.
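
The core estimation step (fit least-squares coefficients on the low-resolution prototypes, then reuse them on the high-resolution prototypes) can be sketched on a tiny made-up example. The two prototype vectors and the observation are illustrative, and the recursive error back-projection loop is omitted:

```python
# Tiny 2-prototype example: solve min ||c1*p1 + c2*p2 - x||^2 for the
# low-resolution observation x via the 2x2 normal equations, then apply the
# same coefficients to the matching high-resolution prototypes.
low_protos  = [[1.0, 0.0, 1.0], [0.0, 1.0, 1.0]]             # low-res prototypes
high_protos = [[2.0, 0.0, 2.0, 2.0], [0.0, 2.0, 2.0, 0.0]]   # matching high-res

def lstsq_2(p1, p2, x):
    """Least-squares coefficients for two prototypes via the normal equations."""
    a = sum(v * v for v in p1)
    b = sum(u * v for u, v in zip(p1, p2))
    d = sum(v * v for v in p2)
    e = sum(u * v for u, v in zip(p1, x))
    f = sum(u * v for u, v in zip(p2, x))
    det = a * d - b * b
    return (d * e - b * f) / det, (a * f - b * e) / det

x_low = [2.0, 1.0, 3.0]                    # observed low-resolution face vector
c1, c2 = lstsq_2(low_protos[0], low_protos[1], x_low)
x_high = [c1 * u + c2 * v for u, v in zip(*high_protos)]
print(c1, c2, x_high)
```

The back-projection step of the paper would down-sample `x_high`, compare it with `x_low`, and feed the residual back to refine the coefficients.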

Characterization and Classification of Pores in Metal 3D Printing Materials with X-ray Tomography and Machine Learning (X-ray tomography 분석과 기계 학습을 활용한 금속 3D 프린팅 소재 내의 기공 형태 분류)

  • Kim, Eun-Ah;Kwon, Se-Hun;Yang, Dong-Yeol;Yu, Ji-Hun;Kim, Kwon-Ill;Lee, Hak-Sung
    • Journal of Powder Materials
    • /
    • v.28 no.3
    • /
    • pp.208-215
    • /
    • 2021
  • Metal three-dimensional (3D) printing is an important emerging processing method in powder metallurgy, and there are many successful applications of additive manufacturing. However, processing parameters such as laser power and scan speed must still be optimized manually despite the development of artificial intelligence. Automatic calibration using information in an additive manufacturing database is desirable. In this study, 15 commercially pure titanium samples are processed under different conditions, and the 3D pore structures are characterized by X-ray tomography. These samples are readily classified into three categories, unmelted, well melted, or overmelted, depending on the laser energy density. Using more than 10,000 projected images for each category, convolutional neural networks are applied, and almost perfect classification of the samples is obtained. This result demonstrates that machine learning methods based on X-ray tomography can help automatically identify suitable processing parameters.

Application of Big Data and Machine-learning (ML) Technology to Mitigate Contractor's Design Risks for Engineering, Procurement, and Construction (EPC) Projects

  • Choi, Seong-Jun;Choi, So-Won;Park, Min-Ji;Lee, Eul-Bum
    • International conference on construction engineering and project management
    • /
    • 2022.06a
    • /
    • pp.823-830
    • /
    • 2022
  • The risk of project execution increases with the growing scale and complexity of Engineering, Procurement, and Construction (EPC) plant projects. In the fourth industrial revolution era, there is an increasing need to utilize the large amount of data generated during project execution. Design is a key element in the success of an EPC plant project: although design accounts for only about 5% of the total EPC project cost, it is a critical process that affects all subsequent processes, such as construction, installation, and operation & maintenance (O&M). This study aims to develop a system using machine-learning (ML) techniques to predict risks and support decision-making based on big data generated in the design and construction stages of an EPC project. Three main modules were developed: (M1) a design cost estimation module, (M2) a design error check module, and (M3) a change order forecasting module. M1 estimates design cost from project data such as contract amount, construction period, total design cost, and man-hours (M/H). M2 and M3 predict the severity of schedule delays and cost overruns due to design errors and change orders from unstructured text data extracted from engineering documents. A validation test was performed through a case study to verify the model applied to each module. This study is expected to improve the risk response capability of EPC contractors in the design and construction stages.

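
The idea behind module M1 (regressing design cost on project attributes) can be sketched with a one-variable least-squares fit; the contract amounts and design costs below are invented for illustration and are not from the paper:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = m*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return m, my - m * mx

contract = [100.0, 200.0, 300.0, 400.0]   # hypothetical contract amounts
design   = [5.2, 10.1, 15.3, 19.8]        # hypothetical design costs (~5%)
m, b = fit_line(contract, design)
print(m * 500.0 + b)   # estimated design cost for a hypothetical new project
```

The paper's M1 would use an ML regressor over several attributes (contract amount, construction period, man-hours) rather than a single-variable line.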

Development of a Framework for Improvement of Sensor Data Quality from Weather Buoys (해양기상부표의 센서 데이터 품질 향상을 위한 프레임워크 개발)

  • Ju-Yong Lee;Jae-Young Lee;Jiwoo Lee;Sangmun Shin;Jun-hyuk Jang;Jun-Hee Han
    • Journal of Korean Society of Industrial and Systems Engineering
    • /
    • v.46 no.3
    • /
    • pp.186-197
    • /
    • 2023
  • In this study, we focus on improving the quality of data transmitted from a weather buoy that guides ships' routes. The buoy carries an Internet of Things (IoT) device with sensors that collect meteorological data and the buoy's status, and a wireless communication device that sends them to the central database in a ground control center and to nearby ships. The time interval of the data collected by the sensors is irregular, and fault data is often detected. This study therefore provides a framework to improve data quality using machine learning models. The normal data pattern is learned by machine learning models, and the trained models detect fault data in the sensor's collected dataset and adjust it. To determine fault data, the interquartile range (IQR) method removes values outside the outlier fences, and an NGBoost algorithm removes data above the upper bound and below the lower bound. The removed data is interpolated using an NGBoost or long short-term memory (LSTM) algorithm. The performance of the suggested process is evaluated on actual weather buoy data from Korea, improving the quality of 'AIR_TEMPERATURE' data by using other data from the same buoy. The performance of the proposed framework has been validated through computational experiments based on real-world data, confirming its suitability for practical applications.
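
The first cleaning step of the framework can be sketched in plain Python: flag values outside the IQR fences as faults, then fill each gap from the nearest valid neighbours (a simple mean stand-in for the paper's NGBoost/LSTM imputation). The temperature series is made up:

```python
def iqr_fences(values, k=1.5):
    """Lower/upper outlier fences from simple quartile estimates."""
    s = sorted(values)
    n = len(s)
    q1, q3 = s[n // 4], s[(3 * n) // 4]
    return q1 - k * (q3 - q1), q3 + k * (q3 - q1)

def clean(series):
    lo, hi = iqr_fences(series)
    out = [None if not lo <= v <= hi else v for v in series]  # flag faults
    for i, v in enumerate(out):
        if v is None:  # fill with the mean of the nearest valid neighbours
            left = next(out[j] for j in range(i - 1, -1, -1) if out[j] is not None)
            right = next(out[j] for j in range(i + 1, len(out)) if out[j] is not None)
            out[i] = (left + right) / 2
    return out

temps = [14.1, 14.3, 14.2, 99.9, 14.4, 14.5]   # 99.9 is a sensor fault
print(clean(temps))
```

In the paper's pipeline, this rule-based pass is followed by model-based detection and imputation trained on the buoy's other channels.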

Intelligent Android Malware Detection Using Radial Basis Function Networks and Permission Features

  • Abdulrahman, Ammar;Hashem, Khalid;Adnan, Gaze;Ali, Waleed
    • International Journal of Computer Science & Network Security
    • /
    • v.21 no.6
    • /
    • pp.286-293
    • /
    • 2021
  • Recently, the rapid pace of app development on the Android platform has led to an accelerated increase in malware applications created by cyber attackers. Numerous Android malware detection tools have utilized conventional signature-based approaches to detect malware apps. However, these conventional strategies cannot identify whether the latest apps are malware or not. Many new malware apps are discovered periodically, but not all of them can be accurately detected. Hence, there is a need for intelligent approaches that can detect newly developed Android malware applications. In this study, Radial Basis Function (RBF) networks are trained on known Android applications and then used to detect the latest and newest Android malware applications. First, the optimal permission features of Android apps are selected using the Information Gain Ratio (IGR). The features selected by IGR are then used to train the RBF networks to effectively detect new Android malware apps. The empirical results showed that RBF achieved the best detection accuracy (97.20%) among the common machine learning techniques compared, and accomplished the best detection results on most of the other measures.
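
The feature-selection step can be sketched directly: the Information Gain Ratio (IGR) of a binary permission feature with respect to the malware/benign label. The toy permission vectors below are made up; in the paper, the IGR-ranked permissions feed the RBF network classifier:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gain_ratio(feature, labels):
    """Information gain of a discrete feature divided by its split information."""
    n = len(labels)
    cond = 0.0
    for v in set(feature):
        subset = [y for f, y in zip(feature, labels) if f == v]
        cond += len(subset) / n * entropy(subset)
    split_info = entropy(feature)
    return (entropy(labels) - cond) / split_info if split_info else 0.0

perm = [1, 1, 1, 0, 0, 0]     # 1 = app requests this permission
labels = [1, 1, 1, 0, 0, 0]   # 1 = malware, 0 = benign
print(gain_ratio(perm, labels))  # 1.0: the permission perfectly predicts the label
```

Ranking all permissions by `gain_ratio` and keeping the top-scoring ones gives the reduced feature set used to train the network.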