• Title/Summary/Keyword: artificial intelligence-based models

Search Results: 575

Prediction Model of Real Estate ROI with the LSTM Model based on AI and Bigdata

  • Lee, Jeong-hyun;Kim, Hoo-bin;Shim, Gyo-eon
    • International journal of advanced smart convergence / v.11 no.1 / pp.19-27 / 2022
  • Across the world, 'housing' comprises a significant portion of wealth and assets. For this reason, fluctuations in real estate prices are highly sensitive issues for individual households. In Korea, housing prices have steadily increased over the years, and thus many Koreans view the real estate market as an effective channel for their investments. However, if one purchases a real estate property for the purpose of investing, there are several risks involved when prices begin to fluctuate. The purpose of this study is to design a real estate price 'return rate' prediction model to help mitigate the risks involved with real estate investments and promote reasonable real estate purchases. Various approaches are explored to develop a model capable of predicting real estate prices based on an understanding of the characteristics of the real estate market. This study employs the LSTM method, which is based on artificial intelligence and deep learning, to predict real estate prices and validate the model. LSTM networks are based on recurrent neural networks (RNN) but add cell states (which act as a type of conveyor belt) to the hidden states. LSTM networks obtain cell states and hidden states in a recursive manner. Data on the actual trading prices of apartments in autonomous districts between January 2006 and December 2019 are collected from the Actual Trading Price Disclosure System of the Ministry of Land, Infrastructure and Transport (MOLIT). Additionally, basic data on apartments and commercial buildings are collected from the Public Data Portal and the Seoul Metropolitan Government's data portal. The collected actual trading price data are scaled to monthly average trading amounts, and each data entry is pre-processed according to address to produce 168 data entries. An LSTM model for return rate prediction is prepared based on a time series dataset where the training period is set as April 2015~August 2017 (29 months), the validation period is set as September 2017~September 2018 (13 months), and the test period is set as December 2018~December 2019 (13 months). The results of the return rate prediction study are as follows. First, the model achieved a prediction similarity of almost 76%. After the time series data were collected and the final prediction model was prepared, a prediction similarity of 76% was confirmed. All in all, the results demonstrate the reliability of the LSTM-based model for return rate prediction.
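
A minimal sketch, in Keras, of the kind of LSTM return-rate predictor the abstract describes; the window size, layer widths, synthetic series, and split sizes below are illustrative assumptions, not the authors' settings.

```python
# Toy LSTM next-month predictor over a scaled monthly series (synthetic stand-in
# for the 168 pre-processed trading-amount entries mentioned above).
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

def make_windows(series, window=12):
    """Turn a 1-D monthly series into (samples, window, 1) inputs and next-step targets."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    return np.array(X)[..., np.newaxis], np.array(y)

series = np.random.rand(168).astype("float32")     # placeholder for scaled monthly amounts
X, y = make_windows(series, window=12)

# chronological train / validation / test split, mirroring the paper's time-ordered setup
n_train, n_val = 100, 28
X_train, y_train = X[:n_train], y[:n_train]
X_val, y_val = X[n_train:n_train + n_val], y[n_train:n_train + n_val]
X_test, y_test = X[n_train + n_val:], y[n_train + n_val:]

model = Sequential([
    LSTM(32, input_shape=(X.shape[1], 1)),  # recurrent layer carrying cell and hidden states
    Dense(1),                               # next-month value (return-rate proxy)
])
model.compile(optimizer="adam", loss="mse")
model.fit(X_train, y_train, validation_data=(X_val, y_val), epochs=20, verbose=0)
print("test MSE:", model.evaluate(X_test, y_test, verbose=0))
```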

Breast Cytology Diagnosis using a Hybrid Case-based Reasoning and Genetic Algorithms Approach

  • Ahn, Hyun-Chul;Kim, Kyoung-Jae
    • Proceedings of the Korea Intelligent Information System Society Conference / 2007.05a / pp.389-398 / 2007
  • Case-based reasoning (CBR) is one of the most popular prediction techniques for medical diagnosis because it is easy to apply, has no possibility of overfitting, and provides a good explanation for the output. However, it has a critical limitation - its prediction performance is generally lower than that of other artificial intelligence techniques like artificial neural networks (ANNs). In order to obtain accurate results from CBR, effective retrieval and matching of useful prior cases for the problem is essential, but designing a good matching and retrieval mechanism for CBR systems is still a controversial issue. In this study, we propose a novel approach to enhance the prediction performance of CBR. Our suggestion is the simultaneous optimization of feature weights, instance selection, and the number of neighbors to combine, using genetic algorithms (GAs). Our model improves prediction performance in three ways - (1) measuring similarity between cases more accurately by considering the relative importance of each feature, (2) eliminating redundant or erroneous reference cases, and (3) combining several similar cases to represent significant patterns. To validate the usefulness of our model, this study applied it to a real-world case for evaluating cytological features derived directly from a digital scan of breast fine needle aspirate (FNA) slides. Experimental results showed that the prediction accuracy of conventional CBR may be improved significantly by using our model. We also found that our proposed model outperformed all the other GA-optimized CBR models.
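
A small sketch of the core idea above: feature weights for a k-nearest-neighbour style case retriever tuned by a toy genetic algorithm. It uses scikit-learn's breast-cancer (FNA) dataset for flavour; the GA below is a deliberately simplified stand-in for the paper's simultaneous optimization of weights, instance selection, and neighbour count.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import MinMaxScaler

X, y = load_breast_cancer(return_X_y=True)
X = MinMaxScaler().fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

def fitness(w, k=3):
    """Retrieval accuracy of k-NN when each feature is rescaled by its weight."""
    clf = KNeighborsClassifier(n_neighbors=k).fit(X_tr * w, y_tr)
    return clf.score(X_te * w, y_te)   # toy: a real study would use a separate validation fold

rng = np.random.default_rng(0)
pop = rng.random((20, X.shape[1]))              # 20 candidate feature-weight vectors
for _ in range(30):                             # a few GA generations
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-10:]]     # selection: keep the fitter half
    children = (parents[rng.integers(10, size=10)] +
                parents[rng.integers(10, size=10)]) / 2.0        # crossover
    children += rng.normal(0, 0.05, children.shape)              # mutation
    pop = np.vstack([parents, np.clip(children, 0, 1)])

best = pop[np.argmax([fitness(w) for w in pop])]
print("weighted k-NN accuracy:", round(fitness(best), 3))
```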


Radiation Prediction Based on Multi Deep Learning Model Using Weather Data and Weather Satellites Image (기상 데이터와 기상 위성 영상을 이용한 다중 딥러닝 모델 기반 일사량 예측)

  • Jae-Jung Kim;Yong-Hun You;Chang-Bok Kim
    • Journal of Advanced Navigation Technology / v.25 no.6 / pp.569-575 / 2021
  • Deep learning shows differences in prediction performance depending on data quality and model. This study uses various input data and multiple deep learning models to build an optimal deep learning model for predicting solar radiation, which has the greatest influence on power generation prediction. As input data, weather data from the Korea Meteorological Administration and meteorological satellite imagery from the Korea Meteorological Administration, segmented into regional images, were used. Individual deep learning models were built and comparatively evaluated, and solar radiation was then predicted by constructing multiple deep learning models that connect the models with the best error rates. As experimental results, the RMSE of model A, a multiple deep learning model, was 0.0637; the RMSE of model B was 0.07062; and the RMSE of model C was 0.06052, so the error rates of models A and C were better than that of a single model. In this study, the models that connected two or more models showed improved prediction rates and stable learning results.
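
A hedged sketch of a two-branch ("multiple") model joining tabular weather inputs with a segmented satellite image patch; the feature count, patch size, and layer sizes are placeholders rather than the paper's architecture.

```python
import numpy as np
from tensorflow.keras import layers, models

weather_in = layers.Input(shape=(8,), name="weather")         # e.g. temperature, humidity, ...
image_in = layers.Input(shape=(64, 64, 1), name="satellite")  # segmented satellite patch

w = layers.Dense(32, activation="relu")(weather_in)           # tabular branch

x = layers.Conv2D(16, 3, activation="relu")(image_in)         # image branch
x = layers.MaxPooling2D()(x)
x = layers.Conv2D(32, 3, activation="relu")(x)
x = layers.GlobalAveragePooling2D()(x)

merged = layers.concatenate([w, x])                           # connect the sub-models
out = layers.Dense(1, name="radiation")(merged)               # predicted solar radiation

model = models.Model([weather_in, image_in], out)
model.compile(optimizer="adam", loss="mse")

# dummy arrays just to show the expected input shapes
Xw = np.random.rand(100, 8).astype("float32")
Xi = np.random.rand(100, 64, 64, 1).astype("float32")
y = np.random.rand(100, 1).astype("float32")
model.fit([Xw, Xi], y, epochs=2, verbose=0)
```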

Hybrid machine learning with moth-flame optimization methods for strength prediction of CFDST columns under compression

  • Quang-Viet Vu;Dai-Nhan Le;Thai-Hoan Pham;Wei Gao;Sawekchai Tangaramvong
    • Steel and Composite Structures / v.51 no.6 / pp.679-695 / 2024
  • This paper presents a novel technique that combines machine learning (ML) with moth-flame optimization (MFO) methods to predict the axial compressive strength (ACS) of concrete-filled double-skin steel tube (CFDST) columns. The proposed model is trained and tested with a dataset containing 125 tests of CFDST columns subjected to compressive loading. Five ML models, including extreme gradient boosting (XGBoost), gradient tree boosting (GBT), categorical gradient boosting (CAT), support vector machines (SVM), and decision tree (DT) algorithms, are utilized in this work. The MFO algorithm is applied to find optimal hyperparameters of these ML models and to determine the most effective model for predicting the ACS of CFDST columns. Predictive results given by several performance metrics reveal that the MFO-CAT model provides superior accuracy compared to the other considered models. The accuracy of the MFO-CAT model is validated by comparing its predictive results with existing design codes and formulae. Moreover, the significance and contribution of each feature in the dataset are examined by employing the SHapley Additive exPlanations (SHAP) method. A comprehensive uncertainty quantification of the probabilistic characteristics of the ACS of CFDST columns is conducted for the first time to examine the models' responses to variations of the input variables in stochastic environments. Finally, a web-based application is developed to predict the ACS of CFDST columns, enabling rapid practical use without requiring any programming or machine learning expertise.
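
A rough sketch of metaheuristic hyperparameter tuning for a boosting-based strength predictor. A simple population search that drifts candidates toward the best solution stands in for the moth-flame optimizer, scikit-learn's gradient boosting stands in for CAT, and the 125-row dataset is synthetic.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.random((125, 6))                              # 125 tests, 6 illustrative geometric/material features
y = X @ rng.random(6) + rng.normal(0, 0.05, 125)      # synthetic axial compressive strength

def score(params):
    """Negative MSE under 5-fold CV for a candidate (n_estimators, learning_rate, max_depth)."""
    n_est, lr, depth = int(params[0]), float(params[1]), int(params[2])
    model = GradientBoostingRegressor(n_estimators=n_est, learning_rate=lr, max_depth=depth)
    return cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error").mean()

lo, hi = np.array([50, 0.01, 2]), np.array([300, 0.3, 6])   # search bounds
pop = lo + rng.random((8, 3)) * (hi - lo)                   # 8 candidate hyperparameter sets
best, best_score = None, -np.inf
for _ in range(10):                                         # a few search iterations
    for p in pop:
        s = score(p)
        if s > best_score:
            best, best_score = p.copy(), s
    # pull candidates toward the best-so-far with some exploration noise
    pop = np.clip(pop + 0.3 * (best - pop) + rng.normal(0, 0.05, pop.shape) * (hi - lo), lo, hi)

print("best (n_estimators, learning_rate, max_depth):", best)
```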

Hybrid machine learning with HHO method for estimating ultimate shear strength of both rectangular and circular RC columns

  • Quang-Viet Vu;Van-Thanh Pham;Dai-Nhan Le;Zhengyi Kong;George Papazafeiropoulos;Viet-Ngoc Pham
    • Steel and Composite Structures / v.52 no.2 / pp.145-163 / 2024
  • This paper presents six novel hybrid machine learning (ML) models that combine support vector machines (SVM), decision tree (DT), random forest (RF), gradient boosting (GB), extreme gradient boosting (XGB), and categorical gradient boosting (CGB) with the Harris Hawks Optimization (HHO) algorithm. These models, namely HHO-SVM, HHO-DT, HHO-RF, HHO-GB, HHO-XGB, and HHO-CGB, are designed to predict the ultimate shear strength of both rectangular and circular reinforced concrete (RC) columns. The prediction models are established using a comprehensive database consisting of 325 experimental data for rectangular columns and 172 experimental data for circular columns. The ML model hyperparameters are optimized through a combination of cross-validation and the HHO algorithm. The performance of the hybrid ML models is evaluated and compared using various metrics, ultimately identifying the HHO-CGB model as the top-performing model for predicting the ultimate shear strength of both rectangular and circular RC columns. The mean R-value and mean a20-index are relatively high, reaching 0.991 and 0.959, respectively, while the mean absolute error and root mean square error are low (10.302 kN and 27.954 kN, respectively). Another comparison is conducted with four existing formulae to further validate the efficiency of the proposed HHO-CGB model. The Shapley Additive Explanations (SHAP) method is applied to analyze the contribution of each variable to the output within the HHO-CGB model, providing insights into the local and global influence of the variables. The analysis reveals that the depth of the column, the length of the column, and the axial loading exert the most significant influence on the ultimate shear strength of RC columns. A user-friendly graphical interface tool is then developed based on the HHO-CGB model to facilitate practical and cost-effective usage.
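
A brief sketch of the SHAP contribution analysis mentioned above, applied to a generic boosting model; the feature names and the 172-row dataset are assumptions for illustration, and the `shap` package is required.

```python
import numpy as np
import shap
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)
features = ["depth", "length", "axial_load", "fc", "fy", "rho_s"]       # illustrative variable names
X = rng.random((172, len(features)))
y = 2 * X[:, 0] + X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 0.05, 172)    # synthetic shear strength

model = GradientBoostingRegressor().fit(X, y)
explainer = shap.TreeExplainer(model)            # SHAP values for tree ensembles
shap_values = explainer.shap_values(X)           # one contribution per sample and feature

# global importance: mean absolute SHAP value per input variable
for name, imp in zip(features, np.abs(shap_values).mean(axis=0)):
    print(f"{name}: {imp:.3f}")
```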

A New Residual Attention Network based on Attention Models for Human Action Recognition in Video

  • Kim, Jee-Hyun;Cho, Young-Im
    • Journal of the Korea Society of Computer and Information / v.25 no.1 / pp.55-61 / 2020
  • With the development of deep learning technology and advances in computing power, video-based research is gaining more and more attention. Video data contain a large amount of temporal and spatial information, which is the biggest difference compared with image data, as well as a much larger data volume, and have therefore attracted intense attention in computer vision. Among video tasks, action recognition is one of the main research focuses; however, recognizing human actions in video is an extremely complex and challenging subject. Based on extensive research on human cognition, attention mechanisms have been found to be an efficient model of cognition, well suited to processing image information and complex, continuous video information. We introduce this attention mechanism into video action recognition, attending to human actions in video and effectively improving recognition efficiency. In this paper, we propose a new 3D residual attention network using convolutional neural networks based on two attention models to identify human actions in video. An evaluation of our model showed up to 90.7% accuracy.
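
A rough sketch, in Keras, of a residual block whose 3-D convolutional trunk is modulated by a learned soft attention mask over a video clip; this is a generic single-attention illustration, not the paper's exact two-attention-model network, and the clip shape and class count are assumptions.

```python
from tensorflow.keras import layers, models

def residual_attention_block(x, filters):
    """Residual 3-D conv block whose trunk output is weighted by a sigmoid attention mask."""
    trunk = layers.Conv3D(filters, 3, padding="same", activation="relu")(x)
    trunk = layers.Conv3D(filters, 3, padding="same")(trunk)
    mask = layers.Conv3D(filters, 1, padding="same", activation="sigmoid")(x)   # attention in [0, 1]
    attended = layers.multiply([trunk, mask])
    shortcut = layers.Conv3D(filters, 1, padding="same")(x)                     # match channel count
    return layers.Activation("relu")(layers.add([attended, shortcut]))

clip_in = layers.Input(shape=(16, 112, 112, 3))      # 16-frame RGB clip (assumed size)
x = residual_attention_block(clip_in, 32)
x = layers.GlobalAveragePooling3D()(x)
out = layers.Dense(101, activation="softmax")(x)     # e.g. a 101-class action label space
model = models.Model(clip_in, out)
model.summary()
```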

Factors affecting success and failure of Internet company business model using inductive learning based on ID3 algorithm (ID3 알고리즘 기반의 귀납적 추론을 활용한 인터넷 기업 비즈니스 모델의 성공과 실패에 영향을 미치는 요인에 관한 연구)

  • Jin, Dong-su
    • Journal of the Korea Institute of Information and Communication Engineering / v.23 no.2 / pp.111-116 / 2019
  • New technologies such as the IoT, big data, and artificial intelligence, building on the Web, mobile, and smart devices, enable new business models that did not exist before, and various types of Internet companies based on these business models have emerged. In this research, we examine the factors that influence the success and failure of Internet companies. To do this, we review recent studies on business models and examine the variables affecting the success of Internet companies in terms of network effects, user interface, cooperation with actors, and creating value for users. Using the five derived variables, we select 14 Internet companies that succeeded or failed in seven commercial business model categories. We derive a decision tree by applying inductive learning based on the ID3 algorithm to the analysis results, and derive rules that affect success and failure from the resulting decision tree. With these rules, we aim to present strategic implications for actors seeking to succeed with Internet companies.
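
A minimal sketch of entropy-based (ID3-style) rule induction with scikit-learn. The five binary factor variables and the 14-company table below are placeholders (the abstract names only four of the five variables), so the printed rules are purely illustrative.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

features = ["network_effect", "user_interface", "actor_cooperation",
            "user_value", "factor_5"]                  # assumed variable names
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(14, 5))                   # 14 companies, binary factor scores
y = (X[:, 0] & X[:, 3]).astype(int)                    # toy success/failure label

tree = DecisionTreeClassifier(criterion="entropy")     # information-gain splits, as in ID3
tree.fit(X, y)
print(export_text(tree, feature_names=features))       # human-readable success/failure rules
```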

A Multilayer Perceptron-Based Electric Load Forecasting Scheme via Effective Recovering Missing Data (효과적인 결측치 보완을 통한 다층 퍼셉트론 기반의 전력수요 예측 기법)

  • Moon, Jihoon;Park, Sungwoo;Hwang, Eenjun
    • KIPS Transactions on Software and Data Engineering / v.8 no.2 / pp.67-78 / 2019
  • Accurate electric load forecasting is very important for the efficient operation of the smart grid. Recently, due to the development of IT technology, many studies have constructed accurate forecasting models based on big data processing using artificial intelligence techniques. These forecasting models usually utilize external factors such as temperature, humidity, and the historical electric load as independent variables. However, due to diverse internal and external factors, the historical electric load contains many missing values, which makes it very difficult to construct an accurate forecasting model. To solve this problem, in this paper, we propose a random forest-based missing data recovery scheme and construct an electric load forecasting model based on a multilayer perceptron using the estimated values of the missing data and external factors. We demonstrate the performance of our proposed scheme via various experiments.
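
A hedged sketch of the two-stage scheme described above: a random forest recovers missing historical load values from external factors, and a multilayer perceptron then forecasts load; the synthetic data and the choice of lag feature are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 500
temp = rng.uniform(-5, 35, n)                                   # external factor: temperature
humid = rng.uniform(20, 90, n)                                  # external factor: humidity
load = 100 + 2 * np.abs(temp - 18) + 0.3 * humid + rng.normal(0, 3, n)

load_obs = load.copy()
missing = rng.random(n) < 0.1                                   # ~10% missing historical load
load_obs[missing] = np.nan

# stage 1: random-forest recovery of the missing load values
rf = RandomForestRegressor(n_estimators=100, random_state=0)
rf.fit(np.c_[temp, humid][~missing], load_obs[~missing])
load_obs[missing] = rf.predict(np.c_[temp, humid][missing])

# stage 2: MLP forecast from external factors plus the recovered previous-step load
X = np.c_[temp, humid, np.roll(load_obs, 1)][1:]
y = load[1:]
mlp = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=1000, random_state=0).fit(X, y)
print("training R^2:", round(mlp.score(X, y), 3))
```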

Numerical evaluation of gamma radiation monitoring

  • Rezaei, Mohsen;Ashoor, Mansour;Sarkhosh, Leila
    • Nuclear Engineering and Technology / v.51 no.3 / pp.807-817 / 2019
  • Airborne Gamma Ray Spectrometry (AGRS), with important applications such as gathering radiation information about the ground surface, geochemical measurement of the abundance of potassium, thorium, and uranium in the outer earth layer, and environmental and nuclear site surveillance, has a key role in nuclear science and human life. The Broyden-Fletcher-Goldfarb-Shanno (BFGS) method, an advanced numerical algorithm for unconstrained nonlinear optimization, in combination with Artificial Neural Networks (ANNs) provides a noteworthy opportunity for modern AGRS. In this study a new AGRS system empowered by ANN-BFGS has been proposed and evaluated on available empirical AGRS data. To that effect, different architectures of adaptive ANN-BFGS were implemented for a set of published experimental AGRS outputs. Among various training methods, the selected approach, with its low iteration cost and non-diagonal scaling allocation, is a powerful algorithm for AGRS data due to its inherent stochastic properties. Experiments were performed with different architectures and training schemes; the selected scheme achieved the smallest number of epochs, the minimum Mean Square Error (MSE), and the maximum performance compared with different types of optimization strategies and algorithms. The proposed method can be implemented on cost-effective, minimal electronic equipment for real-time processing, which allows it to be used on board a light Unmanned Aerial Vehicle (UAV). The advanced adaptation properties and models of the neural network, the training of the stochastic process, and its implementation on a DSP yield an affordable, reliable, and low-cost AGRS design. The main outcome of the study shows that this method increases the quality of the curvature information of AGRS data while the cost of the algorithm is reduced in each iteration, so the proposed ANN-BFGS is a trustworthy and appropriate model for gamma-ray data reconstruction and analysis based on advanced artificial intelligence systems.
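
A small sketch of an ANN trained with a BFGS-family (quasi-Newton) optimizer, in the spirit of the ANN-BFGS scheme above; scikit-learn's MLP exposes L-BFGS as a solver, and the 64-channel "spectra" and abundance target below are synthetic placeholders.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
spectra = rng.random((300, 64))                      # synthetic 64-channel gamma spectra
targets = spectra[:, 10:20].sum(axis=1)              # toy stand-in for a K/Th/U abundance value

X_tr, X_te, y_tr, y_te = train_test_split(spectra, targets, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(32, 16),
                   solver="lbfgs",                   # quasi-Newton (BFGS-family) training
                   max_iter=500, random_state=0)
ann.fit(X_tr, y_tr)
print("test MSE:", round(float(np.mean((ann.predict(X_te) - y_te) ** 2)), 4))
```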

Application of ANFIS for Prediction of Daily Water Supply (상수도 1일 급수량 예측을 위한 ANFIS적용)

  • Rhee, Kyoung-Hoon;Kang, Il-Hwan;Moon, Byoung-Seok
    • Journal of Korean Society of Water and Wastewater / v.14 no.3 / pp.281-290 / 2000
  • This study investigates the prediction of daily water supply, which is necessary for the efficient management of a water distribution system. ANFIS is an artificial-intelligence technique: a neural network into which fuzzy information is input and then processed. In this study, daily water supply was predicted through an application of an adaptive network-based fuzzy inference system (ANFIS). Methods for predicting water supply were investigated based on data on the amount of water supplied in Kwangju city. For variable selection, four analyses of the input data were conducted: correlation analysis, autocorrelation analysis, partial autocorrelation analysis, and cross-correlation analysis. The input variables were (a) the amount of water supply, (b) the mean temperature, and (c) the population of the area supplied with water. The variables were combined in an integrated model. A model using only daily water supply data was also built, and its validity was verified for cases in which weather forecasts are not reliable. The proposed models cover exceptional cases such as a suspension of water supply. The maximum error rate between the model estimates and the actual measurements was 18.46%, and the average error was lower than 2.36%. The model is expected to support real-time estimation for the operational control of waterworks and water/drain pipes.
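
A short sketch of the variable-screening step described above, running correlation, autocorrelation, partial-autocorrelation, and cross-correlation analyses on synthetic daily supply and temperature series; it assumes the `statsmodels` package and placeholder data rather than the Kwangju records.

```python
import numpy as np
from statsmodels.tsa.stattools import acf, pacf, ccf

rng = np.random.default_rng(0)
days = np.arange(365)
temperature = 15 + 10 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 1, 365)
supply = 200 + 3 * temperature + rng.normal(0, 5, 365)           # synthetic daily water supply

print("corr(supply, temperature):", np.corrcoef(supply, temperature)[0, 1].round(3))
print("supply ACF, lags 1-3:", acf(supply, nlags=3)[1:].round(3))        # autocorrelation
print("supply PACF, lags 1-3:", pacf(supply, nlags=3)[1:].round(3))      # partial autocorrelation
print("supply x temperature CCF, lag 0:", ccf(supply, temperature)[0].round(3))  # cross-correlation
```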
