• Title/Summary/Keyword: artificial intelligence algorithm


[Review] Prediction of Cervical Cancer Risk from Taking Hormone Contraceptives

  • Su jeong RU;Kyung-A KIM;Myung-Ae CHUNG;Min Soo KANG
    • Korean Journal of Artificial Intelligence / v.12 no.1 / pp.25-29 / 2024
  • In this study, research was conducted to predict the probability of cervical cancer occurrence associated with the use of hormonal contraceptives. Cervical cancer is influenced by various environmental factors; however, the human papillomavirus (HPV) is detected in 99% of cases, making it the primary attributed cause. Additionally, although cervical cancer ranks 10th in overall female cancer incidence, it is nearly 100% preventable among known cancers. Early-stage cervical cancer typically presents no symptoms but can be detected early through regular screening. Therefore, routine tests, including cytology, should be conducted annually, as early detection significantly improves the chances of successful treatment. We therefore employed artificial intelligence technology to forecast the likelihood of developing cervical cancer, using the logistic regression algorithm, a predictive model, through Microsoft Azure. The classification model yielded an accuracy of 80.8%, a precision of 80.2%, a recall of 99.0%, and an F1 score of 88.6%. These results indicate that the use of hormonal contraceptives is associated with an increased risk of cervical cancer. Further development of the artificial intelligence program studied here holds promise for reducing mortality attributable to cervical cancer.
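
As a rough illustration of the setup this entry describes, the sketch below trains a binary logistic regression and reports the same four metrics, using scikit-learn as a stand-in for Microsoft Azure ML Studio; the dataset file and column names are hypothetical.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

df = pd.read_csv("cervical_cancer_risk.csv")   # hypothetical file of risk-factor records
X = df.drop(columns=["cancer"])                # features incl. hormonal contraceptive use
y = df["cancer"]                               # 1 = cervical cancer diagnosed
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
pred = model.predict(X_te)
for name, score in [("accuracy", accuracy_score), ("precision", precision_score),
                    ("recall", recall_score), ("F1", f1_score)]:
    print(name, round(score(y_te, pred), 3))
```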

A Methodology for Bankruptcy Prediction in Imbalanced Datasets using eXplainable AI (데이터 불균형을 고려한 설명 가능한 인공지능 기반 기업부도예측 방법론 연구)

  • Heo, Sun-Woo;Baek, Dong Hyun
    • Journal of Korean Society of Industrial and Systems Engineering / v.45 no.2 / pp.65-76 / 2022
  • Recently, not only traditional statistical techniques but also machine learning algorithms have been used to make more accurate bankruptcy predictions. However, the insolvency rate of companies dealing with financial institutions is very low, resulting in a data imbalance problem. Since data imbalance negatively affects the performance of artificial intelligence models, it must be addressed before training. In addition, as artificial intelligence algorithms are increasingly used for consequential decision-making, regulatory pressure to secure the transparency of AI models is growing, such as mandates to provide explanation functions for them. Therefore, this study presents guidelines for an eXplainable Artificial Intelligence-based corporate bankruptcy prediction methodology that applies the SMOTE technique and the LIME algorithm to solve the data imbalance and model transparency problems in predicting corporate bankruptcy. The implications of this study are as follows. First, it was confirmed that SMOTE can effectively solve the data imbalance issue, a problem that can be easily overlooked in predicting corporate bankruptcy. Second, through the LIME algorithm, the basis for the machine learning model's bankruptcy predictions was visualized, and improvement priorities were derived for the financial variables that increase a company's possibility of bankruptcy. Third, the scope of application of the algorithms in future research was expanded by confirming, through case application, the feasibility of using SMOTE and LIME.
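
A minimal sketch of the SMOTE-then-LIME pipeline the abstract outlines, assuming the imbalanced-learn and lime packages; the dataset, feature names, and the random forest base model are illustrative choices, not the authors'.

```python
import pandas as pd
from imblearn.over_sampling import SMOTE
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from lime.lime_tabular import LimeTabularExplainer

df = pd.read_csv("bankruptcy.csv")                      # hypothetical financial dataset
feature_names = [c for c in df.columns if c != "bankrupt"]
X, y = df[feature_names].values, df["bankrupt"].values
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# 1) Oversample the rare bankrupt class with SMOTE before fitting the model.
X_res, y_res = SMOTE(random_state=0).fit_resample(X_train, y_train)
model = RandomForestClassifier(random_state=0).fit(X_res, y_res)

# 2) Explain one prediction with LIME to surface the financial variables that
#    push the model toward a bankruptcy verdict.
explainer = LimeTabularExplainer(X_res, feature_names=feature_names,
                                 class_names=["solvent", "bankrupt"],
                                 mode="classification")
print(explainer.explain_instance(X_test[0], model.predict_proba,
                                 num_features=5).as_list())
```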

A Novel Approach to COVID-19 Diagnosis Based on Mel Spectrogram Features and Artificial Intelligence Techniques

  • Alfaidi, Aseel;Alshahrani, Abdullah;Aljohani, Maha
    • International Journal of Computer Science & Network Security / v.22 no.9 / pp.195-207 / 2022
  • COVID-19 has remained one of the most serious health crises in recent history, resulting in the tragic loss of lives and significant economic impacts on the entire world. The difficulty of controlling COVID-19 poses a threat to the global health sector. Considering that Artificial Intelligence (AI) has contributed to improving research methods and solving problems facing diverse fields of study, AI algorithms have also proven effective in disease detection and early diagnosis. Specifically, acoustic features offer a promising prospect for the early detection of respiratory diseases. Motivated by these observations, this study conceptualized a speech-based diagnostic model to aid in COVID-19 diagnosis. The proposed methodology uses speech signals from confirmed positive and negative cases of COVID-19, extracting features through the pre-trained Visual Geometry Group (VGG-16) model applied to Mel spectrogram images. The K-means algorithm then selects the effective features, and a Genetic Algorithm-Support Vector Machine (GA-SVM) classifier classifies the cases. The experimental findings indicate the proposed methodology's capability to separate COVID-19 from non-COVID-19 cases across speakers of varying ages and different languages, as demonstrated in the simulations. Because the methodology relies on deep features followed by dimension reduction, it produces better and more consistent performance than the handcrafted features used in previous studies.
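
The feature-extraction stage, Mel spectrogram images pushed through a pre-trained VGG-16, might look roughly like the sketch below (librosa and Keras are assumed stand-ins, and all preprocessing parameters are guesses); the K-means feature selection and GA-SVM classification stages are omitted.

```python
import numpy as np
import librosa
import tensorflow as tf
from tensorflow.keras.applications.vgg16 import VGG16, preprocess_input

def vgg16_features(wav_path, sr=16000):
    y, _ = librosa.load(wav_path, sr=sr)
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=128)
    mel_db = librosa.power_to_db(mel, ref=np.max)        # log-scaled Mel spectrogram

    # Scale to 0-255, replicate to 3 channels, resize to VGG-16's 224x224 input.
    img = 255 * (mel_db - mel_db.min()) / (mel_db.max() - mel_db.min())
    img = np.stack([img] * 3, axis=-1).astype("float32")
    img = tf.image.resize(img[np.newaxis, ...], (224, 224))

    base = VGG16(weights="imagenet", include_top=False, pooling="avg")
    return base.predict(preprocess_input(img.numpy()))   # one 512-dim deep feature vector
```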

A Study on Algorithm Selection and Comparison for Improving the Performance of an Artificial Intelligence Product Recognition Automatic Payment System

  • Kim, Heeyoung;Kim, Dongmin;Ryu, Gihwan;Hong, Hotak
    • International Journal of Advanced Culture Technology / v.10 no.1 / pp.230-235 / 2022
  • This study selects an optimal object detection algorithm for designing a self-checkout counter, addressing the inconvenience of payment systems for products without barcodes. To this end, a performance comparison of deep learning-based object detection algorithms, YOLO v2, Tiny YOLO v2, and the more recent YOLO v5, was performed. The comparison used training data built around 'donut' items from a bakery store, and YOLO v5 achieved the highest performance with an mAP of 96.9%. Therefore, YOLO v5 was selected as the artificial intelligence object detection algorithm applied in this paper. When an optimal confidence threshold was set for each donut class, the precision and recall for all donut classes exceeded 0.85, and the majority showed excellent recognition performance of 0.90 or more. We expect these results to serve as fundamental data for developing an automatic payment system based on AI self-service technology, which is highly usable in the non-face-to-face era.
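
For orientation, a minimal sketch of running a custom-trained YOLO v5 detector with a confidence threshold, via the public ultralytics/yolov5 torch.hub entry point; the weights file and image path are hypothetical.

```python
import torch

# Load custom-trained weights through the public ultralytics/yolov5 hub entry.
model = torch.hub.load("ultralytics/yolov5", "custom", path="donut_yolov5.pt")
model.conf = 0.85                          # confidence threshold (illustrative value)

results = model("checkout_camera.jpg")     # hypothetical image from the counter camera
for *box, conf, cls in results.xyxy[0].tolist():
    print(model.names[int(cls)], round(conf, 2), box)   # item class, score, bounding box
```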

A customer credit Prediction Researched to Improve Credit Stability based on Artificial Intelligence

  • MUN, Ji-Hui;JUNG, Sang Woo
    • Korean Journal of Artificial Intelligence / v.9 no.1 / pp.21-27 / 2021
  • Since the 1990s, Korea's credit card industry has developed steadily. Alongside this growth, various problems have arisen, such as careless customer information management and loans to low-credit customers, which led to a high delinquency rate across the card industry and a negative impact on the economy. Therefore, in this paper, based on Azure, we analyze and predict credit-loan delinquency and delinquency periods according to gender, car ownership, property, number of children, education level, marital status, and employment status, using linear regression analysis and a boosted decision tree algorithm. These predictions can reduce the likelihood of reckless credit lending and credit card issuance, thereby reducing the number of bad creditors and the risk borne by banks. In addition, after classifying and segmenting the customer base based on the predicted results, the findings can serve as a basis for reducing credit-loan risk by developing credit products suitable for each customer. The results predicted through Azure showed that, between the Linear Regression and Boosted Decision Tree algorithms, the Boosted Decision Tree made the more accurate predictions. In future work, we intend to increase the accuracy of the analysis by encoding each data field numerically and repeating the prediction.
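
A hedged sketch of the two-model comparison, using scikit-learn in place of Azure ML Studio: logistic regression stands in for the linear module (the target here is a binary delinquency label) and gradient boosting for the Boosted Decision Tree. Column names are hypothetical.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

df = pd.read_csv("credit_customers.csv")    # hypothetical: gender, own_car, property, ...
X = pd.get_dummies(df.drop(columns=["delinquent"]))  # one-hot encode categorical fields
y = df["delinquent"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for name, clf in [("linear model", LogisticRegression(max_iter=1000)),
                  ("boosted decision tree", GradientBoostingClassifier())]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```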

Developing an Artificial Intelligence Algorithm to Predict the Timing of Dialysis Vascular Surgery (투석혈관 수술시기 예측을 위한 인공지능 알고리즘 개발)

  • Kim Dohyoung;Kim Hyunsuk;Lee Sunpyo;Oh Injong;Park Seungbum
    • Journal of Korea Society of Digital Industry and Information Management / v.19 no.4 / pp.97-115 / 2023
  • In South Korea, chronic kidney disease (CKD) impacts around 4.6 million adults, leading to a high reliance on hemodialysis. For effective dialysis, vascular access is crucial, with decisions about vascular surgeries often made during dialysis sessions. Anticipating these needs could improve dialysis quality and patient comfort. This study investigates the use of Artificial Intelligence (AI) to predict the timing of surgeries for dialysis vessels, an area not extensively researched. We developed an AI algorithm using predictive maintenance methods, transitioning from machine learning to a more advanced deep learning approach with Long Short-Term Memory (LSTM) models. The algorithm processes variables such as venous pressure, blood flow, and patient age, demonstrating high effectiveness with evaluation metrics exceeding 0.91. By shortening the data collection intervals, a more refined model can be obtained. Implementing this AI in clinical practice could notably enhance the patient experience and the quality of medical services in dialysis, marking a significant advancement in the treatment of CKD.
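
A minimal Keras sketch of an LSTM of the kind described, classifying whether surgery will soon be needed from a window of dialysis-session measurements; the window size, architecture, and placeholder data are assumptions, not the authors' published configuration.

```python
import numpy as np
from tensorflow.keras import Sequential
from tensorflow.keras.layers import LSTM, Dense

WINDOW, FEATURES = 12, 3     # assumed: 12 sessions x (venous pressure, blood flow, age)

model = Sequential([
    LSTM(32, input_shape=(WINDOW, FEATURES)),
    Dense(1, activation="sigmoid"),          # P(vascular surgery needed soon)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["AUC"])

X = np.random.rand(100, WINDOW, FEATURES)    # placeholder sequences, for shape-checking only
y = np.random.randint(0, 2, size=(100, 1))
model.fit(X, y, epochs=2, verbose=0)
```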

Time-Series Estimation based AI Algorithm for Energy Management in a Virtual Power Plant System

  • Yeonwoo LEE
    • Korean Journal of Artificial Intelligence / v.12 no.1 / pp.17-24 / 2024
  • This paper introduces a novel approach to time-series estimation for energy load forecasting within Virtual Power Plant (VPP) systems, leveraging two advanced artificial intelligence (AI) algorithms: Long Short-Term Memory (LSTM) and the Seasonal Autoregressive Integrated Moving Average (SARIMA). Virtual power plants, which integrate diverse microgrids managed by Energy Management Systems (EMS), require precise forecasting techniques to balance energy supply and demand efficiently. The proposed hybrid forecasting model combines a parametric statistical technique with an AI algorithm. The LSTM component is employed to discern pattern correlations over fixed intervals, which is crucial for predicting future energy loads accurately. SARIMA is applied to generate time-series forecasts that account for non-stationary and seasonal variations. The forecasting model incorporates a broad spectrum of distributed energy resources, including renewable energy sources and conventional power plants. Data spanning a decade, sourced from the Korea Power Exchange (KPX) Electrical Power Statistical Information System (EPSIS), were used to validate the model. The proposed hybrid LSTM-SARIMA model with parameter sets (1, 1, 1, 12) and (2, 1, 1, 12) demonstrated high fidelity to the observed data. The optimized system notably surpasses traditional forecasting methods, indicating that this model offers a viable solution for EMS to enhance short-term load forecasting.
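
The SARIMA component can be sketched with statsmodels as below; reading the reported parameter sets (1, 1, 1, 12) and (2, 1, 1, 12) as seasonal orders, and the non-seasonal order (1, 1, 1), are assumptions, as is the data file name.

```python
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# `load` stands in for the monthly KPX/EPSIS load series used in the paper.
load = pd.read_csv("epsis_monthly_load.csv", index_col=0, parse_dates=True)["load"]

for seasonal in [(1, 1, 1, 12), (2, 1, 1, 12)]:
    fit = SARIMAX(load, order=(1, 1, 1), seasonal_order=seasonal).fit(disp=False)
    print("seasonal order", seasonal, "-> AIC:", round(fit.aic, 1))

print(fit.get_forecast(steps=12).predicted_mean)   # next 12 months from the last fit
```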

Research on detecting moving targets with an improved Kalman filter algorithm

  • Jia quan Zhou;Wei Wei
    • KSII Transactions on Internet and Information Systems (TIIS) / v.17 no.9 / pp.2348-2360 / 2023
  • As science and technology evolve, detection of moving objects has been widely used in the context of machine learning and artificial intelligence. Traditional moving object detection algorithms, however, suffer from relatively poor real-time performance and low detection accuracy. To tackle this issue, this manuscript proposes a modified Kalman filter algorithm: when the nonlinear system is close to linear form, the system equations are first expanded with a Taylor series, the second-order and higher terms are discarded, and the standard Kalman filter is then applied to estimate the state of the system. The resulting algorithm not only detects moving objects accurately but also achieves better real-time performance and can be employed to predict the trajectories of moving objects. The accuracy and real-time performance of the algorithm were verified experimentally.
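
What the abstract describes is, in effect, the extended-Kalman-filter recipe: keep only the first-order Taylor term (the Jacobian) of the nonlinear model and run the standard predict/update cycle with it. A generic single-step sketch, with placeholder model functions:

```python
import numpy as np

def ekf_step(x, P, z, f, F_jac, h, H_jac, Q, R):
    """One predict/update cycle; f, h are the nonlinear models, F_jac, H_jac
    their Jacobians (the retained first-order Taylor terms)."""
    # Predict: propagate the state through the nonlinear model, the covariance
    # through its linearization.
    x_pred = f(x)
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q

    # Update: standard Kalman correction using the measurement Jacobian.
    H = H_jac(x_pred)
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```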

Artificial Intelligence Application using Nutcracker Optimization Algorithm to Enhance Efficiency & Reliability of Power Systems via Optimal Setting and Sizing of Renewable Energy Sources as Distributed Generations in Radial Distribution Systems

  • Nawaf A. AlZahrani;Mohammad Hamza Awedh;Ali M. Rushdi
    • International Journal of Computer Science & Network Security / v.24 no.1 / pp.31-44 / 2024
  • Energy consumption has risen steadily in recent years, and several research studies have sought to develop sustainable sources that produce clean energy to fulfill our requirements. Using renewable energy sources helps to decrease the harm to the environment caused by conventional power plants. Choosing the right location and capacity for renewable energy sources deployed as distributed generation (DG-RESs) can greatly impact the performance of Radial Distribution Systems. A good, stable electrical power supply with low energy waste and high effectiveness improves the performance and reliability of the system. This research investigates the ideal location and size for solar and wind power systems, popular methods for producing clean electricity. A new artificial intelligence algorithm, the Nutcracker Optimization Algorithm (NOA), is used to find the best solution on two standard test networks, the IEEE 33-bus and 69-bus systems, and to examine the resulting improvement in the efficiency and reliability of the power network through reduced power losses, smaller voltage deviation, and improved voltage stability. Finally, the NOA method is compared with Particle Swarm Optimization (PSO) and a hybrid algorithm (NOA+PSO) to validate its effectiveness in enhancing both efficiency and reliability.
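
Since the NOA update rules are not spelled out here, the sketch below is only a generic population-based metaheuristic skeleton for the DG placement/sizing task, with a toy surrogate objective in place of a real radial load-flow evaluation.

```python
import numpy as np

def power_loss(candidate):
    bus, size_mw = candidate
    # Toy surrogate objective: a real study would run a radial load flow on the
    # IEEE 33-bus feeder and return total line losses for this placement/size.
    return (bus - 18) ** 2 + 10 * (size_mw - 2.5) ** 2

def optimize(pop_size=30, iters=100, n_bus=33, max_mw=5.0):
    rng = np.random.default_rng(0)
    pop = np.column_stack([rng.uniform(2, n_bus, pop_size),    # candidate bus (rounded in practice)
                           rng.uniform(0, max_mw, pop_size)])  # candidate DG size in MW
    best = min(pop, key=power_loss)
    for _ in range(iters):
        # The real NOA alternates foraging/storage and cache-recovery phases;
        # a drift toward the current best plus exploration noise stands in here.
        pop = pop + rng.uniform(0, 1, pop.shape) * (best - pop) \
              + 0.1 * rng.standard_normal(pop.shape)
        best = min(np.vstack([pop, [best]]), key=power_loss)   # keep the elite
    return best

print(optimize())   # approaches bus ~18, size ~2.5 MW for the toy objective
```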

Current approaches of artificial intelligence in breakwaters - A review

  • Kundapura, Suman;Hegde, Arkal Vittal
    • Ocean Systems Engineering / v.7 no.2 / pp.75-87 / 2017
  • A breakwater has always been an ideal option to prevent shoreline erosion due to wave action as well as to maintain tranquility in the lagoon area. The effects of impinging waves on the structure can be analyzed and evaluated by several physical and numerical methods. Artificial Intelligence (AI) tools offer an alternative to numerical methods for predicting breakwater performance, and in the recent decade many researchers have applied several AI tools to the prediction of the performance, stability number, and scour of breakwaters. This paper is a comprehensive review that serves as a guide to the current state-of-the-art knowledge in applying soft computing techniques to breakwaters. It provides a detailed review of the different soft computing techniques used to predict the performance of different breakwaters, considering various combinations of input and response variables.