• Title/Summary/Keyword: security metrics

Search Results: 129

KAB: Knowledge Augmented BERT2BERT Automated Questions-Answering system for Jurisprudential Legal Opinions

  • Alotaibi, Saud S.;Munshi, Amr A.;Farag, Abdullah Tarek;Rakha, Omar Essam;Al Sallab, Ahmad A.;Alotaibi, Majid
    • International Journal of Computer Science & Network Security / v.22 no.6 / pp.346-356 / 2022
  • The jurisprudential legal rules govern how Muslims act and interact in daily life. This creates a huge stream of questions that require highly qualified and well-educated individuals, called Muftis. With Muslims representing almost 25% of the planet's population, and with the scarcity of qualified Muftis, this creates a demand-supply problem calling for automation solutions. This motivates the application of Artificial Intelligence (AI), which requires a well-designed Question-Answering (QA) system. In this work, we propose a QA system based on a retrieval-augmented generative transformer model for jurisprudential legal questions. The main idea in the proposed architecture is to leverage both state-of-the-art transformer models and the existing knowledge base of legal sources and question-answer pairs. With the sensitivity of the domain in mind, due to its importance in Muslims' daily lives, our design balances the exploitation of knowledge bases against the exploration provided by the generative transformer models. We collect a custom dataset of 850,000 entries that includes the question, the answer, and the category of the question. Our evaluation methodology is based on both quantitative and qualitative methods. We use metrics such as BERTScore and METEOR to evaluate the precision and recall of the system. We also provide many qualitative results that show the quality of the generated answers and how relevant they are to the asked questions.
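
The abstract names BERTScore and METEOR as its quantitative metrics. As a hedged illustration (not the authors' pipeline), the sketch below scores a hypothetical generated answer against a reference answer using the `bert-score` and NLTK packages; the example strings are placeholders, not data from the paper.

```python
# Minimal sketch (not the authors' code): scoring generated answers against
# reference answers with BERTScore and METEOR, the metrics named in the abstract.
# The example strings are hypothetical placeholders.
import nltk
from bert_score import score as bert_score
from nltk.translate.meteor_score import meteor_score

nltk.download("wordnet", quiet=True)  # METEOR needs WordNet synonym data

references = ["Fasting is obligatory during Ramadan for able adults."]
candidates = ["Able adults are required to fast in the month of Ramadan."]

# BERTScore returns precision, recall, and F1 tensors (one value per pair).
P, R, F1 = bert_score(candidates, references, lang="en", verbose=False)
print(f"BERTScore  P={P.mean().item():.3f}  R={R.mean().item():.3f}  "
      f"F1={F1.mean().item():.3f}")

# METEOR in NLTK expects pre-tokenized references and hypothesis.
meteor = meteor_score([references[0].split()], candidates[0].split())
print(f"METEOR = {meteor:.3f}")
```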

Evaluation and Comparative Analysis of Scalability and Fault Tolerance for Practical Byzantine Fault Tolerant based Blockchain (프랙티컬 비잔틴 장애 허용 기반 블록체인의 확장성과 내결함성 평가 및 비교분석)

  • Lee, Eun-Young;Kim, Nam-Ryeong;Han, Chae-Rim;Lee, Il-Gu
    • Journal of the Korea Institute of Information and Communication Engineering / v.26 no.2 / pp.271-277 / 2022
  • PBFT (Practical Byzantine Fault Tolerant) is a consensus algorithm that can achieve consensus by resolving unintentional and intentional faults in a distributed network environment and can guarantee high performance and absolute finality. However, as the size of the network increases, the network load also increases due to the message broadcasting that occurs repeatedly during the consensus process. Because of these characteristics, the PBFT algorithm is suitable for small or private blockchains, but there is a limit to its application to large or public blockchains. Because PBFT affects the performance of blockchain networks, industry needs to test whether PBFT is suitable for its products and services, and academia needs unified evaluation metrics and techniques for research on improving PBFT performance. In this paper, quantitative evaluation metrics and an evaluation framework that can assess PBFT-family consensus algorithms are studied. In addition, the throughput, latency, and fault tolerance of PBFT are evaluated using the proposed PBFT evaluation framework.
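
As a rough illustration of the three quantities the abstract evaluates (throughput, latency, and fault tolerance), the sketch below computes them from a hypothetical log of committed transactions; it is not the paper's evaluation framework, and the `Tx` record and sample values are made up for illustration.

```python
# Illustrative sketch only (not the paper's framework): throughput, latency,
# and PBFT fault tolerance computed from a hypothetical transaction log.
from dataclasses import dataclass

@dataclass
class Tx:
    submitted_at: float  # seconds
    committed_at: float  # seconds

def throughput_tps(txs: list[Tx]) -> float:
    """Committed transactions per second over the observed window."""
    window = max(t.committed_at for t in txs) - min(t.submitted_at for t in txs)
    return len(txs) / window if window > 0 else float("inf")

def avg_latency(txs: list[Tx]) -> float:
    """Mean submit-to-commit latency in seconds."""
    return sum(t.committed_at - t.submitted_at for t in txs) / len(txs)

def max_byzantine_faults(n_replicas: int) -> int:
    """PBFT tolerates f faulty replicas when n >= 3f + 1."""
    return (n_replicas - 1) // 3

# Hypothetical data: 4 transactions on a 7-replica network.
log = [Tx(0.0, 0.8), Tx(0.1, 0.9), Tx(0.2, 1.1), Tx(0.3, 1.2)]
print(f"throughput = {throughput_tps(log):.2f} tx/s")
print(f"latency    = {avg_latency(log):.2f} s")
print(f"fault tolerance: up to f = {max_byzantine_faults(7)} Byzantine replicas")
```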

An Application of Machine Learning in Retail for Demand Forecasting

  • Muhammad Umer Farooq;Mustafa Latif;Waseemullah;Mirza Adnan Baig;Muhammad Ali Akhtar;Nuzhat Sana
    • International Journal of Computer Science & Network Security / v.23 no.9 / pp.1-7 / 2023
  • Demand prediction is an essential component of any business or supply chain. Large retailers need to keep track of tens of millions of item flows each day to ensure smooth operations and strong margins, and demand prediction sits at the epicenter of this planning tornado. For business processes in retail companies that deal with a variety of short-shelf-life products and foodstuffs, forecast accuracy is of the utmost importance because of shifting demand patterns in a dynamic, fast-response environment. All sectors strive to produce the ideal quantity of goods at the ideal time, but for retailers this issue is especially crucial as they also need to manage perishable inventories effectively. In light of this, this research aims to show how machine learning approaches can help with demand forecasting and future sales prediction in retail. This is done in two steps: first by using historical data, and then by adding open data on weather conditions, fuel prices, the Consumer Price Index (CPI), holidays, and specific local events. Several machine learning algorithms were applied and compared using the R-squared and mean absolute percentage error (MAPE) assessment metrics. The suggested method improves the effectiveness and quality of feature selection while using a small number of well-chosen features to increase demand prediction accuracy. The model is tested with a one-year weekly dataset after being trained with a two-year weekly dataset. The results show that the suggested expanded feature selection approach provides a very good MAPE range, a very respectable and encouraging value for anticipating retail demand in retail systems.
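
The abstract compares algorithms with R-squared and MAPE on a two-year training / one-year test split of weekly data. The sketch below mirrors that setup on synthetic data with scikit-learn; it is not the authors' pipeline, and the feature set (CPI, fuel price, temperature, holiday flag) and the two models are assumptions for illustration.

```python
# Minimal sketch (not the authors' pipeline): comparing regressors on a weekly
# demand series using R-squared and MAPE, as named in the abstract.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score, mean_absolute_percentage_error

rng = np.random.default_rng(0)
# Hypothetical features: CPI, fuel price, temperature, holiday flag.
X = rng.normal(size=(156, 4))                  # ~3 years of weekly rows
y = 50 + X @ np.array([3.0, -2.0, 1.5, 4.0]) + rng.normal(scale=2.0, size=156)

X_train, y_train = X[:104], y[:104]            # two years for training
X_test, y_test = X[104:], y[104:]              # one year for testing

for name, model in [("linear", LinearRegression()),
                    ("random_forest", RandomForestRegressor(random_state=0))]:
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    print(f"{name:14s}  R2={r2_score(y_test, pred):.3f}  "
          f"MAPE={mean_absolute_percentage_error(y_test, pred):.3%}")
```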

Personalized Diabetes Risk Assessment Through Multifaceted Analysis (PD-RAMA): A Novel Machine Learning Approach to Early Detection and Management of Type 2 Diabetes

  • Gharbi Alshammari
    • International Journal of Computer Science & Network Security / v.23 no.8 / pp.17-25 / 2023
  • The alarming global prevalence of Type 2 Diabetes Mellitus (T2DM) has catalyzed an urgent need for robust, early diagnostic methodologies. This study unveils a pioneering approach to predicting T2DM, employing the Extreme Gradient Boosting (XGBoost) algorithm, renowned for its predictive accuracy and computational efficiency. The investigation harnesses a meticulously curated dataset of 4303 samples, extracted from a comprehensive Chinese research study, scrupulously aligned with the World Health Organization's indicators and standards. The dataset encapsulates a multifaceted spectrum of clinical, demographic, and lifestyle attributes. Through an intricate process of hyperparameter optimization, the XGBoost model exhibited an unparalleled best score, elucidating a distinctive combination of parameters such as a learning rate of 0.1, max depth of 3, 150 estimators, and specific colsample strategies. The model's validation accuracy of 0.957, coupled with a sensitivity of 0.9898 and specificity of 0.8897, underlines its robustness in classifying T2DM. A detailed analysis of the confusion matrix further substantiated the model's diagnostic prowess, with an F1-score of 0.9308, illustrating its balanced performance in true positive and negative classifications. The precision and recall metrics provided nuanced insights into the model's ability to minimize false predictions, thereby enhancing its clinical applicability. The research findings not only underline the remarkable efficacy of XGBoost in T2DM prediction but also contribute to the burgeoning field of machine learning applications in personalized healthcare. By elucidating a novel paradigm that accentuates the synergistic integration of multifaceted clinical parameters, this study fosters a promising avenue for precise early detection, risk stratification, and patient-centric intervention in diabetes care. The research serves as a beacon, inspiring further exploration and innovation in leveraging advanced analytical techniques for transformative impacts on predictive diagnostics and chronic disease management.
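
For illustration only, the sketch below configures an XGBoost classifier with the hyperparameter values quoted in the abstract (learning rate 0.1, max depth 3, 150 estimators) and derives sensitivity, specificity, and F1 from a confusion matrix. The synthetic data and the `colsample_bytree=0.8` choice are assumptions, since the study's dataset and exact colsample strategy are not reproduced here.

```python
# Illustrative sketch, not the study's code: XGBoost with the hyperparameters
# quoted in the abstract, evaluated with sensitivity, specificity, and F1.
from sklearn.datasets import make_classification
from sklearn.metrics import confusion_matrix, f1_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Synthetic stand-in for the clinical dataset (the real data is not public here).
X, y = make_classification(n_samples=4303, n_features=12, weights=[0.8, 0.2],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y,
                                           random_state=0)

model = XGBClassifier(learning_rate=0.1, max_depth=3, n_estimators=150,
                      colsample_bytree=0.8,   # hypothetical value
                      eval_metric="logloss", random_state=0)
model.fit(X_tr, y_tr)
pred = model.predict(X_te)

tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"sensitivity={sensitivity:.3f} specificity={specificity:.3f} "
      f"F1={f1_score(y_te, pred):.3f}")
```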

An Application of Machine Learning in Retail for Demand Forecasting

  • Muhammad Umer Farooq;Mustafa Latif;Waseem;Mirza Adnan Baig;Muhammad Ali Akhtar;Nuzhat Sana
    • International Journal of Computer Science & Network Security / v.23 no.8 / pp.210-216 / 2023
  • Demand prediction is an essential component of any business or supply chain. Large retailers need to keep track of tens of millions of item flows each day to ensure smooth operations and strong margins, and demand prediction sits at the epicenter of this planning tornado. For business processes in retail companies that deal with a variety of short-shelf-life products and foodstuffs, forecast accuracy is of the utmost importance because of shifting demand patterns in a dynamic, fast-response environment. All sectors strive to produce the ideal quantity of goods at the ideal time, but for retailers this issue is especially crucial as they also need to manage perishable inventories effectively. In light of this, this research aims to show how machine learning approaches can help with demand forecasting and future sales prediction in retail. This is done in two steps: first by using historical data, and then by adding open data on weather conditions, fuel prices, the Consumer Price Index (CPI), holidays, and specific local events. Several machine learning algorithms were applied and compared using the R-squared and mean absolute percentage error (MAPE) assessment metrics. The suggested method improves the effectiveness and quality of feature selection while using a small number of well-chosen features to increase demand prediction accuracy. The model is tested with a one-year weekly dataset after being trained with a two-year weekly dataset. The results show that the suggested expanded feature selection approach provides a very good MAPE range, a very respectable and encouraging value for anticipating retail demand in retail systems.

Intelligent System for the Prediction of Heart Diseases Using Machine Learning Algorithms with a New Mixed Feature Creation (MFC) Technique

  • Rawia Elarabi;Abdelrahman Elsharif Karrar;Murtada El-mukashfi El-taher
    • International Journal of Computer Science & Network Security / v.23 no.5 / pp.148-162 / 2023
  • Classification systems can significantly assist the medical sector by allowing for the precise and quick diagnosis of diseases, saving time for both doctors and patients. One possible way of identifying risk variables is to use machine learning algorithms. Non-surgical technologies such as machine learning are trustworthy and effective in categorizing healthy and heart-disease patients, and they save time and effort. The goal of this study is to create a medical intelligent decision support system based on machine learning for the diagnosis of heart disease. We use a mixed feature creation (MFC) technique to generate new features from the UCI Cleveland Cardiology dataset. We select the most suitable features using the Least Absolute Shrinkage and Selection Operator (LASSO), Recursive Feature Elimination with Random Forest feature selection (RFE-RF), and the best features of both LASSO and RFE-RF (BLR). Cross-validation and grid-search methods are used to optimize the parameters of the estimators used by these algorithms. Classifier performance assessment metrics, including classification accuracy, specificity, sensitivity, precision, and F1-score for each classification model, along with execution time and RMSE, are presented independently for comparison. Our proposed work finds the best potential outcome across all available prediction models and improves the system's performance, allowing physicians to diagnose heart patients more accurately.
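
A minimal sketch of the feature-selection step described above: LASSO keeps features with non-zero coefficients, RFE wrapped around a random forest keeps a top subset, and their intersection stands in for the BLR combination. This is one plausible reading of the abstract, not the authors' implementation; the synthetic data and feature names are placeholders.

```python
# Sketch only: LASSO and RFE-RF feature selection, with the intersection of
# the two selections used as a stand-in for the "BLR" combination.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.linear_model import LassoCV

X, y = make_classification(n_samples=303, n_features=13, n_informative=6,
                           random_state=0)
names = np.array([f"feat_{i}" for i in range(X.shape[1])])  # hypothetical names

# LASSO: keep features with non-zero coefficients.
lasso = LassoCV(cv=5, random_state=0).fit(X, y)
lasso_mask = lasso.coef_ != 0

# RFE with a random forest, keeping (for illustration) the top 6 features.
rfe = RFE(RandomForestClassifier(random_state=0), n_features_to_select=6).fit(X, y)
rfe_mask = rfe.support_

blr_mask = lasso_mask & rfe_mask   # features agreed on by both selectors
print("LASSO:", names[lasso_mask])
print("RFE-RF:", names[rfe_mask])
print("BLR (intersection):", names[blr_mask])
```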

Performance Evaluation of SDN Controllers: RYU and POX for WBAN-based Healthcare Applications

  • Lama Alfaify;Nujud Alnajem;Haya Alanzi;Rawan Almutiri;Areej Alotaibi;Nourah Alhazri;Awatif Alqahtani
    • International Journal of Computer Science & Network Security / v.23 no.7 / pp.219-230 / 2023
  • Wireless Body Area Networks (WBANs) have made it easier for healthcare workers and patients to monitor patients' status continuously in real time. WBANs have complex and diverse network structures, so management and control can be challenging. Combining emerging Software-Defined Networking (SDN) with WBANs is therefore promising, since SDN introduces a new approach to network management and design. The SDN concept is used in this study to create more adaptable and dynamic network architectures for WBANs. The study focuses on comparing the performance of two SDN controllers, POX and Ryu, using Mininet, an open-source simulation tool, to construct network topologies. The performance of the controllers is evaluated based on bandwidth, throughput, and round-trip time metrics for networks using an OpenFlow switch with sixteen nodes and a controller for each topology. The study finds that the choice of network controller can significantly impact network performance and suggests that monitoring network performance indicators is crucial for optimizing network performance. The project provides valuable insights into the performance of SDN-based WBANs using POX and Ryu controllers and highlights the importance of selecting the appropriate network controller for a given network architecture.
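
A minimal Mininet sketch of the kind of topology the study describes: sixteen hosts on one OpenFlow switch attached to an external controller (Ryu or POX), with RTT and throughput measured between two hosts. The controller address and port are assumptions; this is not the study's script.

```python
# Minimal sketch (not the study's scripts): a 16-host single-switch Mininet
# topology attached to an external OpenFlow controller, e.g. Ryu or POX
# listening locally on port 6633. Run with root privileges (sudo).
from mininet.net import Mininet
from mininet.node import RemoteController
from mininet.topo import SingleSwitchTopo

def run():
    topo = SingleSwitchTopo(k=16)  # one OpenFlow switch, sixteen hosts
    net = Mininet(topo=topo,
                  controller=lambda name: RemoteController(
                      name, ip="127.0.0.1", port=6633))
    net.start()

    h1, h16 = net.get("h1", "h16")
    net.ping([h1, h16])        # round-trip time between two hosts
    net.iperf((h1, h16))       # TCP throughput between the same pair

    net.stop()

if __name__ == "__main__":
    run()
```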

The Performance Analysis of Cognitive-based Overlay D2D Communication in 5G Networks

  • Abdullilah Alotaibi;Salman A. AlQahtani
    • International Journal of Computer Science & Network Security / v.24 no.2 / pp.178-188 / 2024
  • In the near future, it is expected that there will be billions of connected devices using fifth-generation (5G) network services. The available base stations (BSs) need to mitigate their loads without changes and at the least monetary cost. The available spectrum resources are limited and need to be exploited efficiently to meet the ever-increasing demand for services. Device-to-Device (D2D) communication technology will likely help satisfy the rapidly increasing capacity demand and also effectively offload traffic from the BS by distributing transmissions between D2D users on one side and the cellular users and the BS on the other. In this paper, we propose to apply overlay D2D communication with cognitive radio capability in 5G networks to exploit unused spectrum resources, taking dynamic spectrum access into account. The performance metrics, throughput and delay, are formulated and analyzed for a CSMA-based medium access control (MAC) protocol that utilizes a common control channel for device users to negotiate the data channel and resolve contention between those users. Device users can exploit cognitive radio to access the data channels concurrently in the common interference area. Estimating the achievable throughput and delay of D2D communication in 5G networks using cognitive radio with a CSMA-based MAC protocol to resolve contention has not been explored in previous studies. The performance analysis shows that applying cognitive radio capability in D2D communication and allocating a common control channel for device users effectively improves the total aggregated network throughput by more than 60% compared to the individual D2D throughput, without adding harmful interference to cellular network users. This approach can also reduce the delay.

A Review on Detection of COVID-19 Cases from Medical Images Using Machine Learning-Based Approach

  • Noof Al-dieef;Shabana Habib
    • International Journal of Computer Science & Network Security / v.24 no.3 / pp.59-70 / 2024
  • Background: The COVID-19 pandemic (caused by a form of coronavirus) emerged at the end of 2019 and spread rapidly to almost every corner of the world. It had infected around 25,334,339 people worldwide by September 1, 2020 [1]. It has been spreading ever since, and the peaks specific to each country have been rising and falling and do not seem to be over yet. Currently, conventional RT-PCR testing is required to detect COVID-19, but alternative methods, also useful for data archiving purposes, are certainly another choice for public health departments to make. Researchers are trying to use medical images such as X-ray and Computed Tomography (CT) scans to diagnose the virus easily with the aid of Artificial Intelligence (AI)-based software. Method: This review paper investigates newly emerging machine learning methods used to detect COVID-19 from X-ray images instead of other tests performed by medical experts. The facilities of computer vision enable us to develop automated models with the clinical ability to detect the disease early. We explore the researchers' focus on imaging modalities, the image datasets used by the machine learning methods, and the output metrics used to evaluate the research in this field. Finally, the paper concludes by discussing the key problems posed by identifying COVID-19 using machine learning and directions for future work. Result: This review's findings can be useful for public and private sectors in utilizing X-ray images and deploying resources before the pandemic reaches its peaks, giving the healthcare system cushion time to bear the impact of the unfavorable circumstances that the pandemic is sure to cause.

Analysis of deep learning-based deep clustering method (딥러닝 기반의 딥 클러스터링 방법에 대한 분석)

  • Hyun Kwon;Jun Lee
    • Convergence Security Journal / v.23 no.4 / pp.61-70 / 2023
  • Clustering is an unsupervised learning method that groups data based on features such as distance metrics, using data without known labels or ground-truth values. This method has the advantage of being applicable to various types of data, including images, text, and audio, without the need for labeling. Traditional clustering techniques apply dimensionality reduction methods or extract specific features before clustering. However, with the advancement of deep learning models, research has emerged on deep clustering techniques that use models such as autoencoders and generative adversarial networks to represent input data as latent vectors. In this study, we propose a deep clustering technique based on deep learning. In this approach, we use an autoencoder to transform the input data into latent vectors, then construct a vector space according to the cluster structure and perform k-means clustering. We conducted experiments using the MNIST and Fashion-MNIST datasets, with the PyTorch machine learning library as the experimental environment. The model used is a convolutional neural network-based autoencoder. The experimental results show an accuracy of 89.42% for MNIST and 56.64% for Fashion-MNIST when k is set to 10.
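
An illustrative sketch of the pipeline the abstract describes: a small convolutional autoencoder (PyTorch) maps MNIST images to latent vectors, k-means with k=10 clusters those vectors, and clustering accuracy is computed via Hungarian matching. The architecture sizes, training budget, and accuracy routine are assumptions, not the paper's exact model.

```python
# Illustrative sketch, not the paper's model: conv autoencoder -> latent
# vectors -> k-means (k=10) on MNIST, with Hungarian-matched accuracy.
import numpy as np
import torch
import torch.nn as nn
from scipy.optimize import linear_sum_assignment
from sklearn.cluster import KMeans
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

class ConvAutoencoder(nn.Module):
    def __init__(self, latent_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 28 -> 14
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 14 -> 7
            nn.Flatten(), nn.Linear(32 * 7 * 7, latent_dim))
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32 * 7 * 7), nn.ReLU(),
            nn.Unflatten(1, (32, 7, 7)),
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),    # 7 -> 14
            nn.ConvTranspose2d(16, 1, 2, stride=2), nn.Sigmoid())  # 14 -> 28

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

def cluster_accuracy(y_true, y_pred, k=10):
    """Best one-to-one mapping between cluster ids and labels (Hungarian)."""
    cost = np.zeros((k, k), dtype=np.int64)
    for t, p in zip(y_true, y_pred):
        cost[p, t] += 1
    rows, cols = linear_sum_assignment(-cost)
    return cost[rows, cols].sum() / len(y_true)

data = datasets.MNIST("data", train=True, download=True,
                      transform=transforms.ToTensor())
loader = DataLoader(data, batch_size=256, shuffle=True)

model = ConvAutoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for epoch in range(5):                      # short run for illustration
    for x, _ in loader:
        recon, _ = model(x)
        loss = loss_fn(recon, x)
        opt.zero_grad(); loss.backward(); opt.step()

# Encode the whole dataset and cluster the latent vectors with k-means.
model.eval()
with torch.no_grad():
    zs, ys = [], []
    for x, y in DataLoader(data, batch_size=512):
        zs.append(model.encoder(x)); ys.append(y)
    Z = torch.cat(zs).numpy(); Y = torch.cat(ys).numpy()

labels = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(Z)
print(f"clustering accuracy: {cluster_accuracy(Y, labels):.4f}")
```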