Volume 23 Issue 5
-
Text mining (TM) is widely used to process unstructured text documents and the data present in various domains, and it is often referred to as text classification. It is popular in many areas such as movie reviews, product reviews on e-commerce websites, sentiment analysis, topic modeling, and cyberbullying detection in social media messages. Cyberbullying is the act of abusing someone with insulting language; personal abuse, sexual harassment, and other forms of abuse fall under cyberbullying. Several existing systems detect bullying words based on their context in social networking sites (SNS), which have become a platform for bullying. In this paper, an Enhanced Text Mining approach using an Ensemble Algorithm (ETMA) is developed to solve several problems of traditional algorithms and to improve the accuracy, processing time, and quality of the results. ETMA is used to analyze bullying text on social networking sites such as Facebook and Twitter. ETMA is applied to a synthetic dataset collected from various sources, consisting of 5,000 messages belonging to the bullying and non-bullying classes. Performance is analyzed in terms of precision, recall, F1-score, and accuracy.
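The abstract does not spell out which base learners ETMA combines, so the following is only a minimal sketch of an ensemble bullying-text classifier in the same spirit, assuming a scikit-learn pipeline and a hypothetical messages.csv file with text and label columns:

```python
# Minimal sketch of an ensemble bullying-text classifier (not the authors' exact ETMA).
# Assumes a hypothetical messages.csv with columns: text, label (bully / non-bully).
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

df = pd.read_csv("messages.csv")                      # hypothetical dataset file
X_train, X_test, y_train, y_test = train_test_split(
    df["text"], df["label"], test_size=0.2, random_state=42)

# Soft-voting ensemble over three base classifiers on TF-IDF features.
ensemble = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=2),
    VotingClassifier(
        estimators=[("lr", LogisticRegression(max_iter=1000)),
                    ("nb", MultinomialNB()),
                    ("rf", RandomForestClassifier(n_estimators=200))],
        voting="soft"),
)
ensemble.fit(X_train, y_train)
print(classification_report(y_test, ensemble.predict(X_test)))  # precision, recall, F1, accuracy
```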
-
Wireless sensor networks are used in several applications, including military, medical, environmental, and household domains. In all of these applications, energy usage is the determining factor in the performance of wireless sensor networks. Therefore, strategies for routing data and transferring it to the base station are critical, because the sensor nodes run on battery power and the energy available to the sensors is limited. There are two reasons the hierarchical routing protocol Low Energy Adaptive Clustering Hierarchy (LEACH) is investigated. First, sensor networks are dense and a considerable amount of redundancy is involved in communication. Second, clustering increases the scalability of the sensor network while keeping the security aspects of communication in mind. In this research paper, the LEACH routing protocol is implemented using the NS2 simulator, and an enhanced energy-efficient EE-LEACH routing protocol ensures that the selected cluster heads are uniformly distributed across the network in order to improve the performance of LEACH. EE-LEACH improves energy consumption by around 43%.
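For context, the baseline LEACH cluster-head election rule can be sketched as below; EE-LEACH modifies this selection (for example, to spread cluster heads more evenly and account for residual energy), and since the abstract does not give the exact EE-LEACH rule, only the standard threshold is shown:

```python
# Sketch of the standard LEACH cluster-head election threshold; EE-LEACH additionally
# biases selection toward residual energy and even spatial distribution (details differ
# per paper, so this only illustrates the baseline rule).
import random

def leach_threshold(p: float, r: int) -> float:
    """Probability threshold T(n) for round r with desired cluster-head fraction p."""
    return p / (1 - p * (r % int(1 / p)))

def elect_cluster_heads(node_ids, p=0.05, r=0):
    """Nodes that have not served as cluster head in the current epoch draw a random
    number; those below T(n) become cluster heads for this round."""
    t = leach_threshold(p, r)
    return [n for n in node_ids if random.random() < t]

print(elect_cluster_heads(range(100), p=0.05, r=3))
```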
-
In connected vehicles, drivers are exposed to attacks when they communicate with unauthenticated peers. This occurs when a vehicle relies on outdated information, resulting in interactions with vehicles whose certificates have expired or been revoked but that claim to be legitimate nodes. Vehicles must frequently receive or query an updated revoked-certificate list to avoid communicating with suspicious vehicles and to protect themselves. In this paper, we propose a scheme that works on a highway divided into clusters and managed by roadside units (RSUs) to ensure authenticity and preserve the hidden identities of vehicles. The proposed scheme includes four main components, each of which plays a major role. At the top of the hierarchy is the authority responsible for issuing long-term certificates and for managing and controlling all descending intermediate authorities, which cover specific regions (e.g., RSUs) and provide vehicles with short-term pseudonym certificates to hide their identity and avoid traceability. Every certificate-related operation is recorded in blockchain storage to ensure integrity and transparency. To regulate communication among nodes, security managers are introduced to enable authorization and access rights during communications. Together, these components provide vehicles with an immediately updated revoked-certificate list through RSUs, which are equipped with publish/subscribe brokers that enable a controlled messaging infrastructure. We validate our work in a simulated smart highway environment comprising interconnected RSUs to demonstrate our technique's effectiveness.
-
The design of complex man-made systems mostly involves a conceptual modeling phase; therefore, it is important to ensure an appropriate analysis method for these models. A key concept for such analysis is the development of a diagramming technique (e.g., UML), because diagrams can describe entities and processes and emphasize important aspects of the systems being described. The analysis also includes an examination of ontological concepts such as states and events, which are used as a basis for the modeling process. Studying fundamental concepts allows us to understand more deeply the relationship between these concepts and modeling frameworks. In this paper, we critically analyze the classic definition of a state utilizing the Thinging Machine (TM) model. States in state machine diagrams are considered the appropriate basis for modeling system behavioral aspects. Despite its wide application in hardware design, integrating a state machine model into a software system's modeling requirements increases the difficulty of graphical representation (e.g., the integration between structural and behavioral diagrams). To understand this problem, in this paper we project (create an equivalent representation of) states in TM machines. As a case study, we re-modeled a state machine of an assembly line system in a TM. Additionally, we added possible triggers (transitions) of the given states to the TM representation. The outcome is a complicated picture of assembly line behavior. Therefore, as an alternative solution, we re-modeled the assembly line based solely on the TM. This new model presents a clear contrast between state-based modeling of assembly line behavior and the TM approach. TM modeling seems more systematic than its counterpart, the state machine, and its notions are well defined. In a TM, states are simply compound events. A model of a system more complex than the assembly line strengthens this conclusion.
-
The mainstay of current image retrieval frameworks is Content-Based Image Retrieval (CBIR). The most common retrieval method involves the submission of a query image, after which the system extracts visual characteristics such as shape, color, and texture from the images. Most techniques extract features and classify images in the RGB color space, since it is the default color space of the images and those techniques do not convert the images to another color space. To determine the most effective color space for retrieving images, this research discusses the transformation of RGB into different color spaces, feature extraction, and the use of Convolutional Neural Networks for retrieval.
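As an illustration of the color-space transformation step only (the paper's CNN feature extraction is not reproduced here), a sketch using OpenCV with a hypothetical query.jpg image and simple histogram features:

```python
# Sketch: convert an RGB image into other color spaces and build simple histogram
# features for retrieval experiments (illustrative only; the paper's CNN features differ).
import cv2
import numpy as np

img_bgr = cv2.imread("query.jpg")                 # hypothetical query image (OpenCV loads BGR)
spaces = {
    "RGB": cv2.cvtColor(img_bgr, cv2.COLOR_BGR2RGB),
    "HSV": cv2.cvtColor(img_bgr, cv2.COLOR_BGR2HSV),
    "LAB": cv2.cvtColor(img_bgr, cv2.COLOR_BGR2LAB),
    "YCrCb": cv2.cvtColor(img_bgr, cv2.COLOR_BGR2YCrCb),
}

def color_histogram(image, bins=8):
    """Joint 3-channel histogram, flattened and L1-normalized."""
    hist = cv2.calcHist([image], [0, 1, 2], None, [bins] * 3,
                        [0, 256, 0, 256, 0, 256]).flatten()
    return hist / hist.sum()

features = {name: color_histogram(im) for name, im in spaces.items()}
# Retrieval then ranks database images by the distance between such feature vectors
# (or between CNN embeddings, as in the paper).
```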
-
Rawan Alrasheddi;Zainab Alawami;Maryam Hazazi;Reema Abu Alsaud;Ruba Alobaidi 41
Biometrics refers to authentication and identification techniques used for security, in which people are identified by physical or behavioral features such as the iris, fingerprints, or even voice. Biometrics combined with cryptography can be used in a variety of applications such as issuing, generating, or associating biometric keys. Biometric identification and cryptography are used in many institutions and high-security systems because of the difficulty of tampering or forgery by hackers. In this paper, literature reviews on biometric identification and cryptography are presented and discussed. In addition, the techniques in the reviewed literature are compared, their strengths and weaknesses are identified, and an initial proposal for combining biometrics and cryptography is provided. -
In this paper, a new fuzzy prediction system is designed and developed to predict the type of delivery based on seven factors. The developed system is highly needed to give a recommendation to the family expecting a baby and, at the same time, to provide an advisory system for the physician. The system has been developed using MATLAB and has been tested and verified using real data. The system shows a high accuracy of 95%. The results have also been checked one by one by a physician, and the system shows perfect agreement with the physician's decisions.
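The seven clinical factors and the MATLAB rule base are not listed in the abstract, so the following toy sketch only illustrates the fuzzy-inference idea with hypothetical factors and a single Mamdani-style rule:

```python
# Toy sketch of a Mamdani-style fuzzy rule for delivery-type recommendation.
# The actual system uses seven clinical factors and a full MATLAB rule base;
# the factor names, ranges, and rule below are purely illustrative.
def tri(x, a, b, c):
    """Triangular membership function peaking at b over [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def recommend(maternal_age, fetal_weight_kg):
    # Hypothetical memberships for two of the seven factors.
    age_high = tri(maternal_age, 30, 45, 60)
    weight_high = tri(fetal_weight_kg, 3.5, 4.5, 5.5)
    # Rule: IF age is high AND fetal weight is high THEN caesarean (min for AND).
    caesarean = min(age_high, weight_high)
    normal = 1.0 - caesarean          # complementary toy rule
    return {"caesarean": caesarean, "normal": normal}

print(recommend(maternal_age=38, fetal_weight_kg=4.2))
```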
-
Early detection of endometrial carcinoma in the uterus is essential for effective treatment. Endometrial carcinoma is the worst kind of endometrium cancer because it is considerably more likely to spread to other parts of the body if not detected and treated early. Non-invasive medical computer vision, also known as medical image processing, is becoming increasingly essential in the clinical diagnosis of various diseases. Such techniques provide a tool for automatic image processing, allowing for an accurate and timely assessment of the lesion. One of the most difficult aspects of developing an effective automatic categorization system is the absence of huge datasets. Using image processing and deep learning, this article presents an artificial endometrium cancer diagnosis system. The processes in this study include gathering dermoscopy images from the database, preprocessing, segmentation using hybrid Fuzzy C-Means (FCM), and optimizing the weights using the Whale Optimization Algorithm (WOA). The characteristics of the damaged endometrium cells are retrieved using the feature extraction approach after the magnetic resonance images have been segmented. The collected characteristics are classified using deep learning-based Long Short-Term Memory (LSTM) and Bi-directional LSTM classifiers. On the publicly accessible dataset, the suggested classifiers obtain an accuracy of 97% and a segmentation accuracy of 93%.
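As an illustration of the final classification stage only, a minimal Bi-directional LSTM sketch in Keras is shown below; the input shape and dummy data are assumptions, not the paper's extracted features:

```python
# Sketch of a Bi-directional LSTM classifier over extracted lesion feature sequences
# (a stand-in for the paper's LSTM/Bi-LSTM stage; shapes and labels are assumptions).
import numpy as np
import tensorflow as tf

n_samples, timesteps, n_features = 200, 16, 32        # assumed feature-sequence shape
X = np.random.rand(n_samples, timesteps, n_features).astype("float32")
y = np.random.randint(0, 2, size=n_samples)           # 0 = benign, 1 = carcinoma (dummy labels)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(timesteps, n_features)),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=32, verbose=0)
```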
-
D. I. George Amalarethinam;S. Edel Josephine Rajakumari 65
Cloud Computing is one of the current research areas in computer science. Recently, 'cloud' has become the buzzword used everywhere in IT industries; it introduced the notion of 'pay as you use' and revolutionized developments in IT. The rapid growth of modernized cloud computing leads to 24×7 access to e-resources from anywhere at any time. It offers storage as a service, where users' data can be stored on a cloud managed by a third party called the Cloud Service Provider (CSP). Since users' data are managed by a third party, they must be encrypted to ensure the confidentiality and privacy of the data. There are different types of cryptographic algorithms used for cloud security; in this article, these algorithms and their security measures are discussed.
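As a concrete illustration of why client-side encryption protects data handed to a CSP, a short sketch using the Fernet recipe from the Python cryptography package; any of the surveyed algorithms could be substituted:

```python
# Sketch: encrypt data on the client before handing it to a cloud storage provider,
# so the CSP only ever sees ciphertext (Fernet combines AES-CBC with an HMAC).
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # kept by the data owner, never sent to the CSP
f = Fernet(key)

plaintext = b"sensitive user record"
ciphertext = f.encrypt(plaintext)    # this is what gets uploaded to the cloud
assert f.decrypt(ciphertext) == plaintext   # only the key holder can recover the data
```
-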
Mustafa Abdul Salam;Sanaa Taha;Sameh Alahmady;Alwan Mohamed 73
Brain tumors are an abnormal collection or accumulation of cells in the brain that can be life-threatening due to their ability to invade and metastasize to nearby tissues. Accurate diagnosis is critical to the success of treatment planning, and magnetic resonance imaging is the primary diagnostic imaging method used to diagnose brain tumors and their extent. Deep learning methods for computer vision applications have shown significant improvements in recent years, primarily because a large amount of data is available to train models; consequently, improvements in model architecture yield better approximations in the supervised setting. Tumor classification using these deep learning techniques has made great strides thanks to reliable, annotated open datasets, reducing computational effort and learning specific spatial and temporal relationships. This paper describes transfer learning models such as the MobileNet, VGG19, InceptionResNetV2, Inception, and DenseNet201 models. Each model is trained with three different optimizers: Adam, SGD, and RMSprop. Finally, the pre-trained MobileNet with the RMSprop optimizer is the best model in this paper, with 0.995 accuracy, 0.99 sensitivity, and 1.00 specificity, while at the same time having the lowest computational cost.
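A minimal transfer-learning sketch of the winning configuration (MobileNet backbone with the RMSprop optimizer) is given below; the class count, input size, and training loop are assumptions rather than the paper's exact setup:

```python
# Minimal transfer-learning sketch: pre-trained MobileNet backbone + new head,
# compiled with RMSprop (class count and input size are assumptions).
import tensorflow as tf

num_classes = 4                                   # assumed tumor classes
base = tf.keras.applications.MobileNet(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False                            # freeze ImageNet features

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.RMSprop(learning_rate=1e-4),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # datasets not shown here
```
-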
Cloud-based technology is used in different organizations around the world for various purposes. Using this technology, service providers deliver services, mainly SaaS and PaaS, while cloud service consumers consume the services by paying for what they use or access according to the "pay per use" principle. Customers can access the services from different places or locations using different machines or electronic devices. Provided everything is well organized and the necessary infrastructure is in place, the services can be accessed conveniently. The problem identified in this study is that cloud providers control and monitor the system or tools while ignoring the calculation and consideration of the various faults made on the cloud provider's side during service delivery. There are also currently problems with ignoring the consumer or client when monitoring cloud services consumed at the customer or client level under SLA provisions. A new framework was developed to address the above-mentioned problems. The framework was modeled using the Unified Modeling Language and consists of eight basic components. For this research, the researcher developed a prototype using a selected cloud simulation tool, the Java programming language to write the code, and MySQL to store data during SLA monitoring. The researcher used different criteria to validate the developed framework: validating the SLA from the cloud service provider's side, validating what happens when the request from the client side is below or above what is specified in the SLA, and implementing the monitoring mechanism using the developed monitoring component. The researcher observed that under the first and third criteria the service level agreement was violated, indicating that if the service level agreement is monitored or managed only by the cloud service provider, SLA violations occur. Therefore, the researcher recommends that the service level agreement be managed by both cloud service providers and service consumers in the cloud computing environment.
-
Elmouez Samir Abd Elhameed Mohamed;Amgad Atta Abedelmageed Mohammed 95
The technological revolution, also referred to as the Fourth Industrial Revolution, when embraced by government, should transform the way governments serve their citizens. The growth of broadband in Africa is leapfrogging its technological development, and thus many African governments will soon be able to offer quality e-Government services to their citizens. These technologically driven governments will be able to provide decision-makers with timely information to make judgments that could influence policies. Proponents of e-Government believe that in the digital age, governments can use this information to reduce corruption and increase accountability, transparency, efficiency, and public participation. e-Government service quality should ensure customer satisfaction. Although many studies have examined the role of e-Government and the quality of its services, few studies have examined the quality of e-Government services in terms of both supply and demand. This paper examines and reviews the academic state of the art on the factors that affect the quality of e-Government services from both perspectives. A mixed-methods research methodology was used, utilizing both qualitative and quantitative methods for data gathering and analysis. -
Rania Alsulami;Raghad Albalawi;Manal Albalawi;Hetaf Alsugair;Khaled A. Alblowi;Adel R. Alharbi 109
Interest in blockchain technology and its employment in diverse sectors and industries is increasing, including finance, business, voting, industrial, medical, and educational applications. Recently, blockchain technology has played a significant role in preventing fraudulent transactions in accounting systems, as blockchain offers strong security measures, reduces the need for centralized processing, and blocks unauthorized access to organizational information and systems. Therefore, this paper studies, analyses, and investigates the adoption of blockchain technology in accounting systems by analyzing the results of several research works that have employed blockchain technology to secure their accounting systems. In addition, we investigate the performance of applying deep learning and machine learning approaches for fraud detection and classification. This study finds that the adoption of blockchain technology will enhance the safety and security of accounting systems by identifying and classifying the possible frauds that may attack accounting and business organizations. -
Suguru Kuniyoshi;Shiho Oshiro;Gennan Hayashi;Tomohisa Wada 121
A 500 km/h linear-motor high-speed terrestrial transportation service is planned to launch in 2027 in Japan. In order to support 5G service in the train, a sub-carrier spacing of 30 kHz is planned to be used instead of the common 15 kHz sub-carrier spacing to mitigate the Doppler effect in such high-speed transportation. In addition, to increase the cell size of the 5G mobile system, plural base station antennas will transmit the identical downlink (DL) signal to form an expanded cell along the train rail. In this situation, the forward and backward antenna signals will be Doppler-shifted in opposite directions, and the receiver in the train may struggle to estimate an accurate Channel Transfer Function (CTF) for demodulation. In this paper, a Delay and Doppler Profiler (DDP) based channel estimator is proposed and successfully implemented in a signal processing simulation system. The simulated performance is then compared with that of a conventional time-domain linear interpolation estimator. According to the simulation results, QPSK modulation can be used even under severe channel conditions such as 500 km/h with a two-path reverse Doppler shift, whereas with the conventional channel estimator QPSK modulation can only be used below 200 km/h.
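For intuition on why the wider sub-carrier spacing helps, the Doppler spread at 500 km/h can be computed directly; the 3.5 GHz carrier frequency below is an assumption for illustration, since the abstract does not state the band:

```python
# Back-of-envelope Doppler calculation for the 500 km/h scenario.
c = 299_792_458.0        # speed of light, m/s
v = 500 / 3.6            # 500 km/h in m/s (~138.9 m/s)
fc = 3.5e9               # assumed carrier frequency, Hz

fd = v * fc / c          # one-sided Doppler shift, roughly 1.6 kHz
spread = 2 * fd          # forward and backward antennas shift in opposite directions

for scs in (15e3, 30e3):
    print(f"SCS {scs/1e3:.0f} kHz: Doppler spread = {spread:.0f} Hz "
          f"({100 * spread / scs:.1f}% of sub-carrier spacing)")
```
-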
The widespread usage of blockchain technology in cryptocurrencies has led to the adoption of the blockchain concept in data storage management systems for secure and effective data storage and management. Several innovative studies have proposed solutions that integrate blockchain with distributed databases. In this article, we review current blockchain databases and then focus on two well-known blockchain databases, BigchainDB and FalconDB, to illustrate their architecture and design aspects in more detail. BigchainDB is a distributed database that integrates blockchain properties to enhance immutability and decentralization and to provide a high transaction rate, low latency, and accurate queries. Its architecture consists of three layers: the transaction layer, the consensus layer, and the data model layer. FalconDB, on the other hand, is a shared database that allows multiple clients to collaborate on the database securely and efficiently, even if they have limited resources. It has two layers, the authentication layer and the consensus layer, which handle client requests and results. Finally, a comparison is made between the two blockchain databases, revealing that they share some characteristics, such as immutability, low latency, permissioning, horizontal scalability, decentralization, and the same consensus protocol. However, they vary in terms of database type, concurrency mechanism, replication model, cost, and the usage of smart contracts.
-
Muhammad Saifullah;Imran Sarwar Bajwa;Muhammad Ibrahim;Mutyyba Asgher 135
The Internet of Things has revolutionized every field of life through the use of artificial intelligence and machine learning. It is successfully being used for radiation monitoring and the prediction of ultraviolet and electromagnetic rays. However, there is no particular system available that can monitor and detect such waves. Therefore, the present study was designed, in which an IoT-enabled intelligent system based on machine learning was developed for the prediction of radiation and its effects on human beings. Moreover, a sensor-based system was installed to detect harmful radiation present in the environment; this system is able to alert humans within the danger zone with a buzzer so that they can move to a safer place. Along with this automatic sensor system, a self-created dataset was compiled in which the sensor values were recorded. Furthermore, to study the effects of these rays, the researchers used Support Vector Machine, Gaussian Naïve Bayes, Decision Tree, Extra Trees, Bagging, Random Forest, Logistic Regression, and Adaptive Boosting classifiers. The results give high accuracy and show that the proposed system is reliable and accurate for the detection and monitoring of waves. Furthermore, for the prediction of the outcome, the Adaptive Boosting classifier showed the best accuracy of 81.77% compared with the other classifiers.
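A sketch of the classifier-comparison step with the best-performing Adaptive Boosting model is shown below; the sensor CSV layout and column names are hypothetical:

```python
# Sketch of the classifier comparison with AdaBoost, which the study reports as the
# best performer; the sensor CSV layout here is hypothetical.
import pandas as pd
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

df = pd.read_csv("radiation_sensor_readings.csv")   # hypothetical self-created dataset
X = df.drop(columns=["effect"])                      # sensor readings (e.g., UV, EMF levels)
y = df["effect"]                                     # labelled outcome class

ada = AdaBoostClassifier(n_estimators=100, random_state=42)
scores = cross_val_score(ada, X, y, cv=5)
print(f"AdaBoost mean CV accuracy: {scores.mean():.4f}")
```
-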
Rawia Elarabi;Abdelrahman Elsharif Karrar;Murtada El-mukashfi El-taher 148
Classification systems can significantly assist the medical sector by allowing for the precise and quick diagnosis of diseases; as a result, both doctors and patients save time. A possible way of identifying risk variables is to use machine learning algorithms. Non-surgical technologies, such as machine learning, are trustworthy and effective in categorizing healthy and heart-disease patients, and they save time and effort. The goal of this study is to create a medical intelligent decision support system based on machine learning for the diagnosis of heart disease. We used a mixed feature creation (MFC) technique to generate new features from the UCI Cleveland Cardiology dataset. We select the most suitable features using the Least Absolute Shrinkage and Selection Operator (LASSO), Recursive Feature Elimination with Random Forest feature selection (RFE-RF), and the best features of both LASSO and RFE-RF (BLR). Cross-validation and grid-search methods are used to optimize the parameters of the estimators used in applying these algorithms. Classifier performance metrics, including classification accuracy, specificity, sensitivity, precision, and F1-score for each classification model, along with execution time and RMSE, are presented independently for comparison. Our proposed work finds the best potential outcome across all available prediction models and improves the system's performance, allowing physicians to diagnose heart patients more accurately.
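A sketch of the two feature-selection routes (LASSO and RFE with a random forest) on the Cleveland data is given below; the file path, column names, and the intersection used for BLR are assumptions, not the authors' exact preprocessing:

```python
# Sketch of LASSO and RFE-RF feature selection on the UCI Cleveland heart data;
# the CSV path, "target" column, and the intersection-based BLR reading are assumptions.
import pandas as pd
from sklearn.linear_model import LassoCV
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE

df = pd.read_csv("cleveland.csv")          # hypothetical local copy of the UCI dataset
X, y = df.drop(columns=["target"]), df["target"]

# LASSO: features with non-zero coefficients survive.
lasso = LassoCV(cv=5).fit(X, y)
lasso_feats = set(X.columns[lasso.coef_ != 0])

# RFE-RF: recursively drop the weakest features according to a random forest.
rfe = RFE(RandomForestClassifier(n_estimators=200, random_state=42),
          n_features_to_select=8).fit(X, y)
rfe_feats = set(X.columns[rfe.support_])

blr_feats = lasso_feats & rfe_feats        # "best of both" (one plausible reading of BLR)
print(sorted(blr_feats))
```
-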
The intelligent transportation system has made a huge leap in the level of human services, which has had a positive impact on users' quality of life. On the other hand, these services are becoming a new source of risk due to the use of data collected from vehicles, on which intelligent systems rely to create automatic contextual adaptation. Most of the popular privacy protection methods, such as dummies and obfuscation, cannot be used with many services because of their impact on the accuracy of the service itself, since they depend on changing the number of vehicles or their physical locations. This research presents a new approach based on shuffling the nicknames of vehicles. It fully maintains the quality of the service and prevents permanently tracking users, penetrating their privacy, revealing their whereabouts, or discovering additional details about the nature of their behavior and movements. Our approach is based on creating a central nickname pool in the cloud as well as distributed subpools in fog nodes to avoid delays and overloading of the central architecture. Finally, we demonstrate through simulation and discussion of examples the superiority of the proposed approach and its ability to adapt to new services and provide an effective level of protection. In the comparison, we rely on the well-known privacy criteria: entropy, ubiquity, and performance.
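As an illustration of the entropy criterion used in the comparison, a short sketch of how pseudonym (nickname) diversity translates into attacker uncertainty; this only illustrates the metric, not the full shuffling scheme:

```python
# Sketch of the entropy criterion for pseudonym (nickname) privacy: the more uniformly
# a vehicle's observed nicknames are spread, the higher the attacker's uncertainty.
import math
from collections import Counter

def nickname_entropy(observed_nicknames):
    """Shannon entropy (bits) of the nickname distribution seen by an observer."""
    counts = Counter(observed_nicknames)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

print(nickname_entropy(["N1"] * 10))                     # one fixed nickname: fully linkable
print(nickname_entropy(["N1", "N2", "N3", "N4"] * 3))    # even shuffling over 4 names: 2 bits
```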
-
In this paper, the performance of an autoencoder-based OFDM communication system is compared with the IEEE 802.11a Wireless LAN system (Wi-Fi). The proposed autoencoder-based OFDM system is composed of the following steps. First, a single sub-carrier transmitter/channel/receiver chain is created by an autoencoder, and the learning process of this single-sub-carrier autoencoder generates a constellation map. Second, multiple sub-carrier autoencoders are bundled in parallel, with an IFFT and FFT inserted before and after the channel, to configure the OFDM system. Finally, the receiver part of the OFDM communication system is updated by a re-learning process to adapt to channel conditions such as a multipath channel. For performance comparison, IEEE 802.11a and the proposed autoencoder-based OFDM system are compared. For channel estimation, Wi-Fi uses an initial long preamble to measure the channel condition, whereas the autoencoder needs a re-learning process to create an equalizer that compensates for the distortion caused by the transmission channel. Therefore, this autoencoder-based system has a basic advantage over the Wi-Fi system. For the comparison, additive random noise and two-wave and four-wave multipath are assumed in the transmission path with no inter-symbol interference. A simulation was performed to compare the conventional system and the autoencoder. As a result of the simulation, the autoencoder properly generated automatic constellations with QPSK, 16QAM, and 64QAM. In a previous simulation, the received data was re-learned and the performance was poor, but the performance improved by initializing the reception with random values. A function equivalent to an equalizer for multipath channels has been realized in the OFDM system. Error correction is not included at this time; as future work, we plan to make further improvements by incorporating error correction.
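For reference, the OFDM framing that the autoencoder is wrapped in (IFFT at the transmitter, FFT and a one-tap equalizer at the receiver) can be sketched with fixed QPSK and a known toy channel; in the paper, the autoencoder learns the mapping and equalization instead:

```python
# Sketch of conventional OFDM framing with a toy 2-tap multipath channel and a
# one-tap frequency-domain equalizer (the autoencoder learns this part in the paper).
import numpy as np

n_sc = 64                                             # number of sub-carriers (assumed)
qpsk = np.array([1+1j, 1-1j, -1+1j, -1-1j]) / np.sqrt(2)
symbols = qpsk[np.random.randint(0, 4, n_sc)]         # frequency-domain QPSK symbols

tx_time = np.fft.ifft(symbols)                        # OFDM modulation (IFFT)
cp = tx_time[-16:]                                    # cyclic prefix
tx = np.concatenate([cp, tx_time])

h = np.array([1.0, 0.4])                              # toy 2-path channel
noise = 0.01 * (np.random.randn(len(tx)) + 1j * np.random.randn(len(tx)))
rx = np.convolve(tx, h)[:len(tx)] + noise

rx_freq = np.fft.fft(rx[16:16 + n_sc])                # strip CP, back to frequency domain
H = np.fft.fft(h, n_sc)                               # known here; estimated/learned in practice
equalized = rx_freq / H                               # one-tap equalizer per sub-carrier
print(np.mean(np.abs(equalized - symbols)))           # small residual error
```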
-
The widespread use of Cloud Computing, Internet of Things (IoT), and social media in the Information Communication Technology (ICT) field has resulted in continuous and unavoidable cyber-attacks on users and critical infrastructures worldwide. Traditional security measures such as firewalls and encryption systems are not effective in countering these sophisticated cyber-attacks. Therefore, Intrusion Detection and Prevention Systems (IDPS) are necessary to reduce the risk to an absolute minimum. Although IDPSs can detect various types of cyber-attacks with high accuracy, their performance is limited by a high false alarm rate. This study proposes a new technique called Fuzzy Logic - Objective Risk Analysis (FLORA) that can significantly reduce false positive alarm rates and maintain a high level of security against serious cyber-attacks. The FLORA model has a high fuzzy accuracy rate of 90.11% and can predict vulnerabilities with a high level of certainty. It also has a mechanism for monitoring and recording digital forensic evidence which can be used in legal prosecution proceedings in different jurisdictions.
-
In recent years, with increased digital transformation and long hours spent in front of devices, clinicians have observed that the prolonged use of visual display units (VDUs) can result in a certain symptom complex, which has been defined as computer vision syndrome (CVS). This syndrome has many causes, such as refractive errors, poor computer design, workplace ergonomics, and highly demanding visual tasks. This research focuses on eliminating one of the CVS symptoms, dry eye syndrome, which is caused by an infrequent eye blink rate while using a smart device for a long time. This research attempts to identify the limitations of current tools and, in addition, explores other use cases in which the solution can be utilized based on each vertical and its needs.
-
Mohsin Shaikh;Irfan Ali Tunio;Syed Muhammad Shehram Shah;Fareesa Khan Sohu;Abdul Aziz;Ahmad Ali 207
Traditional methods for data mining typically assume that the data is small, centralized, memory-resident, and static. This assumption is no longer acceptable, because datasets are growing very fast and becoming huge over time. There is a fast-growing need to manage data with efficient mining algorithms. In such a scenario it is inevitable to carry out data mining in a distributed environment, and Frequent Itemset Mining (FIM) is no exception. Thus, the need for an efficient incremental mining algorithm arises. We propose Distributed Incremental Approximate Frequent Itemset Mining (DIAFIM), an incremental FIM algorithm that works in a distributed parallel MapReduce environment. The key contribution of this research is devising an incremental mining algorithm that works in the distributed parallel MapReduce environment.
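A toy sketch of the MapReduce counting core that distributed FIM algorithms build on is shown below; DIAFIM's incremental and approximate machinery is not reproduced here:

```python
# Toy MapReduce-style frequent itemset counting: mappers emit candidate itemsets per
# transaction partition, the reducer sums counts and applies a minimum-support threshold.
from itertools import combinations
from collections import Counter

def map_partition(transactions, max_size=2):
    """Mapper: emit (itemset, 1) pairs for each transaction in this partition."""
    for t in transactions:
        items = sorted(set(t))
        for k in range(1, max_size + 1):
            for itemset in combinations(items, k):
                yield itemset, 1

def reduce_counts(mapped_streams, min_support):
    """Reducer: sum counts across partitions, keep frequent itemsets."""
    counts = Counter()
    for stream in mapped_streams:
        for itemset, c in stream:
            counts[itemset] += c
    return {s: c for s, c in counts.items() if c >= min_support}

partitions = [[["a", "b"], ["a", "c"]], [["a", "b", "c"], ["b", "c"]]]  # toy shards
print(reduce_counts((map_partition(p) for p in partitions), min_support=2))
```
-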
Ashour Ali;Shahrul Azman Mohd Noah;Lailatul Qadri Zakaria 212
Ontologies are knowledge containers in which information about a specified domain can be shared and reused. An event happens at a specific time and place, in which some actors engage and exhibit specific action features. Several ontology models are based on events, called event-based models, where the event is an individual entity or concept connected with other entities to describe the underlying ontology, because the event can be composed of spatiotemporal extents. However, current event-based ontologies are inadequate to bridge the gap between spatiotemporal extents and participants to describe a specific domain event. This paper reviews, describes, and compares the existing event-based ontologies. The paper compares and contrasts various ways of representing events and how they have been modelled, constructed, and integrated with the ontologies. The primary criterion for comparison is the ability to represent the spatial and temporal extent of events and the participants in the event.