Volume 23 Issue 10
-
Suguru Kuniyoshi;Rie Saotome;Shiho Oshiro;Tomohisa Wada 1
This paper proposes a method to extend Inter-Carrier Interference (ICI) canceling Orthogonal Frequency Division Multiplexing (OFDM) receivers for 5G mobile systems to spatial-multiplexing 2×2 MIMO (Multiple Input Multiple Output) systems, to support high-speed ground transportation services by linear motor cars traveling at 500 km/h. In Japan, linear-motor high-speed ground transportation service is scheduled to begin in 2027. To expand the coverage area of base stations, 5G mobile systems for high-speed trains will have multiple base station antennas transmitting the same downlink (DL) signal, forming an expanded cell along the train rails. A 5G terminal in a fast-moving train receives the forward and backward antenna signals Doppler-shifted in opposite directions, so the receiver in the train may have trouble estimating the exact channel transfer function (CTF) for demodulation. A receiver in such a high-speed train sees a transmission channel composed of multiple Doppler-shifted propagation paths, and the resulting loss of sub-carrier orthogonality due to the Doppler-spread channel causes ICI. The ICI canceller is realized in three steps. First, using the Demodulation Reference Symbol (DMRS) pilot signals, it estimates three parameters of each multipath component: attenuation, relative delay, and Doppler shift. Secondly, based on these sets of parameters, the CTF from sender sub-carrier number n to receiver sub-carrier number l is generated; for n≠l, the CTF corresponds to an ICI factor. Thirdly, once the ICI factors are obtained, ICI canceling is realized by applying the inverse ICI operation with a multi-tap equalizer. ICI canceling performance has been simulated under severe channel conditions, namely 500 km/h with an eight-path reversed-Doppler-shift channel, for QPSK, 16QAM, 64QAM, and 256QAM modulations. In particular, for the 2×2 MIMO QPSK and 16QAM modulation schemes, BER (Bit Error Rate) improvement was observed when the number of taps in the multi-tap equalizer was set to 31 or more, at a moving speed of 500 km/h in the eight-path reversed-Doppler-shift environment.
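A minimal NumPy sketch (not the authors' implementation; path gains, delays, and Doppler shifts are assumed to be already estimated from the DMRS pilots) of the idea behind steps two and three: build the subcarrier-to-subcarrier CTF of a two-path Doppler-spread channel, treat its off-diagonal entries as ICI factors, and undo them with a banded multi-tap equalizer.

import numpy as np

N = 64                                   # subcarriers (small, for illustration)
paths = [                                # (gain, delay [samples], normalized Doppler)
    (1.0 + 0.0j, 0, +0.20),              # "forward" base-station antenna
    (0.8 + 0.0j, 5, -0.20),              # "backward" antenna, opposite Doppler shift
]

# Step 2: per-sample time-domain channel, then CTF Hf[l, n] via DFT matrices.
Ht = np.zeros((N, N), dtype=complex)
k = np.arange(N)
for gain, delay, eps in paths:
    rot = gain * np.exp(2j * np.pi * eps * k / N)          # Doppler rotation per sample
    Ht += np.diag(rot) @ np.roll(np.eye(N), delay, axis=0)  # circular delay (CP assumed)

F = np.exp(-2j * np.pi * np.outer(k, k) / N) / np.sqrt(N)   # unitary DFT
Hf = F @ Ht @ F.conj().T                 # Hf[l, n]: subcarrier n -> l; n != l entries are ICI

# Transmit random QPSK and receive with ICI (noise omitted for clarity).
X = (np.random.choice([-1, 1], N) + 1j * np.random.choice([-1, 1], N)) / np.sqrt(2)
Y = Hf @ X

# Step 3: multi-tap equalizer -- keep only a band of `taps` ICI terms around
# the diagonal and invert that banded matrix.
taps = 31
band = np.abs(np.subtract.outer(k, k)) <= taps // 2
X_hat = np.linalg.solve(np.where(band, Hf, 0), Y)

print("one-tap equalizer error :", np.abs(X - Y / np.diag(Hf)).mean())
print(f"{taps}-tap equalizer error:", np.abs(X - X_hat).mean())

-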
The primary aim of this study is to investigate the influence of Service Provider Quality (SPQ), System Quality (SQ), Information Quality (IQ), and Training Quality (TQ) on the interconnected aspect of organizational performance known as growth and development (GD). The study examined the influence of information systems (IS) on organizational performance and provided a theory-based technique for conducting the research. The theoretical foundation for this study is derived from the IS success model [1], which is widely employed in information systems research. The study's framework incorporates several novel elements, drawn from a comprehensive review of both recent and earlier literature, which researchers have utilized to evaluate the dimensions of the model in [1]. In this study, we collected data from a diverse group of 348 individuals representing various industries through a web-based questionnaire. The collected data were analyzed using SPSS. We conducted a multiple regression analysis involving 15 factors to assess several hypotheses regarding the relationship between the independent construct, IS effectiveness, and the dependent construct, organizational performance. Several noteworthy descriptive statistics emerged, which hold significance for management. The study's findings strongly indicate that information systems exert a significant and beneficial influence on organizational performance. To sustain and continually enhance organizational effectiveness, the study recommends that managers periodically scrutinize and assess their information systems.
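A minimal sketch (synthetic responses, not the study's survey data) of the kind of multiple regression the abstract describes: regressing the growth-and-development (GD) construct on SPQ, SQ, IQ, and TQ scores with statsmodels.

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 348                                   # sample size reported in the abstract
df = pd.DataFrame({
    "SPQ": rng.normal(3.8, 0.6, n),       # service provider quality (Likert-style construct means)
    "SQ":  rng.normal(3.9, 0.6, n),       # system quality
    "IQ":  rng.normal(4.0, 0.5, n),       # information quality
    "TQ":  rng.normal(3.5, 0.7, n),       # training quality
})
# Hypothetical response: GD driven by the four predictors plus noise.
df["GD"] = 0.5 + 0.3 * df.SPQ + 0.25 * df.SQ + 0.2 * df.IQ + 0.15 * df.TQ + rng.normal(0, 0.4, n)

X = sm.add_constant(df[["SPQ", "SQ", "IQ", "TQ"]])
model = sm.OLS(df["GD"], X).fit()
print(model.summary())                    # coefficients, t-statistics and p-values per hypothesis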
-
Recently there has been an increase in interest and competition in sharing economy platforms. Success stories of companies in this field, such as Uber, have spread widely, but many other companies have failed. We studied and analyzed the factors that affect users' participation in sharing economy platforms as an essential part of this system, and how to maintain consumers' intention to use them without compromising consumer satisfaction, as this has become an issue of great importance on the path to the success of sharing platforms. Relying on the expanded valence framework and expectation confirmation theory as a basis, we constructed hypotheses on what influences the intention to participate in sharing platforms. Results show that system quality, trust, perceived benefits, and satisfaction are important factors that positively influence the intention to continue participating. This research is expected to help researchers move forward with related future research and to help business managers understand user insights and integrate them into their business models to support the success, development, and expansion of their businesses in the Kingdom of Saudi Arabia.
-
In today's worldwide environment, information systems (IS) usage is growing swiftly. As a result, IS now affects every aspect of life and serves as a general growth tool for individuals, groups, and governments. Information system success (ISS) is affected by customer satisfaction and customers' acceptance of these services. This issue is critical for SMEs, especially in Saudi Arabia, as SMEs often lack IT experience and resources. The research question is: which IS success factors will improve customer satisfaction and SME performance in Saudi Arabia? Data on how Saudi SMEs succeed with IS were acquired through an online survey of citizen and resident users in Saudi Arabia, representing a range of ages and educational backgrounds. In the IS success factors evaluation, which assessed the degree of agreement and disagreement with specific statements related to the six dimensions based on the empirical data, it was found that the users agreed with the majority of the claims. For users, usability is the most important feature. This study found that enhancing the system's overall user experience might lead to higher overall satisfaction.
-
The hospital situation, timing, and patient restrictions have become obstacles to an optimal therapy session. The crowdedness of a hospital might lead to a tight schedule and a shorter period of therapy. This condition can put a post-stroke patient in a dilemma, since they need regular treatment to recover their nervous system. In this work, we propose an in-house and uncomplicated serious game system that can be used for physical therapy. A Kinect camera is used to capture the depth image stream of the human skeleton, after which users control the game with hand gestures; voice recognition is deployed to ease play. Users must complete the given challenges to obtain a more significant outcome from this therapy system. Subjects use their upper limbs and hands to capture 3D objects appearing at different speeds and positions; as the challenge becomes more substantial, speed and position are increased and randomized. Each captured object raises the score, and the scores are further evaluated to correlate with therapy progress. Users were delighted with the system and eager to use it as part of their daily exercise. The experimental studies show a comparison between score and difficulty that represents characteristics of the user and the game. Users tend to adapt quickly to the easy and medium levels, while the high level requires better focus and proper hand-eye synchronization to capture the 3D objects. Statistical analysis of the usability test at a significance level of α = 0.05 shows that the proposed game is accessible, even without specialized training. It is suitable not only for therapy but also for fitness, because it can be used for body exercise. The experimental results are very satisfying: most users enjoyed the game and familiarized themselves with it quickly. The evaluation study demonstrates user satisfaction and positive perception during testing. Future work on the proposed serious game might involve haptic devices to stimulate physical sensation.
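A minimal sketch (synthetic scores and an assumed SUS-style questionnaire benchmark of 68) of the α = 0.05 usability analysis the abstract mentions: a one-sample t-test of whether mean usability exceeds the benchmark.

import numpy as np
from scipy import stats

# Hypothetical per-subject usability scores collected after the game sessions.
scores = np.array([72.5, 80.0, 77.5, 85.0, 70.0, 90.0, 82.5, 75.0, 87.5, 78.0])
t, p_two_sided = stats.ttest_1samp(scores, popmean=68.0)
p_one_sided = p_two_sided / 2 if t > 0 else 1 - p_two_sided / 2
print(f"t = {t:.2f}, one-sided p = {p_one_sided:.4f}, "
      f"{'accessible' if p_one_sided < 0.05 else 'inconclusive'} at alpha = 0.05")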
-
Tagred A. Alkasmy;Rehab K. Qarout;Kaouther Laabidi 44
Augmented Reality (AR) is an emerging technology and a vibrant field; it has become common in application development, especially in smartphone (mobile) applications. AR technology has grown substantially during the past decade in many fields. Therefore, it is necessary to determine the optimal approach to building the final product by evaluating the performance of each approach separately on a specific task. In this work we evaluated overall CPU and RAM performance for several types of markerless Augmented Reality applications using multiple objects in mobile development. The results obtained show that objects with a smaller number of vertices perform steadily, without oscillation. The object superior to all the others was the sphere, which performed best when processed, with values closest to the minimum CPU and RAM usage. -
Sarah AlBarakati;Sally AlQarni;Rehab K. Qarout;Kaouther Laabidi 49
Computer architecture serves as a link between application requirements and underlying technology capabilities, and the computational and storage demands of technical, mathematical, medical, and business applications are constantly increasing. Machine learning has grown and is now used in many fields, and it performs better than traditional computing in applications that are implemented using mathematical algorithms. Mathematical algorithms require more extensive and quicker calculations, higher computer architecture specifications, and longer execution times. Therefore, there is a need to improve the use of computer hardware such as the CPU, memory, etc., and optimization plays a central role in reducing execution time and improving the utilization of computer resources. Given the importance of execution time in implementing a supervised machine learning module (linear regression), in this paper we focus on optimizing machine learning algorithms: we write a diabetes prediction program and apply Particle Swarm Optimization (PSO) to it to reduce the execution time and improve the utilization of computer resources. Finally, a massive improvement in execution time was observed.
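A minimal sketch (scikit-learn's built-in diabetes data and a plain NumPy PSO, not the paper's program) in the spirit of the abstract: Particle Swarm Optimization searching linear-regression weights for a diabetes prediction model by minimizing mean squared error.

import numpy as np
from sklearn.datasets import load_diabetes

X, y = load_diabetes(return_X_y=True)
X = np.c_[np.ones(len(X)), X]                      # bias column
dim, n_particles, iters = X.shape[1], 30, 200
rng = np.random.default_rng(0)

def mse(w):                                        # fitness: prediction error
    return np.mean((X @ w - y) ** 2)

pos = rng.normal(0, 1, (n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([mse(w) for w in pos])
gbest = pbest[pbest_val.argmin()].copy()

w_in, c1, c2 = 0.7, 1.5, 1.5                       # inertia and acceleration coefficients
for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w_in * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([mse(w) for w in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best MSE found by PSO:", pbest_val.min())

-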
Adil CHEBIR;Ibtissam EL MOURY;Adil ECHCHELH;Omar TAOUAB 57
Since 2009, Morocco has had a law governing the processing of personal data, law 09-08, and a supervisory authority, the CNDP (National Commission for the Protection of Personal Data). In May 2018, the European General Data Protection Regulation (GDPR) entered into force; it applies outside the EU in certain cases and therefore to certain Moroccan companies. The question of the protection of personal data primarily concerns the customer. The latter may not only be a victim of ICT-related crime, but also face risks linked to the abusive collection and processing of his or her personal data by the private and public sectors. Often the customer does not really know how their data is stored, for how long, or for what purpose. This raises the question of satisfying customer requirements, in particular for organizations that have adopted a quality approach based on the ISO 9001 standard. In order to master these constraints, Moroccan companies have to adopt strategies based on modern quality management techniques, especially the adoption of principles issued from the international standard ISO 9001 while conforming to law 09-08. It is through ISO 9001 and law 09-08 that these companies can refer to recognized approaches in terms of quality and compliance. The major challenge for these companies is to have a quality approach that allows the coexistence of law 09-08 and the ISO 9001 standard, and this article deals with this specific context. -
This paper presents the design and optimization of a wideband four-element triangular dielectric resonator antenna (TDRA) using PSO. The proposed antenna's radiation characteristics were extracted using Ansoft HFSS software. Over the 5-7 GHz band, the four-element antenna provides nearly 21 percent bandwidth, and the optimized design gives a 5.82 dBi peak gain. The radiation patterns' symmetry and uniformity are maintained throughout the operating bandwidth. For WLAN (IEEE 802.11) and WiMAX (IEEE 802.16) applications, the proposed antenna exhibits a consistent symmetric monopole-type radiation pattern with low cross-polarization. The proposed antenna's performance was compared to that of other dielectric resonator antenna (DRA) shapes, and it was found that the TDRA uses a much smaller radiating area while providing better performance than other DRA shapes, and the PSO-optimized antenna increases the gain of the antenna.
-
The recent growth in the use of mobile devices has contributed to increased computing and storage requirements. Cloud computing has been used over the past decade to cater to computational and storage needs over the internet. However, the use of mobile applications such as Augmented Reality (AR), M2M communications, V2X communications, and the Internet of Things (IoT) led to the emergence of mobile cloud computing (MCC). All data from mobile devices is offloaded and computed on the cloud, removing the limitations associated with mobile devices. However, delays induced by the location of data centers led to the birth of edge computing technologies. In this paper, we discuss one of the edge computing technologies, the cloudlet. A cloudlet brings the cloud close to the end user, reducing delay and response time. An algorithm is proposed for scheduling tasks on the cloudlet by considering the load on each VM. Simulation results indicate that the proposed algorithm provides 12% and 29% improvement over EMACS and QRR, respectively, while balancing the load.
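A minimal sketch (hypothetical VM capacities and task lengths; not the paper's exact algorithm or its EMACS/QRR baselines) of load-aware cloudlet scheduling: each arriving task goes to the VM whose current load is smallest.

from dataclasses import dataclass, field

@dataclass
class VM:
    name: str
    mips: float                      # processing capacity
    queued: float = 0.0              # total instructions already assigned
    tasks: list = field(default_factory=list)

    @property
    def load(self) -> float:         # expected completion time of the current queue
        return self.queued / self.mips

def schedule(tasks, vms):
    for task_id, length in tasks:    # length in million instructions
        target = min(vms, key=lambda v: v.load)   # least-loaded VM wins the task
        target.queued += length
        target.tasks.append(task_id)
    return vms

vms = [VM("vm1", 1000), VM("vm2", 2000), VM("vm3", 1500)]
tasks = list(enumerate([4000, 1000, 2500, 6000, 1500, 3000]))
for vm in schedule(tasks, vms):
    print(f"{vm.name}: tasks={vm.tasks}, makespan={vm.load:.2f}s")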
-
In the present scenario, enormous amounts of data are produced every second. These data also contain private information from sources including media platforms, the banking sector, finance, healthcare, and criminal histories. Data mining is a method for searching through and analyzing massive volumes of data to find usable information. Preserving personal data during data mining has become difficult, so privacy-preserving data mining (PPDM) is used to do so. Data perturbation is one of the several tactics used by the PPDM data privacy protection mechanism; in perturbation, datasets are perturbed in order to preserve personal information, addressing both data accuracy and data privacy. This paper explores and compares several perturbation strategies that may be used to protect data privacy. For this experiment, two perturbation techniques based on random projection and principal component analysis were used: Improved Random Projection Perturbation (IRPP) and the Enhanced Principal Component Analysis based Technique (EPCAT). The Naive Bayes classification algorithm is used for the data mining step. These methods are employed to assess the precision, run time, and accuracy of the experimental results. The random projection-based technique (IRPP) is determined to be the best perturbation method under Naive Bayes classification for both the cardiovascular and hypothyroid datasets.
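A minimal sketch (scikit-learn's built-in breast-cancer data standing in for the cardiovascular and hypothyroid datasets, and plain Gaussian random projection rather than the paper's IRPP variant) of perturbation followed by Naive Bayes classification, comparing accuracy on the original and perturbed data.

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.random_projection import GaussianRandomProjection
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)
X_pert = GaussianRandomProjection(n_components=15, random_state=0).fit_transform(X)

for label, data in [("original", X), ("perturbed", X_pert)]:
    X_tr, X_te, y_tr, y_te = train_test_split(data, y, test_size=0.3, random_state=0)
    acc = accuracy_score(y_te, GaussianNB().fit(X_tr, y_tr).predict(X_te))
    print(f"{label:9s} accuracy: {acc:.3f}")   # privacy gained at a (usually small) accuracy cost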
-
Intrusion detection has been widely studied in both industry and academia, but cybersecurity analysts always want more accuracy and global threat analysis to secure their systems in cyberspace. Big data represents a great challenge for intrusion detection systems, making it hard to monitor and analyze this large volume of data using traditional techniques. Recently, deep learning has emerged as a new approach which enables the use of big data with a low training time and high accuracy rate. In this paper, we propose an approach for an IDS based on cloud computing and the integration of big data and deep learning techniques to detect different attacks as early as possible. To demonstrate the efficacy of this system, we implement the proposed system within the Microsoft Azure cloud, as it provides both processing power and storage capabilities, using a convolutional neural network (CNN-IDS) with the distributed computing environment Apache Spark, integrated with the Keras deep learning library. We study the performance of the model in two categories of classification (binary and multiclass) using the CSE-CIC-IDS2018 dataset. Our system showed great performance due to the integration of the deep learning technique and the Apache Spark engine.
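A minimal Keras sketch (synthetic flow features in place of CSE-CIC-IDS2018, and without the Spark distribution layer or Azure deployment) of the kind of 1-D CNN binary classifier the abstract describes.

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

n_flows, n_features = 5000, 80            # placeholder; the real dataset has ~80 flow features
rng = np.random.default_rng(0)
X = rng.normal(size=(n_flows, n_features, 1)).astype("float32")
y = rng.integers(0, 2, n_flows)           # 0 = benign, 1 = attack (placeholder labels)

model = keras.Sequential([
    layers.Input(shape=(n_features, 1)),
    layers.Conv1D(32, kernel_size=3, activation="relu"),
    layers.MaxPooling1D(2),
    layers.Conv1D(64, kernel_size=3, activation="relu"),
    layers.GlobalMaxPooling1D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),   # use a softmax over classes for the multiclass case
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=128, validation_split=0.2, verbose=2)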
-
Misbah Iram;Saif Ur Rehman;Shafaq Shahid;Sayeda Ambreen Mehmood 97
Sentiment analysis using social network platforms such as Twitter has achieved tremendous results. Twitter is an online social networking site that contains a rich amount of data. The platform is known as an information channel corresponding to different sites and categories. Tweets are most often publicly accessible, with very few limitations and security options available. Twitter also provides powerful tools to enhance its utility, including a powerful search system that makes recently posted tweets publicly accessible by keyword. As a popular social medium, Twitter has the potential for interconnectivity of information, reviews, and updates, all of which are important for engaging the targeted population. In this work, numerous methods that perform classification of tweet sentiment on Twitter are discussed. There has been a lot of work in the field of sentiment analysis of Twitter data. This study provides a comprehensive analysis of the most standard and widely applicable techniques for opinion mining, based on machine learning and lexicon-based approaches, along with their metrics. The proposed work is helpful for analyzing the information in tweets, where opinions are highly unstructured, heterogeneous, and polarized as positive, negative, or neutral. In order to validate the performance of the proposed framework, an extensive series of experiments has been performed on a real-world Twitter dataset to show the effectiveness of the proposed framework. This research effort also highlights the recent challenges in the field of sentiment analysis along with the future scope of the proposed work.
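A minimal sketch (toy tweets, not the paper's Twitter dataset) contrasting the two families the survey covers: a machine-learning classifier (TF-IDF plus logistic regression) and a tiny lexicon-based scorer.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

tweets = ["love this phone, great battery", "worst service ever, totally disappointed",
          "pretty happy with the update", "terrible app, keeps crashing",
          "amazing camera quality", "not worth the money, bad experience"]
labels = ["pos", "neg", "pos", "neg", "pos", "neg"]

# Machine-learning approach: vectorize the text and fit a classifier.
ml_clf = make_pipeline(TfidfVectorizer(), LogisticRegression()).fit(tweets, labels)
print("ML prediction:", ml_clf.predict(["battery life is great but app is terrible"]))

# Lexicon-based approach: sum word polarities from a (toy) sentiment lexicon.
LEXICON = {"love": 1, "great": 1, "happy": 1, "amazing": 1,
           "worst": -1, "disappointed": -1, "terrible": -1, "bad": -1, "crashing": -1}
def lexicon_score(text):
    score = sum(LEXICON.get(w.strip(",."), 0) for w in text.lower().split())
    return "pos" if score > 0 else "neg" if score < 0 else "neutral"
print("Lexicon prediction:", lexicon_score("battery life is great but app is terrible"))

-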
The study identifies vulnerabilities and security issues in wireless networks. To achieve this, the research focuses on packet flow analysis, end-to-end data communication, and the security challenges (cybercrime, insider threats, attackers, hacktivists, malware, and ransomware). To address them, a systematic literature review and a demonstration tool, the Wireshark network analyzer, were used. The practical demonstration identifies the packet flow, packet length over time, data flow statistics, end-to-end packet flow, reached and lost packets in the network, and input/output packet statistics graphs. A model is then developed to secure the wireless network and prevent vulnerabilities arising from the network security challenges, and applying this model to investigate the security challenges and vulnerabilities of cloud computing services helps fulfill the network security goals in wireless networks. Finally, the research provides a model for investigating the security challenges and vulnerabilities of cloud computing services in wireless networks.
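A minimal scapy sketch (synthetic packets standing in for a Wireshark capture of the network) of the packet-flow statistics the study reads off the analyzer: packet count, per-protocol counts, and length distribution. A real analysis would load a capture with rdpcap("file.pcap") instead of building packets in code.

from collections import Counter
from scapy.all import IP, TCP, UDP, Raw

# Synthetic stand-in traffic: a few TCP and UDP packets of varying sizes.
packets = [IP(dst="10.0.0.1") / TCP(dport=443) / Raw(b"x" * n) for n in (40, 120, 40, 800)]
packets += [IP(dst="10.0.0.2") / UDP(dport=53) / Raw(b"y" * n) for n in (60, 60, 300)]

lengths = [len(p) for p in packets]
proto = Counter("TCP" if TCP in p else "UDP" for p in packets)
print(f"packets: {len(packets)}, bytes: {sum(lengths)}, avg length: {sum(lengths)/len(lengths):.1f} B")
print("per-protocol counts:", dict(proto))
print("top packet sizes:", Counter(lengths).most_common(3))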
-
Ajeet K. Jain;PVRD Prasad Rao;K. Venkatesh Sharma 115
Deep learning incorporates various optimization techniques motivated by advances in pragmatic optimization algorithms, and their usage plays a central role in machine learning. In the recent past, new variants of various optimizers have been put into practice, and their suitability and applicability have been reported across various domains. The resurgence of novelty runs from Stochastic Gradient Descent through convex, non-convex, and derivative-free approaches. Against this backdrop, choosing a best-fit or appropriate optimizer is an important consideration in deep learning, as these workhorse engines determine the final performance predicted by the model. Moreover, an increasing number of deep layers entails higher complexity in hyper-parameter tuning and consequently a need to search for a befitting optimizer. We empirically examine the most popular and widely used optimizers on various datasets and networks, such as MNIST and GANs, among others. The pragmatic comparison focuses on their similarities, differences, and the possibility of their suitability for a given application. Additionally, the recent optimizer variants are highlighted along with their subtleties. The article emphasizes their critical role and pinpoints supporting considerations when choosing among them.
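A minimal Keras sketch (a small MLP trained for one epoch on MNIST, just to illustrate the comparison methodology rather than reproduce the paper's benchmark) of trying several optimizers and reporting validation accuracy.

from tensorflow import keras
from tensorflow.keras import layers

(x_tr, y_tr), (x_te, y_te) = keras.datasets.mnist.load_data()
x_tr, x_te = x_tr.reshape(-1, 784) / 255.0, x_te.reshape(-1, 784) / 255.0

for opt in ["sgd", "rmsprop", "adam", "nadam"]:
    model = keras.Sequential([
        layers.Input(shape=(784,)),
        layers.Dense(128, activation="relu"),
        layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer=opt, loss="sparse_categorical_crossentropy", metrics=["accuracy"])
    hist = model.fit(x_tr, y_tr, epochs=1, batch_size=128,
                     validation_data=(x_te, y_te), verbose=0)
    print(f"{opt:8s} val_accuracy = {hist.history['val_accuracy'][-1]:.4f}")

-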
Rania A. Tabeidi;Hanaa F. Morse;Samia M. Masaad;Reem H. Al-shammari;Dalia M. Alsaffar 129
The greatest challenge of this century is the protection of data stored and transmitted over the network. This paper provides a new hybrid algorithm that combines the Hill cipher and the Advanced Encryption Standard (AES) algorithms to increase the efficiency of color image encryption and increase the sensitivity of the key, protecting the RGB image from key attacks. The proposed algorithm has proven its efficiency in encrypting color images with high security and in countering attacks. The strength and efficiency of combining the Hill cipher and AES were tested by statistical analysis of RGB image histograms and of the correlation of RGB images before and after encryption using the Hill cipher and the proposed algorithm, as well as by analysis of the secret key and key space to protect the RGB image from brute-force attacks. The combination of the Hill cipher and AES achieved the ability to withstand statistical attacks.
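A minimal sketch (hypothetical 2×2 key matrix, a synthetic gradient standing in for the RGB image, and pycryptodome assumed for AES) of the hybrid idea the abstract describes: a Hill-cipher pass over pixel pairs modulo 256 followed by AES encryption of the result.

import numpy as np
from Crypto.Cipher import AES

img = np.fromfunction(lambda i, j, c: (i + j + 40 * c) % 256, (64, 64, 3)).astype(np.uint8)

# Hill step: the key matrix must be invertible mod 256 (odd determinant).
K = np.array([[3, 5],
              [1, 2]])                               # det = 1
pairs = img.reshape(-1, 2).astype(np.int64)          # group pixel values in pairs
hill_out = ((pairs @ K.T) % 256).astype(np.uint8)

# AES step: encrypt the Hill output with AES-128 in CBC mode.
aes = AES.new(b"0123456789abcdef", AES.MODE_CBC, iv=b"fedcba9876543210")
cipher = aes.encrypt(hill_out.tobytes())             # length is already a multiple of 16 here

print("plain  first bytes:", img.tobytes()[:8].hex())
print("cipher first bytes:", cipher[:8].hex())

-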
An effective educational program warrants the inclusion of an innovative construction that enhances higher-education efficacy in a way that accelerates the achievement of desired results and reduces the risk of failure. Educational Decision Support Systems (EDSS) have recently become a hot topic in educational systems, facilitating the monitoring and evaluation of pupil results during their development. Insufficient information systems encounter trouble and hurdles in taking full advantage of EDSS owing to a deficit of accuracy, incorrect analysis of the characteristics, and inadequate databases. Data Mining Techniques (DMTs) provide helpful tools for finding models or patterns in data and are extremely useful in the decision-making process. Several researchers have contributed research involving distributed data mining with multi-agent technology. The rapid growth of network technology and IT use has led to the widespread use of distributed databases. This article explains the available data mining technology and the distributed data mining system framework. A distributed data mining approach is utilized in this work so that a classifier capable of predicting the success of students in the economic domain can be constructed. This research also discusses an Intelligent Knowledge Base Distributed Data Mining framework to assess the performance of students through mid-term and final-term exams employing multi-agent-system-based educational mining techniques. Using single and ensemble-based classifiers, this study investigates the factors that influence student performance in higher education and constructs a classification model that can predict academic achievement. We also discuss the importance of multi-agent systems and comparative machine learning approaches in EDSS development.
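A minimal sketch (synthetic student records, not the paper's data or its multi-agent framework) of the single-versus-ensemble comparison the abstract mentions: predicting pass/fail from mid-term, final-term, and attendance features with a decision tree and a random forest.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 400
X = np.column_stack([rng.uniform(0, 40, n),      # mid-term exam score
                     rng.uniform(0, 60, n),      # final-term exam score
                     rng.integers(50, 100, n)])  # attendance percentage
y = ((X[:, 0] + X[:, 1]) >= 50).astype(int)      # hypothetical pass rule

for name, clf in [("single decision tree", DecisionTreeClassifier(random_state=0)),
                  ("random forest (ensemble)", RandomForestClassifier(n_estimators=100, random_state=0))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")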
-
ZAIN UL ABEDIN;Muhammad Shujat Ali;Ashraf Ali;Sana Ejaz 147
Electronic voting machines (EVMs) are replacing paper ballots due to the errors involved in the manual counting process and the lengthy time required to count the votes. Even though these digital recording electronic systems are an advancement, they are vulnerable to tampering and electoral fraud. The suspected vulnerabilities in EVMs are the possibility of tampering with the EVM's memory chip or replacing it with a fake one, their simplicity, which allows them to be tampered with without requiring much skill, and the possibility of double voting. In the proposed approach, the vote data is shared among all network devices, and peer-to-peer verification is performed to ensure the vote data's authenticity. To successfully tamper with the system, all of the data stored in the nodes must be changed, which improves the proposed system's efficiency and dependability. Elections and voting are fundamental components of a democratic system, and various attempts have been made to make modern elections more flexible by utilizing digital technologies. The fundamental characteristics of free and fair elections are intractability, immutability, transparency, and the privacy of the actors involved. These correspond to several characteristics of blockchain, such as decentralized ownership, chain immutability, anonymity, and a distributed ledger. This research conducts a comparative analysis of various blockchain technologies in development and proposes a 'Blockchain-based Electronic Voting System' by weighing these technologies against the needs of the proposed solution. The primary goal of this research is to present a robust blockchain-based election mechanism that is not only reliable but also adaptable to current needs.
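A minimal sketch (plain hash-chained blocks, without the consensus network or peers a real deployment needs) of the core property a blockchain-based voting ledger relies on: any change to a recorded vote invalidates the chain.

import hashlib, json, time

def make_block(votes, prev_hash):
    block = {"timestamp": time.time(), "votes": votes, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(json.dumps(
        {k: block[k] for k in ("timestamp", "votes", "prev_hash")},
        sort_keys=True).encode()).hexdigest()
    return block

def verify(chain):
    for prev, cur in zip(chain, chain[1:]):
        recomputed = hashlib.sha256(json.dumps(
            {k: cur[k] for k in ("timestamp", "votes", "prev_hash")},
            sort_keys=True).encode()).hexdigest()
        if cur["prev_hash"] != prev["hash"] or cur["hash"] != recomputed:
            return False
    return True

chain = [make_block([], "0" * 64)]                         # genesis block
chain.append(make_block([{"voter": "v001", "choice": "A"}], chain[-1]["hash"]))
chain.append(make_block([{"voter": "v002", "choice": "B"}], chain[-1]["hash"]))

print("chain valid:", verify(chain))                       # True
chain[1]["votes"][0]["choice"] = "B"                       # attempt to tamper with a vote
print("after tampering:", verify(chain))                   # False

-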
The attack technique targeting end users through phishing URLs is very dangerous nowadays. With this technique, attackers can steal user data, take control of the system, and so on. Therefore, detecting phishing URLs early is essential. In this paper, we propose a method to detect phishing URLs based on supervised learning algorithms and abnormal behaviors extracted from URLs. Finally, based on the research results, we build a framework for detecting phishing URLs on the end-user side. The novelty and advantage of our proposed method is that the abnormal behaviors are extracted from URLs that are monitored and collected directly from attack campaigns instead of from inefficient old datasets.
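A minimal sketch (toy URLs and hand-picked lexical features, not the behaviors collected from live attack campaigns) of supervised phishing-URL detection: extract simple URL-based features and train a classifier on labeled examples.

from urllib.parse import urlparse
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def url_features(url):
    p = urlparse(url)
    return [len(url), url.count("."), url.count("-"), url.count("@"),
            int(p.scheme == "https"), int(any(c.isdigit() for c in p.netloc)),
            len(p.path)]

train = [("https://www.example.com/login", 0),
         ("http://secure-paypal.com.verify-account.xyz/update@id=12", 1),
         ("https://github.com/user/repo", 0),
         ("http://192.168.12.7/bank/confirm.php?acc=9911", 1),
         ("https://docs.python.org/3/library/urllib.html", 0),
         ("http://apple-id-support.co/reset-password-now", 1)]
X = np.array([url_features(u) for u, _ in train])
y = np.array([label for _, label in train])

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
test = "http://login-verify.micros0ft-support.top/account@update"
print("phishing probability:", clf.predict_proba([url_features(test)])[0][1])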
-
Database management systems (DBMSs) have many possible configurations, which makes them challenging to manage and tune. The problem grows in large-scale deployments with thousands or millions of individual DBMS instances, each with its own setting requirements. Recent research has explored using machine-learning-based (ML) agents to overcome this problem through automated tuning of DBMSs. These agents extract performance metrics and behavioral information from the DBMS and then train models with this data to select tuning actions that they predict will have the most benefit. This paper discusses two engineering approaches for integrating ML agents in a DBMS. The first is to build an external tuning controller that treats the DBMS as a black box. The second is to incorporate the ML agents natively in the DBMS's architecture.
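A minimal sketch of the first integration approach (an external tuning controller that treats the DBMS as a black box). The "DBMS" here is a toy stand-in with one knob and a synthetic throughput curve; a real controller would read metrics over the wire and apply knobs through the system's configuration interface.

import random

class ToyDBMS:                                     # black box from the agent's point of view
    def __init__(self):
        self.buffer_pool_mb = 128
    def apply_knob(self, value):
        self.buffer_pool_mb = value
    def observed_throughput(self):                 # synthetic metric: peaks near 4096 MB, with noise
        return 10_000 - 0.0005 * (self.buffer_pool_mb - 4096) ** 2 + random.gauss(0, 50)

def external_tuner(dbms, rounds=30, step=512):
    best_knob, best_perf = dbms.buffer_pool_mb, dbms.observed_throughput()
    for _ in range(rounds):
        candidate = max(64, best_knob + random.choice([-step, step]))
        dbms.apply_knob(candidate)                 # act
        perf = dbms.observed_throughput()          # observe metrics
        if perf > best_perf:                       # keep configurations that improve throughput
            best_knob, best_perf = candidate, perf
        else:
            dbms.apply_knob(best_knob)             # roll back
    return best_knob, best_perf

print("tuned buffer pool (MB), throughput:", external_tuner(ToyDBMS()))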
-
Wireless sensor networks are widely used for IoT applications, where the sensor node is considered a physical device in the IoT architecture. All these sensor nodes are battery operated, so power consumption is very high during data communication and low while sensing the environment. Without proper planning of data communication the network might die very early, so the primary objective of a cluster-based routing protocol is to extend battery life and run the application for a longer time. In this paper we present a comprehensive review of twenty research papers related to cluster-based routing protocols. We have taken basic information, network simulation parameters, and performance parameters for the comparison. In particular, we consider clustering manner, node deployment, scalability, data aggregation, power consumption, implementation cost, and many more points for the comparison of all twenty protocols. Along with basic information, we also consider network simulation parameters such as number of nodes, simulation time, simulator name, initial energy, and communication range, as well as performance parameters such as energy consumption, throughput, network lifetime, packet delivery ratio, jitter, and fault tolerance. Finally, we summarize the technical aspects and a few common parameters that must be fulfilled or considered when designing an energy-efficient cluster-based routing protocol.
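A minimal sketch of the cluster-head election step of LEACH, a classic cluster-based routing protocol used here as a representative example (not necessarily one of the twenty surveyed): in each round, nodes that have not recently served elect themselves with threshold T(n) = P / (1 - P * (r mod 1/P)).

import random

P = 0.1                 # desired fraction of cluster heads per round
NODES = list(range(100))
last_ch_round = {n: -int(1 / P) for n in NODES}      # every node eligible from the start

def elect_cluster_heads(r):
    heads = []
    for n in NODES:
        eligible = (r - last_ch_round[n]) >= 1 / P   # was not a CH in the last 1/P rounds
        threshold = P / (1 - P * (r % int(1 / P))) if eligible else 0.0
        if random.random() < threshold:
            heads.append(n)
            last_ch_round[n] = r                     # rotate the energy-hungry CH role
    return heads

for r in range(5):
    print(f"round {r}: {len(elect_cluster_heads(r))} cluster heads")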
-
Nourah Almrezeq;Mamoona Humayun;Madallah Alruwaili;Saad Alanazi;NZ Jhanjhi 179
The outbreak of COVID-19 has affected almost every part of the world and caused disastrous results; the number of reported COVID-19 cases in the past few months has reached more than 29 million patients globally. This pandemic has adversely affected all activities of life, ranging from personal life to overall economic development. Due to the current situation, people have routinely turned to online resources and have relied on technology more than ever before. Cybercriminals are opportunists, and they exploited this fully by targeting online services in all sectors of life. This sudden online dependency of the community on the internet opened several easy doors for cybercriminals, causing an exponential increase in attacks on internet traffic during the epidemic. The COVID-19 pandemic appeared all at once, and no one was ready to withstand it; however, there is an urgent need to address the resulting problems by all means. KSA is among the countries most affected by these cyber-attacks and is a key victim of most cyber-crimes. Therefore, this paper reviews the effects of COVID-19 on the cyber-world of KSA in various sectors. We also shed light on the Saudi efforts to confront these attacks during COVID-19. As a contribution, we provide a comprehensive framework for mitigating cybersecurity challenges. -
In this era of globalization without limits, many security issues occur, especially in public networks. The number of Internet users increases significantly every single day; however, only some users are aware of security issues when they use Internet services. In the campus network environment, both staff and students are susceptible to security threats such as data theft and unauthorized access, owing to differing levels of awareness of security threats. This paper studies the level of awareness of security issues among students based on the KSA model. As a case study, a survey was distributed among students on the UTeM campus network. A quantitative study was conducted, and a structured questionnaire was designed and distributed among the students. The variables focused on three aspects: Knowledge, Skill, and Ability (KSA). The findings reveal the relationship between KSA and the level of awareness among students. From the results, Knowledge is the most significant aspect contributing to high awareness. In the future, a study on increasing students' knowledge of security issues should be conducted.
-
Alwalid Alhashem;Aiman Abdulbaset;Faisal Almudarra;Hazzaa Alshareef;Mshari Alqasoumi;Atta-ur Rahman;Maqsood Mahmud 199
The emergence of the COVID-19 virus has shaken almost every aspect of human life, including but not limited to social, financial, and economic aspects. One of the most significant impacts was obviously on healthcare. Although the pandemic is now over, its aftereffects are still present. Among them, a prominent one is people's lifestyle. Work from home, increased screen time, limited mobility and walking habits, junk food, lack of sleep, etc. are several factors that are still affecting human health. Consequently, diseases like diabetes, high blood pressure, and anxiety have been emerging at a speed never witnessed before, mainly among people at a young age. The situation demands an early prediction, detection, and warning system to alert the people at risk. AI and machine learning have been investigated tremendously for solving problems in almost every aspect of human life, especially healthcare, and the results are promising. This study focuses on reviewing the machine learning-based approaches to the detection and prediction of diabetes, especially during and after the pandemic era. This will help identify the research gap and the significance of the study, especially for researchers and scholars in the same field. -
In recent years, image processing mechanisms have been used widely in several medical areas to improve earlier detection and treatment, in which the time factor is very important for discovering the disease in the patient as fast as possible, especially for various cancerous tumors such as liver cancer. Liver cancer has been attracting the attention of the medical and scientific communities in recent years because of its high prevalence combined with its difficult treatment. Statistics indicate that, throughout the world, liver cancer is the one that attacks the greatest number of people. Over time, the study of MR images related to cancer detection in the liver or abdominal area has been difficult. Early detection of liver cancer is very important for successful treatment, yet there are few methods available to detect cancerous cells. In this paper, an automatic approach that integrates intensity-based segmentation and k-means clustering is proposed for the detection of the cancer region in MRI scan images of the liver.
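A minimal sketch (a synthetic grayscale slice standing in for an MRI image, scikit-learn's KMeans) of the intensity-based k-means segmentation step the abstract describes: cluster pixel intensities and keep the brightest cluster as the suspected region.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
img = rng.normal(90, 12, (128, 128))              # background tissue intensities
img[40:70, 50:85] += 70                           # hypothetical bright lesion region
img = np.clip(img, 0, 255)

k = 3                                             # background / organ / lesion
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(
    img.reshape(-1, 1)).reshape(img.shape)

# Pick the cluster with the highest mean intensity as the candidate tumor mask.
means = [img[labels == c].mean() for c in range(k)]
mask = labels == int(np.argmax(means))
print("suspected region pixels:", mask.sum(), "of", mask.size)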
-
Fatmah Bayounis;Sana Dehlavi;Asmaa Azimudin;Taif Alghamdi;Aymen Akremi 214
Fraudulence, cheating, and deception can occur in the commercial real estate (CRE) industry, in addition to the difficulty of searching for and transferring properties while ensuring the operation is processed through an authoritative source in a trusted manner. Nowadays, real estate transactions use neutral third parties to sell land. Indeed, properties can be sold by the owners or by third parties multiple times, or without a proper deed. Moreover, third parties request a large amount of money to mediate between the seller and buyer. Methods: We propose a new framework that uses a private blockchain network and predefined BPMN instances to enable the fast and easy recording of deeds and the management of their ownership transfer, controlled by the government. The blockchain allows multiple verifications of transactions by permitted parties called peers. It promotes transparency, privacy, trust, and commercial competition. Results: We demonstrate the easy adoption of blockchain for land registration and transfer. The paper presents a prototype of the implemented product that follows the proposed framework. Conclusion: The use of blockchain-based solutions to resolve the current land registration and transfer issues is promising and will contribute to smart cities and digital governance.