Volume 21 Issue 1
-
Cyberbullying is a problem faced in many cultures. Due to their popularity and interactive nature, social media platforms have also been affected by cyberbullying, and social media users from Arab countries have reported being targets of it. Machine learning techniques have been a prominent approach used by scientists to detect and combat this phenomenon. In this paper, we compare the performance of different machine learning algorithms in cyberbullying detection based on a labeled dataset of Arabic YouTube comments. Three machine learning models are considered, namely: Multinomial Naïve Bayes (MNB), Complement Naïve Bayes (CNB), and Logistic Regression (LR). In addition, we experiment with two feature extraction methods, namely: Count Vectorizer and Tfidf Vectorizer. Our results show that, using Count Vectorizer feature extraction, the Logistic Regression model can outperform both the Multinomial and Complement Naïve Bayes models. However, when using Tfidf Vectorizer feature extraction, the Complement Naïve Bayes model can outperform the other two models.
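The model/vectorizer comparison the abstract describes can be sketched with scikit-learn. This is an illustrative sketch only, not the paper's code: the comments and labels below are invented English stand-ins for the labeled Arabic YouTube dataset.

```python
# Sketch (assumed data): compare MNB, CNB and LR under two vectorizers.
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB, ComplementNB
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["you are awful", "great video", "stupid idiot", "nice work",
         "hate you", "love this", "ugly fool", "well done"]
labels = [1, 0, 1, 0, 1, 0, 1, 0]  # 1 = bullying, 0 = benign (toy labels)

for vec in (CountVectorizer(), TfidfVectorizer()):
    for clf in (MultinomialNB(), ComplementNB(), LogisticRegression()):
        model = make_pipeline(vec, clf)   # vectorizer is refit per pipeline
        model.fit(texts, labels)
        acc = model.score(texts, labels)  # real work would hold out a test set
        print(type(vec).__name__, type(clf).__name__, acc)
```

On a real corpus, the scores printed per (vectorizer, model) pair give exactly the comparison table the abstract reports.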
-
Soh, Ben;AlZain, Mohammed;Lozano-Claros, Diego;Adhikari, Basanta 6
Routing protocols play a pivotal role in the energy management and lifespan of any Wireless Sensor Network. Low network lifetime due to dead nodes has been one of the biggest concerns with the LEACH protocol. The LEACH protocol suffers from an uneven energy distribution problem due to the random selection of a cluster head. The cluster head has much greater responsibility than the non-cluster-head nodes and consumes more energy for its roles, which results in early dead nodes as their energy is exhausted by the cluster-head role. This study proposes an approach to balance the energy consumption of the LEACH protocol by using a semi-deterministic opportunity coefficient to select the cluster head. This coefficient is calculated in each node from its battery energy level and node ID. The node with the best opportunity cost then declares itself cluster head and broadcasts this decision, which the other nodes accept. This minimizes the chances of nodes with a lower battery level being elected as cluster head. Our simulation experiments demonstrate that cluster heads chosen using our proposed algorithm perform better than those using the legacy LEACH protocol. -
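The selection rule can be sketched as follows. The exact coefficient formula is not given in the abstract, so the one below is a hypothetical stand-in: residual energy dominates, and the node ID only breaks ties deterministically.

```python
# Hypothetical sketch of semi-deterministic cluster-head selection:
# coefficient = energy term + tiny ID-based tie-breaker (assumed formula).
def opportunity(node_id, energy, max_energy, n_nodes):
    return energy / max_energy + node_id / (n_nodes * 1000)

nodes = {1: 0.9, 2: 0.4, 3: 0.95, 4: 0.1}  # node_id -> residual battery (J)
max_e = 1.0
head = max(nodes, key=lambda nid: opportunity(nid, nodes[nid], max_e, len(nodes)))
print("cluster head:", head)  # the node with the most residual energy wins
```

The point of the scheme survives the simplification: a node with a nearly drained battery can no longer out-compete a fresh node for the cluster-head role, unlike LEACH's purely random election.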
Othman, Mahfudzah;Zain, Nurzaid Muhd;Paidi, Zulfikri;Pauzi, Faizul Amir 12
This paper proposes a framework for the development of a health recommender system designed to cater to COVID-19 symptom self-assessment and monitoring as well as to provide recommendations for self-care and medical treatments. The aim is to provide an online platform for Patients Under Investigation (PUI) and close contacts of positive COVID-19 cases in Malaysia who are under home quarantine to perform daily self-assessments in order to monitor the development of their own symptoms. To achieve this, three main phases of research were conducted, in which interviews were held with thirty former COVID-19 patients to investigate the symptoms and the practices of the Malaysia Ministry of Health (MOH) in assessing and monitoring COVID-19 patients under home quarantine. Based on the interviews, an algorithm using a user-based collaborative filtering technique with the Pearson correlation coefficient similarity measure is designed to cater to self-assessment and symptom monitoring, as well as to provide recommendations for self-care treatments and medical interventions if symptoms worsen during the 14-day quarantine. The proposed framework will involve the development of the health recommender system for COVID-19 self-assessment and treatments using the progressive web application method with a cloud database and PHP code. -
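The core similarity step of user-based collaborative filtering with Pearson correlation can be sketched in a few lines. The symptom names, scales, and patient vectors below are invented for illustration; the framework itself is implemented as a PHP web application.

```python
# Sketch: find the former patient whose symptom profile best matches the
# current patient, via Pearson correlation (toy data).
from math import sqrt

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

# Daily symptom severity vectors (fever, cough, fatigue) on a 0-5 scale.
current_patient = [4, 3, 2]
former_patients = {"P1": [5, 3, 2], "P2": [0, 1, 5], "P3": [4, 4, 1]}
most_similar = max(former_patients,
                   key=lambda p: pearson(current_patient, former_patients[p]))
print(most_similar)  # recommendations would be drawn from this patient's record
```

In a full recommender, the self-care or medical-intervention advice recorded for the most similar former patients is what gets surfaced to the current user.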
Ali, Shahad M.;Alshahrani, Razan F.;Hadadi, Amjad H.;Alghamdi, Tahany A.;Almuhsin, Fatimah H.;El-Sharawy, Enas E. 19
The CPU is considered the main and most important resource in the computer system. CPU scheduling is defined as the procedure that determines which process will enter the CPU to be executed and which process will wait for its turn to be performed. CPU scheduling algorithms are a major service in operating systems that fulfills the maximum utilization of the CPU. This article aims to review studies on CPU scheduling algorithms and compare which algorithm is best. After conducting a review of the Round Robin, Shortest Job First, First Come First Served, and Priority algorithms, we found that several researchers have suggested various ways to improve CPU optimization criteria through different algorithms that improve the waiting time, response time, and turnaround time, but no algorithm is best across all criteria. -
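The trade-off the survey describes is easy to demonstrate on one of the standard criteria. The sketch below (burst times are illustrative) computes average waiting time for FCFS versus SJF when all jobs arrive at time 0: SJF minimizes waiting time, yet it can starve long jobs, which is exactly why no single algorithm wins on every criterion.

```python
# Average waiting time for non-preemptive scheduling, all arrivals at t=0.
def avg_waiting(bursts):
    elapsed, total = 0, 0
    for b in bursts:
        total += elapsed   # this job waited for everything before it
        elapsed += b
    return total / len(bursts)

bursts = [8, 4, 2, 1]                 # CPU burst times (toy values)
fcfs = avg_waiting(bursts)            # First Come First Served: arrival order
sjf = avg_waiting(sorted(bursts))     # Shortest Job First: sorted order
print(fcfs, sjf)
```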
Mallick, Shrabani;Verma, Ashish Kumar;Kushwaha, Dharmender Singh 27
The whole world is now dealing with Coronavirus, which has turned out to be one of the most widespread and long-lived pandemics of our times. Reports reveal that the infectious disease has affected almost 80% of the world's population. Amidst extensive research on predicting growth and transmission through symptomatic carriers of the virus, it cannot be ignored that pre-symptomatic and asymptomatic carriers also play a crucial role in spreading the virus. Classification algorithms have been widely used to classify different types of COVID-19 carriers, ranging from simple feature-based classification to Convolutional Neural Networks (CNNs). This research paper presents a novel technique using the Random Forest machine learning algorithm with hyper-parameter tuning to classify different types of COVID-19 carriers, such that these carriers can be accurately characterized and dealt with in a timely manner to contain the spread of the virus. The main reason for selecting Random Forest is that it builds on the powerful concept of the "wisdom of the crowd", producing an ensemble prediction. The results are quite convincing, and the model records an accuracy score of 99.72%. The results have been compared with the same dataset subjected to the K-Nearest Neighbour, logistic regression, support vector machine (SVM), and Decision Tree algorithms, whose accuracy scores were recorded as 78.58%, 70.11%, 70.385,99% respectively, thus establishing the concreteness and suitability of our approach. -
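A Random Forest with grid-searched hyper-parameters can be sketched as below. This is not the authors' pipeline: the synthetic three-class data stands in for the carrier dataset, and the parameter grid is an assumption.

```python
# Sketch: Random Forest + hyper-parameter tuning on synthetic carrier data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Three classes standing in for symptomatic / pre-symptomatic / asymptomatic.
X, y = make_classification(n_samples=300, n_features=8, n_classes=3,
                           n_informative=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

grid = GridSearchCV(RandomForestClassifier(random_state=0),
                    {"n_estimators": [50, 100], "max_depth": [None, 5]},
                    cv=3)                     # 3-fold CV over the grid
grid.fit(X_tr, y_tr)
print("test accuracy:", grid.score(X_te, y_te))
```

`grid.best_params_` then reports the tuned configuration, which is the "hyper-parameter tuning" step the abstract refers to.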
Vidanage, Kaneeka;Noor, Noor Maizura Mohamad;Mohemad, Rosmayati;Bakar, Zuriana Aby 34
Collaborative ontology construction is the latest trend in developing ontologies. In this technique, domain specialists and ontologists need to work together. Because of the complexity associated with ontology construction, it is done in an iterative and incremental fashion. After each iteration, an ontology increment is produced; the current ontology increment is always an enhanced version of the previous one. Each ontology increment has to be verified for its accuracy, and the domain specialists' contribution is very significant in accomplishing this. Unfortunately, non-computing domain specialists (e.g. medical doctors, bankers, lawyers) are unfamiliar with semantic concepts. Therefore, validating the accuracy of the ontology increment is a complex hurdle for them. This research proposes a verbalization approach to address this complexity. -
A Content-Based Image Retrieval (CBIR) system plays a vital role in retrieving the relevant images, as perceived by the user, from a huge database, which is a challenging task. Images are represented by a combination of low-level features of their visual content, which form a feature vector. To reduce the search time over a large database while retrieving images, a novel image retrieval technique based on feature dimensionality reduction is proposed that exploits metaheuristic optimization techniques based on the Genetic Algorithm (GA), Extended Binary Cuckoo Search (EBCS), and the Whale Optimization Algorithm (WOA). Each image in the database is indexed using a feature vector comprising a fuzzified color histogram descriptor for color and a Median binary pattern derived in the HSI color space for texture feature variants. Results are compared in terms of Precision, Recall, F-measure, Accuracy, and error rate with benchmark classification algorithms (Linear discriminant analysis, CatBoost, Extra Trees, Random Forest, Naive Bayes, light gradient boosting, Extreme gradient boosting, k-NN, and Ridge) to validate the efficiency of the proposed approach. Finally, a ranking of the techniques using TOPSIS is carried out to choose the best feature selection technique based on different model parameters.
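The GA side of the feature-selection idea can be sketched as a binary mask evolved over the feature vector (the paper also evaluates EBCS and WOA, which are omitted here). The per-feature relevance scores and the fitness function below are invented stand-ins for a retrieval-quality measure.

```python
# Toy GA feature selection: evolve a 0/1 mask over features; fitness
# rewards keeping "useful" features and penalizes dimensionality.
import random

random.seed(0)
relevance = [0.9, 0.1, 0.8, 0.05, 0.7, 0.2]  # assumed per-feature usefulness

def fitness(mask):
    kept = sum(r for r, m in zip(relevance, mask) if m)
    return kept - 0.1 * sum(mask)  # relevance minus dimensionality penalty

pop = [[random.randint(0, 1) for _ in relevance] for _ in range(20)]
for _ in range(40):                          # 40 generations
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                       # elitist selection
    children = []
    for _ in range(10):
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, len(relevance))
        child = a[:cut] + b[cut:]            # one-point crossover
        i = random.randrange(len(child))
        child[i] ^= random.random() < 0.2    # occasional bit-flip mutation
        children.append(child)
    pop = parents + children
best = max(pop, key=fitness)
print(best)
```

In the CBIR setting, fitness would instead be retrieval precision of the masked feature vector, but the evolve-mask loop has the same shape.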
-
Arutiunian, Iryna;Mishuk, Katerina;Dankevych, Natalia;Yukhymenko, Artem;Anin, Victor;Poltavets, Maryna;Sharapova, Tetiana 49
Relative to the outer surface of the mastic coating, the reliability of the available waterproofing resource is determined by the ability to stabilize the structural characteristics in difficult climatic conditions. The organic components of mastic, as a result of solar radiation, elevated temperatures and their alternating change, and atmospheric oxidants, especially in industrial areas, have a tendency toward self-polymerization and loss of low-molecular-weight components. The consequence is a gradual loss of deformability and a transition to brittleness, with a tendency to crack, which causes the gradual transition from normal to emergency operating condition. The presented mechanism of functioning of the coating surface indicates the expediency of increasing the components that are able to stabilize the structure and prevent changes in deformability. Durability, hydrophobicity, water displacement, and water absorption are accepted as estimating indicators. The main dependences of the influence of the added components of mastic on the operational properties of the formed coating characterize the ability to provide successful resistance to environmental influences and longer stability. As a result, the mastic acquires additional service life. -
The design functionality is put forward by mapping the interactiveness of information. The presentation of such information with the user interface model indicates that guidelines, concepts, and workflows form the deliverables and milestones for achieving a visualized design; therefore, establishing the right trend is significant to ensure compliance in terms of considering changes and applying evaluation in the early stages. It is evident that prototype design is guided by improvement specifications and includes modes and variables that drive improvements. The study presents five user interface testing methods: heuristic evaluation, perspective-based user interface testing, cognitive walkthrough, pluralistic walkthrough, and formal usability inspection. It appears that the five testing methods can be combined and matched to produce reasonable results. Finally, the study presents different mobile application designs for student projects, along with an evaluation of the mobile application designs that considers user needs and usability.
-
In this paper, we propose a new internal model control (IMC) structure. It is aimed at unstable over-actuated multivariable systems whose transfer matrices are singular and unstable. The model inversion problem is essential to understanding this structure; indeed, the precision between the output of the process and the setpoint is linked to the quality of the inversion. This property is preserved in the presence of an additive disturbance at the output. The inversion approach proposed in this article can be applied to multivariable systems, minimum phase or non-minimum phase, with or without delays in their transfer matrices. It is demonstrated by a simulation example through which we show its good performance in terms of guaranteed stability, precision, and rapidity of the system responses despite the presence of external disturbances; we have also tested this control structure in the frequency domain, establishing the robustness of the IMC.
-
El-Boghdadi, Hatem M.;Noor, Fazal;Mahmoud, Mostafa 70
The appearance of the COVID-19 virus has affected many aspects of our life. These include, but are not limited to, social, financial and economic changes, of which the economic effects are among the most important. Many countries have taken actions to continue the teaching process through online teaching platforms. Students are expected to graduate during the next few semesters with certificates that include some online-completed courses; their graduation certificates are called mixed certificates. This paper considers graduation mixed certificates with some online courses and their impact on graduates seeking jobs. First, we study how well mixed certificates are accepted by the job market; in other words, how different companies, organizations and even governmental entities would accept such certificates when hiring. We study the job market's perception of such certificates across different learning fields. Secondly, we study how well online courses are accepted by students, keeping in mind that these students are used to traditional face-to-face teaching. Finally, we present our results and recommendations according to the data collected from the surveys. Some of the results show that about 60% of companies don't have policies to encourage hiring graduates with mixed certificates. Also, colleges are almost evenly divided between preferring face-to-face and preferring online teaching. -
Alotaibi, Leena;Alnfiai, Mrim;Alhakami, Wajdi 77
The dependency on technology has increased with the increase in population. Technology plays a crucial role in facilitating, organizing and securing people's lives nowadays, and the Internet has penetrated every facet of present-day lifestyles. Yet another ubiquitous use of digital technology today is evident in transferring money and speeding up cross-border payments through digital transactions. This paper investigates transferring money and data through banks and companies by using the Blockchain concept in a decentralized distributed system. The present research also peruses several contexts in which this technology has already been implemented successfully and demonstrates the advantages of replacing paper money with digital money. Using cryptocurrency will facilitate people's lives by reducing time, securing the process of money transfer, and increasing data integrity. The primary benefit of this content analysis is that it addresses an innovative subject, in a new light and using timely recent research references drawn from 2018-2020. Thus, our study is a contemporary and conclusive source for all present and future endeavours being undertaken in the domain of using blockchain for e-transactions. -
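The integrity property the abstract attributes to blockchain comes from hash chaining, which a minimal sketch can make concrete. This is a generic illustration of the mechanism, not any specific system from the survey; the payments are invented.

```python
# Minimal hash chain: each block commits to the previous block's hash,
# so altering a recorded payment breaks every later link.
import hashlib, json

def make_block(prev_hash, payment):
    body = json.dumps({"prev": prev_hash, "payment": payment}, sort_keys=True)
    return {"prev": prev_hash, "payment": payment,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

chain = [make_block("0" * 64, {"from": "A", "to": "B", "amount": 10})]
chain.append(make_block(chain[-1]["hash"], {"from": "B", "to": "C", "amount": 4}))

def valid(chain):
    # Each block must reference the hash its predecessor actually carries.
    for prev, blk in zip(chain, chain[1:]):
        if blk["prev"] != prev["hash"]:
            return False
    return True

print(valid(chain))                          # chain is intact
chain[0]["payment"]["amount"] = 999          # tamper with a payment...
chain[0]["hash"] = hashlib.sha256(b"forged").hexdigest()  # ...and its hash
print(valid(chain))                          # the next block no longer links
```

Real blockchains add distributed consensus and proof-of-work on top, but the tamper-evidence that makes digital money transfers trustworthy is this chaining.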
Certainly, the success of the Unified Modeling Language (UML) as the de facto standard for modeling software systems does not imply closing the door on scientific exploration or experimentation with modeling in the field. Continuing studies in this area can produce theoretical results that strengthen UML as the leading modeling language. Recently, a new modeling technique has been proposed called thinging machine (TM) modeling. This paper utilizes TM to further understand UML, with two objectives: (a) Fine issues in UML are studied, including theoretical notions such as events, objects, actions, activities, etc. Specifically, TM can be used to solve problems related to internal cross-diagram integration. (b) TM applies a different method of conceptualization, including building a model on one-category ontology in contrast to the object-oriented paradigm. The long-term objective of this study is to explore the possibility of TM complementing certain aspects in the UML methodology to develop and design software systems. Accordingly, we alternate between UML and TM modeling. A sample UML model is redesigned in TM, and then UML diagrams are extracted from TM. The results clarify many notions in both models. Particularly, the TM behavioral specification seems to be applicable in UML.
-
Social networking platforms have become a smart way for people to interact and meet on the internet. They provide a way to keep in touch with friends, families, colleagues, business partners, and many more. Among the various social networking sites, Twitter is one of the fastest-growing, where users can read the news, share ideas, discuss issues, etc. Due to its vast popularity, the accounts of legitimate users are vulnerable to a large number of threats, with spam and malware among the most damaging threats found on Twitter. Therefore, in order to enjoy seamless services, it is necessary to secure Twitter against malicious users by identifying them in advance. Various studies have used Machine Learning (ML) based approaches to detect spammers on Twitter. This research aims to devise a secure system based on hybrid Cosine and Soft Cosine similarity measures in combination with a Genetic Algorithm (GA) and an Artificial Neural Network (ANN) to secure the Twitter network against spammers. The similarity among tweets is determined using Cosine with Soft Cosine applied to the Twitter dataset. The GA is utilized to enhance training with minimum training error by selecting the most suitable features according to the designed fitness function. The tweets are classified as spammer or non-spammer based on the ANN structure along with a voting rule. The True Positive Rate (TPR), False Positive Rate (FPR) and Classification Accuracy are considered as the evaluation parameters to assess the performance of the system designed in this research. The simulation results reveal that our proposed model outperforms the existing state-of-the-art approaches.
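The two similarity measures can be sketched on toy term-count vectors. The vocabulary, the two "tweets", and especially the word-similarity matrix `S` below are invented for illustration; soft cosine's point is that `S` lets related-but-different words (e.g. "win"/"prize") contribute to the score.

```python
# Cosine vs. soft cosine on term-count vectors (toy data).
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def soft_cosine(a, b, S):
    # Bilinear form u^T S v, normalized like cosine.
    def form(u, v):
        return sum(S[i][j] * u[i] * v[j]
                   for i in range(len(u)) for j in range(len(v)))
    return form(a, b) / math.sqrt(form(a, a) * form(b, b))

# Vocabulary: ["free", "win", "prize"]; two tweets as term counts.
t1, t2 = [1, 1, 0], [1, 0, 1]
S = [[1.0, 0.2, 0.3],   # assumed word-word similarities: "win" and "prize"
     [0.2, 1.0, 0.8],   # are related, so soft cosine scores this tweet
     [0.3, 0.8, 1.0]]   # pair higher than plain cosine does
print(cosine(t1, t2), soft_cosine(t1, t2, S))
```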
-
Almtrafi, Sara Mutlaq;Alkhudadi, Bdour Abduallatif;Sami, Gofran;Alhakami, Wajdi 107
The term Internet of Things (IoT) refers to a future where everyday things are connected through the Internet in one way or another, collecting various information from various sensors to form a huge network that helps people, things and machines link with each other at any time and anywhere. The IoT is everywhere around us, in connected appliances, smart home security systems and wearable health monitors. However, the question is: what if there is a malfunction or outside interference that affects the work of these IoT-based devices? This is why security causes great concern, given the widespread availability of the Internet and of Internet devices that are subject to many attacks. Since there are not many studies that combine the requirements, mechanisms, and attacks of the IoT, this paper explores recent studies published between 2017 and 2020 considering different security approaches for protection related to authentication, integrity, availability and confidentiality. Additionally, the paper addresses the different types of attacks in the IoT. We have also addressed the different approaches aimed at prevention mechanisms according to several researchers' conclusions and recommendations. -
BODJRE, Aka Hugues Felix;ADEPO, Joel;COULIBALY, Adama;BABRI, Michel 119
Elastic Optical Networks (EONs) make it possible to meet the high demand for bandwidth due to the increase in the number of internet users and the explosion of multicast applications. To support multicast applications, the network operator computes a tree-shaped path, which is a set of optical channels. Generally, the demand for bandwidth on an optical channel is so great that a single fiber failure could cause a serious interruption in data transmission and a huge loss of data. To avoid such interruptions, the tree-shaped path of a multicast connection may be protected. Several works have proposed methods to do this, but these methods may cause the duplication of some resources after recovery from a link failure, and this duplication can lead to inefficient use of network resources. Our work proposes a protection method that eliminates the link causing the duplication so that the final backup path structure after a link failure is a tree. Evaluations and analyses have shown that our method uses fewer backup resources than existing methods for the protection of a multicast connection. -
Berri, Jawad;Benlamri, Rachid;Atif, Yacine;Khallouki, Hajar 125
The development of systems that can automatically generate instructional material is a challenging goal for the e-learning community. These systems pave the way towards large-scale e-learning deployment as they produce instruction on demand for users requesting to learn about any topic, anywhere and anytime. Realizing such systems is possible thanks to the availability of vast repositories of web information in different formats that can be searched, reused and integrated into information-rich environments for interactive learning. This paradigm of learning relieves instructors from the tedious authoring task, letting them focus more on the design and quality of instruction. This paper presents a mobile learning system (Mole) that supports the generation of instructional material in M-Learning (Mobile Learning) contexts by reusing and integrating heterogeneous hypermedia web resources. Mole uses open hypermedia repositories to build a Learning Web and to generate learning objects including various hypermedia resources that are adapted to the user context. Learning is delivered through a convenient graphical user interface that allows users to navigate while building their own learning path. A test case scenario illustrating Mole is presented along with a system evaluation, which shows that in 90% of the cases Mole was able to generate learning objects related to the user query. -
Alghamdi, Anwar;Alzahrani, Ahmed;Thayananthan, Vijey 137
The Internet of Things (IoT) paradigm is at the forefront of present and future research activities. The huge amount of sensing data from IoT devices that needs to be processed is increasing dramatically in volume, variety, and velocity. In response, cloud computing has been employed to handle the challenges of collecting, storing, and processing jobs. Fog computing is a model used to support cloud computing by performing pre-processing jobs close to the end-user, realizing low latency, lower power consumption on the cloud side, and high scalability. However, some resources in a fog computing network may not be suitable for certain kinds of jobs, or the number of requests may exceed capacity. Since it is more efficient to reduce the number of jobs sent to the cloud, idle resources on other fog nodes are better federated than having their jobs forwarded to the cloud server. This issue affects the performance of the fog environment when dealing with big data applications or applications that are sensitive to processing time. This research aims to build a fog topology job scheduler (FTJS) to schedule the incoming jobs generated by IoT devices and to discover all available fog nodes with their capabilities. In addition, a fog topology job placement algorithm is introduced to deploy jobs onto appropriate resources in the network effectively. Finally, compared with the standard first-come-first-serve (FCFS) scheduling technique, the overall execution time is reduced significantly by approximately 20% and the energy consumption on the cloud side is reduced by 18%. -
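The placement idea can be sketched as a toy capacity-aware scheduler: try the local fog node, fall over to federated peers, and use the cloud only as a last resort. Node names, capacities, and job sizes are invented; the paper's FTJS additionally discovers node capabilities at runtime.

```python
# Toy fog-first placement: fog nodes (including federated peers) before cloud.
def place(job, fog_nodes, cloud):
    for node in fog_nodes:           # ordered: local node, then federated peers
        if node["free"] >= job:
            node["free"] -= job
            node["jobs"].append(job)
            return node["name"]
    cloud.append(job)                # no fog capacity left anywhere
    return "cloud"

fogs = [{"name": "fog-1", "free": 5, "jobs": []},
        {"name": "fog-2", "free": 3, "jobs": []}]
cloud = []
placements = [place(j, fogs, cloud) for j in (4, 2, 3, 6)]
print(placements)
```

Because the 3-unit job is federated checkably before reaching the cloud, only jobs that genuinely exceed total fog capacity pay the cloud's latency and energy cost, which is the effect behind the reported 20%/18% improvements.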
Bawaneh, Mohammed J.;Al-Shalabi, Emad Fawzi;Al-Hazaimeh, Obaida M. 143
The enormous prevalence of transferring official confidential digital documents via the Internet shows the urgent need to deliver confidential messages to the recipient without letting any unauthorized person know the contents of the secret messages or detect their existence. Several steganography techniques, such as Least Significant Bit (LSB), Secure Cover Selection (SCS), Discrete Cosine Transform (DCT) and Palette Based (PB), have been applied to prevent intruders from analyzing and extracting the secret transferred message. The utilized steganography methods should withstand the challenges of steganalysis techniques in terms of analysis and detection. This paper presents a novel and robust framework for color image steganography that combines a Linear Congruential Generator (LCG), simulated annealing (SA), Caesar cryptography and the LSB substitution method in one system in order to resist steganalysis and deliver data securely to their destination. SA, with the support of the LCG, finds the optimal minimum sniffing path inside a cover color image (RGB); the confidential message is then encrypted and embedded along the RGB image path, with the image as host medium, using the Caesar and LSB procedures. The embedding and extraction processes of the secret message require common knowledge between sender and receiver; that knowledge is represented by the SA initialization parameters, the LCG seed, the Caesar key agreement and the secret message length. A steganalysis intruder will not understand or detect the secret message inside the host image without the correct knowledge of the manipulation process. The constructed system satisfies the main requirements of image steganography in terms of robustness against confidential message extraction, high-quality visual appearance, low mean square error (MSE) and high peak signal-to-noise ratio (PSNR). -
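Two pieces of the pipeline, the Caesar shift and LSB substitution, can be sketched with stdlib Python. This is a stripped-down illustration: a plain byte array stands in for the pixel channel values, and the LCG/simulated-annealing path selection from the paper is omitted (bits are embedded sequentially).

```python
# Caesar shift (lowercase letters) + LSB embedding into "pixel" bytes.
def caesar(text, key):
    return "".join(chr((ord(c) - 97 + key) % 26 + 97) for c in text)

def embed(cover, bits):
    stego = bytearray(cover)
    for i, b in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | b   # overwrite the least significant bit
    return bytes(stego)

secret = caesar("attack", 3)               # encrypt first
bits = [int(b) for ch in secret for b in format(ord(ch), "08b")]
cover = bytes(range(48, 48 + len(bits)))   # stand-in pixel channel values
stego = embed(cover, bits)

# Extraction: read the LSBs back (8 bits per character) and undo the shift.
out = "".join(chr(int("".join(str(stego[i * 8 + k] & 1) for k in range(8)), 2))
              for i in range(len(secret)))
print(caesar(out, -3))
```

Because each cover byte changes by at most 1, the visual appearance of a real image is nearly untouched, which is what keeps MSE low and PSNR high.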
The paper focuses on Device-to-device (D2D) Architectures evaluation frameworks. D2D communication and discovery can improve spectrum usage efficiency and optimize the tradeoffs between throughput and energy consumption. The target operation modes involve both indirect communication between two nodes via a base station or the direct communication among proximal nodes, enabling use cases that can support communications out of cellular coverage, as well as low end-end delay requirements. The paper will present the architectural evolution of D2D networks within 3GPP standardization and will highlight key network functionalities and signaling protocols. It will also identify key analytical and simulation models that can be used to assess the performance and energy efficiency of resource allocation strategies, and it will present a suitable cross-layer integrated framework.
-
A honeypot is a mechanism built to learn more about attackers, such as their methods and patterns of attack, and to obtain very useful information about intrusive activities. Honeypots are usually categorized according to their interaction level as high-, medium-, or low-interaction, and serve two main purposes: production and research. This paper includes a detailed study of two honeypot tools. The findings from the different honeypots are presented in this paper to illustrate how a honeypot works in a real environment and how it reacts when undesirable interest is shown in its network. These tools are used to improve security, protection and confidentiality within and outside the organization to avoid attacks, vulnerabilities and breaches.
-
Almalki, Sarah;Alghamdi, Reham;Sami, Gofran;Alhakami, Wajdi 174
The advent of social media has revolutionized the speed of communication between millions of people around the world across various cultures and disciplines. Social media is the best platform for exchanging opinions and ideas, interacting with other users of similar interests and sharing different types of media and files. With the phenomenal increase in the use of social media platforms, the need to pay attention to protection and security from attacks and misuse has also increased. The present study conducts a comprehensive survey of the latest and most important research studies published from 2018-20 on security and privacy on social media and the types of threats and attacks that affect users. We have also reviewed the recent challenges that affect security features in social media. Furthermore, this research presents effective and feasible solutions that address these threats and attacks, and cites recommendations to increase security and privacy for users of social media. -
Mohammed, Asma;Al khathami, Jamilah;Alhakami, Wajdi 184
The amount of information and data in the digital era is increasing tremendously. Continuous online connectivity is generating a massive amount of data that needs to be stored in computers and made available as and when required. Cloud computing technology plays a pivotal role in this league. Cloud computing is a term that refers to computer systems, resources and online services that aim to protect and manage data in an effective, more efficient and easy way, and it is an important standard for maintaining the integrity and security of sensitive data and information for organizations and individuals. Cloud security is one of the most important challenges, since the security of the entire cloud system depends on it. Thus, the present study reviews the security challenges that exist in cloud computing, including attacks that negatively affect cloud resources. The study also addresses the most serious threats that affect cloud security. We also reviewed several studies, specifically those from 2017-20, that cite effective mechanisms to protect authentication, availability and connection security in the cloud. The present analysis aims to provide solutions to the problems and causes of cloud computing security system violations, which can be used now and developed in the future. -
In mid-December 2019, a virus, namely the coronavirus, started to spread from China. It has caused fatalities globally, and the WHO declared it a pandemic for the whole world. There are different models that can fit such data, which peaks and flattens over time, and the main aim of this paper is to find the best, or nearly the most appropriate, model for such data. Three different models have been deployed to fit the data of confirmed coronavirus patients in Pakistan up to 20th November 2020. In this paper, we conduct an analysis based on data obtained from the National Institute of Health (NIH) Islamabad and produce a forecast of COVID-19 confirmed cases as well as the number of deaths and recoveries in Pakistan using the Logistic model, the Gompertz model and the Auto-Regressive Integrated Moving Average (ARIMA) model. The fitted models reveal high exponential growth in the number of confirmed cases, deaths and recoveries in Pakistan.
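Fitting the logistic and Gompertz curves can be sketched with `scipy.optimize.curve_fit`. The data points below are synthetic, not the NIH Pakistan series, and the starting guesses `p0` are assumptions; the ARIMA model from the paper is omitted here.

```python
# Sketch: fit logistic and Gompertz growth curves to synthetic case counts.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    return K / (1 + np.exp(-r * (t - t0)))       # K = final size (plateau)

def gompertz(t, K, b, c):
    return K * np.exp(-b * np.exp(-c * t))

t = np.arange(0, 30)
rng = np.random.default_rng(0)
y = logistic(t, 1000, 0.4, 15) + rng.normal(0, 5, t.size)  # noisy "cases"

pl, _ = curve_fit(logistic, t, y, p0=[900, 0.5, 10])
pg, _ = curve_fit(gompertz, t, y, p0=[900, 5, 0.1], maxfev=5000)
print("logistic K:", pl[0], "gompertz K:", pg[0])
```

With real data, comparing the residuals (or AIC) of the fitted curves is what decides which model is "best or nearly appropriate", as the abstract puts it.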
-
Ning, Meng;Wu, Zheru;Zhou, Zhitian;Yang, Duogui 201
The significance of high-quality development and green total factor productivity has attracted widespread attention and research, yet few studies on green total factor productivity that consider the use of water resources have been conducted in the context of water shortages and water stress. In this study, the green total factor productivity of water use from 2005 to 2015 in mainland China is evaluated based on the global Malmquist-Luenberger productivity index. The results show that: (1) China's green total factor productivity of water use has been improving since 2005, with an annual global Malmquist-Luenberger productivity index of 1.0104. (2) At the regional level, the eastern zone of mainland China has the highest green total factor productivity of water use, while the intermediate zone ranks last. (3) The green total factor productivity of water use in the southern region (1.0113) is significantly higher than that in the northern region (1.0095), and also higher than the national average level for the same period. The BPC index has been the most important influencing factor of the green total factor productivity of water use at both the national and regional levels since 2011. -
Mobile ad hoc networks are distributed and do not have a predefined structure, which practically means that network nodes can play the role of clients or servers. The routing protocols used in mobile ad hoc networks (MANETs) must contend with limited bandwidth, mobility, and a limited power supply. Hybrid routing protocols solve the delay problem of reactive routing protocols and the routing overhead of proactive routing protocols; they combine the distance vector routing protocol (DVRP) and the link-state routing protocol (LSRP) to solve the routing problem. The Ant Colony Optimization (ACO) algorithm is also used to solve other real-life problems such as the travelling salesman problem, capacity planning, and the vehicle routing challenge. Such bio-inspired methods have proved helpful in solving the problem domains in these networks.
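The ACO mechanism the abstract mentions can be illustrated with a deliberately tiny example: ants choose between two routes by pheromone and visibility (1/length), and the shorter route accumulates pheromone faster until it dominates. Route lengths and parameters are invented; a real ACO router works per-hop over the whole topology.

```python
# Two-route ACO sketch: pheromone + visibility choice, evaporation, deposit.
import random

random.seed(1)
lengths = {"A": 4.0, "B": 10.0}            # route lengths (hops)
tau = {"A": 1.0, "B": 1.0}                 # pheromone trails
beta, rho = 2.0, 0.1                       # visibility weight, evaporation rate

def choose():
    # Probability proportional to tau * (1/length)^beta.
    w = {r: tau[r] * (1.0 / lengths[r]) ** beta for r in tau}
    x = random.random() * sum(w.values())
    return "A" if x < w["A"] else "B"

for _ in range(200):                       # 200 ants, one after another
    r = choose()
    for k in tau:
        tau[k] *= (1 - rho)                # evaporation on every trail
    tau[r] += 1.0 / lengths[r]             # shorter routes deposit more

best = max(tau, key=tau.get)
print(best, tau)
```

The positive feedback (more pheromone, more ants, more deposits) is the same mechanism that lets ACO-based MANET routing converge on good paths without global knowledge.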
-
Phua, Karsten Cheng Kai;Goh, Wei Wei;Marjani, Mohsen 214
As the number of elderly people increases rapidly every year, many of them still choose to stay independent despite the difficulties and challenges faced in their daily routine. The elderly have pressing needs for support in their living, and the Internet of Things helps to support and improve elderly people's lives in many ways to meet these needs and requirements. The Elderly Home Automation Control (EHAC) system is a research-based system that supports the elderly with a controlled automation solution that operates various home electrical appliances based on measurements of heart pulse rate and environment temperature. This paper evaluates the performance of the EHAC system in an elderly person's daily routine. It presents experiments conducted using an IoT testing approach, along with a discussion and analysis of the results. -
The human population growth rate is an important parameter for real-world planning. Common approaches rely upon fixed parameters like human population, mortality rate, and fertility rate, which are collected historically to determine a region's population growth rate. The literature does not provide a solution for areas with no historical data. In such areas, machine learning can solve the problem, but the multitude of machine learning algorithms makes it difficult to determine the best approach. Further, missing features are a common real-world problem, so it is essential to compare and select the machine learning techniques that perform best and most robustly in the presence of missing features. This study compares the performance of 17 machine learning techniques (base learners and ensemble learners) in predicting the human population growth rate of a country. Among the 17 techniques, random forest outperformed all the others both in predictive performance and in robustness towards missing features. Thus, the study successfully demonstrates and compares machine learning techniques to predict the human population growth rate in settings where historical data and feature information are not available, and identifies the best machine learning algorithm for population growth rate prediction.
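The comparison protocol can be sketched with a few of the learners (the study uses 17). Everything here is a stand-in: synthetic features replace the country-level data, and "missing feature" is simulated by zeroing one column at prediction time.

```python
# Sketch: fit several regressors, then probe robustness to a missing feature.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = X[:, 0] * 2 + X[:, 1] - 0.5 * X[:, 2] + rng.normal(0, 0.1, 200)

X_missing = X.copy()
X_missing[:, 2] = 0                        # simulate an unavailable feature

for model in (RandomForestRegressor(random_state=0),
              LinearRegression(), KNeighborsRegressor()):
    model.fit(X, y)
    # R^2 when a feature the model trained on goes missing at prediction time.
    print(type(model).__name__,
          round(r2_score(y, model.predict(X_missing)), 3))
```

Ranking the models by how little their score drops under the missing-feature condition is the robustness comparison the study performs across its 17 techniques.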