Intelligent Shoes for Detecting Blind Falls Using the Internet of Things

  • Ahmad Abusukhon (Computer Science Department, Al-Zaytoonah University of Jordan)
  • Received : 2023.06.28
  • Accepted : 2023.08.22
  • Published : 2023.09.30

Abstract

In our daily lives, we engage in a variety of tasks that rely on our senses, such as vision. Blindness is the absence of the sense of vision. According to the World Health Organization, 2.2 billion people worldwide suffer from various forms of vision impairment. Unfortunately, blind people face a variety of indoor and outdoor challenges on a daily basis, limiting their mobility and preventing them from engaging in other activities. Blind people are very vulnerable to a variety of hazards, including falls, which can be caused by various barriers such as stairs. The Internet of Things (IoT) is used to track falls and send a warning message to the blind person's caretakers. One of the gaps in previous works is that they were unable to differentiate between true and false falls. Treating false falls as true falls results in many false alarms being sent to the caretakers, who may therefore reject the IoT system. To bridge this gap, this paper proposes an intelligent shoe that is able to precisely distinguish between false and true falls based on three sensors, namely, the load scale sensor, the light sensor, and the Flex sensor. The proposed IoT system is tested in an indoor environment for various scenarios of falls using four machine learning models. The results show that our system achieves an accuracy of 0.96 (96%). Compared to the state-of-the-art, our system is simpler and more accurate since it avoids sending false alarms to the blind person's caretakers.

1. Introduction

Missing the sense of vision is known as blindness. Blindness is defined as "lacking visual perception due to physiological and/or neurological factors" [1]. In one of its reports, the World Health Organization (WHO) revealed that there are about 285 million blind people around the world [2]. In another report, the WHO revealed that there are 466 million deaf and blind people worldwide, 34 million of whom are children; these children have no permanent cure, but they can be aided with technology [3]. Recently, the WHO revealed that 2.2 billion people suffer from various kinds of vision impairment [4]. Blind or visually impaired people usually rely on other senses such as hearing, smell, and touch in order to carry on with their daily activities and communicate with their environment. Inside their homes, for example, they are unable to avoid room barriers such as furniture and steps. Thus, blind people are extremely exposed to various dangers that may cause them to fall. To detect falls, Internet of Things (IoT) technology is used. The IoT, as the name indicates, refers to a collection of things (e.g., sensors, microcontrollers, etc.) that are connected to each other via the Internet in order to perform various tasks. The IEEE [5] and others [6][7] define the Internet of Things as follows: the Internet of Things is an open and complex network that connects and manages intelligent objects over the Internet in order to share information, data, and resources. In [8], the authors stated that various IoT devices are being developed for blind people. Some of these devices have new features, but they are very expensive, and as a result, most blind people are unable to obtain them. In addition, some of these devices are inexpensive, but they do not provide the necessary features. One of the challenges when designing an IoT system for blind people is how to efficiently detect a fall [4]. Misdiagnosis of a fall ends with false alarms (e.g., mobile messages or emails) being sent to the blind person's caretakers, who may then no longer trust the IoT system and refuse to use it.

The research question of this paper is: how can a simple and accurate IoT system be designed for blind people that precisely distinguishes between true falls and false falls?

To answer the above question, we propose an IoT system for an indoor environment that precisely detects falls and distinguishes between true falls and false falls. The proposed system is an intelligent shoe (Int-Shoe), which is based on various types of sensors. In addition, the proposed system uses the Firebase platform for data storage and the NodeMCU ESP8266 Wi-Fi communication module (NodeMCU 1.0, ESP-12E module) for communicating wirelessly with the Firebase database.

Our aim is to increase the accuracy of detecting falls by integrating the functionality of three sensors, namely, the 50Kg load scale sensor, the light sensor, and the Flex sensor. Machine learning techniques are used to measure the accuracy of the proposed system. Our contributions are as follows:

1) Avoid sending false alarms to the caretakers when false falls occur. This merit is not mentioned or discussed in previous works. One of the gaps in the previous works is that they were unable to distinguish between true falls and false falls. Treating false falls as true falls results in false alarms being sent to the caretakers, who may then refuse to use the IoT system.

2) Efficiently detect true falls and false falls using a set of sensors including the Flex sensor, the 50Kg load scale sensor, and the light sensor. In addition, four machine learning models are used to distinguish between true falls and false falls for various scenarios of falls. The results show that the Int-Shoe is able to precisely distinguish between true falls and false falls for various scenarios with an accuracy of 0.96 (96%).

3) To the best of our knowledge, no previous work used the 50Kg load scale sensor, the Flex sensor and the light sensor for detecting falls.

4) Novel design: to the best of our knowledge, no previous work proposed designing an intelligent shoe for detecting falls. Previous designs include an electronic stick, electronic glasses, an electronic jacket, and electronic bracelets.

The rest of this paper is organized as follows: Section 2 describes the related work. Section 3 describes the proposed system, the research methodology, the motivation, and the hardware and software. Section 4 describes the system evaluation using machine learning. Section 5 concludes the paper.

2. Related Work

In this section, we describe various techniques for preventing and detecting falls.

2.1 The IoT Techniques for Preventing Falls

In [9], the authors developed a network of ultrasonic sensors with a voice guidance system in order to detect the direction and the position of obstacles that obstruct the path of a blind person. In [10], the authors proposed wearable smart glasses and an intelligent walking stick for blind people. In a similar work [11], the authors developed a wearable vision assistance system that detects various objects based on a binocular vision sensor, which captures images at a specific frequency. In [12], the authors proposed a wearable glasses design and a walking assistant design based on ultrasonic sensors. In [13], the authors replaced the blind person's cane with an IoT prototype in order to reduce the total cost. The proposed prototype is able to detect obstacles (using an ultrasonic sensor affixed to the blind person's glasses) and warn the blind person about these obstacles via a smartphone. In [14], the authors investigated tracking the blind person's way using the TRACE engine. In [15], the authors developed a blind stick equipped with an Arduino Uno Microcontroller (AUM) and ultrasonic sensors associated with a voice module in order to guide blind people.

The authors in [14][16] and [17]-[28] proposed IoT systems that are similar to the IoT system proposed in [15]. In [17], the authors proposed an IoT system called the "Blind's Mate". Their system is composed of a smart stick and a mobile application. Audio messages about obstacles are sent via an earphone. The stick, which is used for detecting ditches, is connected to a mobile application via a Wi-Fi module, and it vibrates when an obstacle is detected. In [18], the authors proposed an IoT system that uses the AUM, a Global Positioning System - Global System for Mobile communications (GPS-GSM) module, and ultrasonic sensors in order to help blind people avoid obstacles. In [19][22][24], the authors proposed an IoT stick that is able to detect obstacles based on an ultrasonic sensor, a vibration motor, and/or a buzzer. When a blind person faces an obstacle, the stick vibrates as an indicator. In [20], the authors proposed a smart cane that uses an ultrasonic sensor, a camera, an InfraRed (IR) sensor, and a GPS-GSM module. Their system detects obstacles, heat, water, light, stairs, and traffic signals. To achieve safety for blind people, the authors in [29][28][30] proposed navigation systems based on image processing techniques. However, these systems are inefficient since cameras are sensitive to various environmental distortions such as dust, smoke, fog, and vapor, and they do not work well at night. In [31], the author developed a barcode-based navigation system to aid blind people in finding their way. In [32], the authors explored Augmented Reality (AR) visualization for People with Low Vision (PLV) in order to facilitate stair navigation.

The above survey shows that most recent papers focus on improving the design and features of the smart stick using sensors and cameras. However, other researchers focus on combining artificial intelligence with the IoT in order to provide solutions for the problems and difficulties faced by blind people.

2.2 IoT Techniques for Detecting Falls

In [25], the authors classified and recognized the states of a blind person's position, namely, the run, walk, stand, lie, sit, and fall states, based on two 6-axis sensors attached to smart glasses and a smart stick. Although the authors use complicated mathematical equations to recognize the fall behavior (the above states), their system is inaccurate. This is because their system recognizes falls based on the state (the position) of the smart glasses and the smart stick, not the state of the blind person himself. For example, if the blind person's smart glasses or stick, or both, fall to the ground by accident while the person is still standing, their system deems it a fall, but in fact it is not. This ends with inaccurate information being sent to the blind person's caretakers, telling them that the blind person has fallen when this is not true (it is a false alarm). If false alarms are sent frequently to the caretakers, they may no longer trust the IoT system and thus refuse to use it. The main contribution of this paper is a proposed IoT system that can precisely determine whether a fall is a true fall or a false fall, thus avoiding sending false alarms to the caretakers. In this paper, we precisely recognize true falls and false falls using three sensors, namely, the Flex sensor, the 50Kg load scale sensor, and the light sensor, as described in Section 3. In [8], the authors designed a walking stick for blind people. The stick is enhanced by ultrasonic sensors (for detecting obstacles), a water level sensor (for detecting puddles), and a GPS module for detecting the location of the blind person; in the event of an emergency, it sends SMS messages to those who need to know. In [21], the authors proposed an IoT system with the following objectives: develop a stick with a camera in order to identify objects, send the location of the blind person to the caretaker using GPS, and obtain the destination route using Google Maps. A new feature, namely night vision, was added in this work. In [33][23], the authors proposed an IoT system that uses the AUM, an ultrasonic sensor, and a moisture sensor for detecting water on the surface. Their system sends voice messages to blind people in order to guide them along their path, and it also sends the location of the blind person to a relevant party if she/he presses a button on the stick. In [34], the authors developed an IoT system for blind people, which is able to detect obstacles, sudden falls, and moving objects. Their system consists of ultrasonic sensors (for detecting obstacles), a 3-axis analog accelerometer (for detecting falls), a microcontroller, and a PIR motion sensor. Their system detects falls as follows: the y-axis of the accelerometer is always vertical when the blind person is standing, and the output value of the accelerometer decreases as the angle between its y-axis and the vertical axis increases. In [35], the authors proposed an IoT system for blind people. Their system uses three types of sensors, namely, an ultrasonic sensor, an accelerometer sensor (for detecting falls), and a voice recognition sensor (for detecting the blind person's voice when she/he asks for help). In [36], the authors proposed an IoT system that uses a set of ultrasonic sensors affixed to the user's jacket in order to detect obstacles and an accelerometer sensor for detecting falls. However, the authors did not mention details about the accelerometer sensor, how it works, or where it should be placed on the jacket. In [37], the author proposed IoT bracelets for guiding blind people along their paths. In addition, he developed an algorithm for detecting falls using ultrasonic sensors affixed to the IoT bracelets. A fall is recorded when the value of the ultrasonic sensor affixed to the left or right bracelet is greater than 2000 (i.e., when one of the bracelets touches the ground). In [38], the authors proposed detecting falls based on a human skeleton diagram. However, this technique is suitable for an indoor environment but not for an outdoor environment, because a camera must be used to capture video of the user, and this video is then analyzed in order to detect falls. As mentioned earlier in this paper, this technique is inefficient since cameras are sensitive to various environmental distortions such as dust, smoke, fog, and vapor, and they do not work well at night.

3. The Proposed IoT System

This section describes the research methodology, the motivation, the hardware description and connections, and the software and algorithm description of the proposed IoT system.

3.1 The Research Methodology

The proposed IoT system is an intelligent shoe for detecting falls. It consists of the following modules: the data gathering module (a set of sensors for collecting data), the data storage module (the Firebase database), the communication module (the NodeMCU 1.0 ESP-12E module), and the data retrieving and processing module (an Android application). In the proposed system, a set of sensors, namely, the Flex sensor, the light sensor, the 50Kg load scale sensor, and the ultrasonic sensor, is used for collecting data. The accuracy and the thresholds of these sensors are measured before testing the Int-Shoe. The data collected by these sensors are written to a real-time database (the Firebase database) with the help of the NodeMCU 1.0 ESP-12E module. These data are then read and processed by an Android application that integrates the functionality of the above sensors in order to precisely detect falls. The proposed intelligent shoe is tested and evaluated in an indoor environment for two cases, namely, true and false falls, where various audio messages are sent to the blind person's caretakers via their mobiles. Additionally, machine learning is utilized to assess the proposed system's accuracy using a variety of machine learning models.

3.2 The Research Motivation

In this section, we describe the motivation for this research. Earlier studies were unable to distinguish false falls from true falls. A false fall refers to treating the fall of a wearable device, such as a smart stick and/or smart glasses ([25] and [34]), as if it were the fall of the blind person, even if he is not wearing the device. A true fall refers to the fall of the blind person himself, not of the wearable device. False falls end with false alarms being sent to the caretakers. To tackle this problem, we propose an efficient and accurate IoT system for distinguishing between true falls and false falls based on three sensors, namely, the Flex sensor, the light sensor, and the 50Kg load scale sensor. The proposed IoT system differs from other works [8][10][25][34][35] as follows:

1) The proposed system is unique in its design. Our design is an intelligent shoe, while other designs are smart glasses, smart sticks [25][10], or a wearable device attached to the user's shin [34].

2) Unlike the previous works [10][25][34][35], the proposed system efficiently distinguishes false falls from true falls. The works in [25][34][35] considered the fall of a wearable device carried by the blind person as a fall of the blind person, and thus they sent false alarms to the blind person's caretakers telling them that a fall occurred, which is not accurate. For example, in [34], if the 3-axis analog accelerometer is rotated by a 0°, 30°, or 90° angle by accident, then their system considers this situation a true fall, but in fact it is not. Our proposed system closes this gap in the literature. Moreover, our proposed system is simpler than [25] since it does not use complicated mathematical equations as they did; because our proposed system is a real-time system, using complicated mathematical equations would slow it down. Our work also differs from the previous work [8] as follows: in [8], the authors proposed that the user shake the smartphone in order to send a message to the caretakers if a fall occurs, whereas in this paper the proposed system automatically detects falls and sends a message to the caretakers informing them about the fall. The proposed system uses three sensors for distinguishing between true and false falls, namely, the Flex sensor, the 50Kg load scale sensor, and the light sensor. To the best of our knowledge, these sensors were not used in previous work for detecting or distinguishing between true and false falls. Thus, our technique is novel.

3.3 The Hardware Description and Connection

In the proposed system, a set of sensors is used, as described in Fig. 1.


Fig. 1. The hardware used in the proposed system.

The hardware used in the prototype system is as follows (numbered as in Fig. 1): a 50Kg load scale sensor, an ultrasonic sensor, a Flex sensor, a temperature sensor, a light sensor, a battery connector, a nine-volt battery, an HW-131 power supply module, a NodeMCU module, and a USB cable. The NodeMCU is an open-source platform based on the ESP8266. The NodeMCU connects objects using the Wi-Fi protocol. The NodeMCU is used as a microcontroller that provides features such as GPIO and PWM pins; it contains 13 GPIO pins and 10 PWM channels. In addition, it can be programmed with the Arduino IDE, and it can operate as an access point [39].

The 50Kg load scale sensor is enhanced by a load cell amplifier called the HX711 module. The HX711 is a 24-bit analog-to-digital converter that converts the small voltage changes produced by the strain on the 50Kg load scale sensor into a 24-bit digital value [40].
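
As an illustration of how the HX711 module can be read from the NodeMCU, the following minimal Arduino sketch assumes the commonly used open-source "HX711" Arduino library; the data/clock pin assignments and the calibration factor are placeholders and must be adapted to the actual wiring and to a calibration with a known weight.

#include "HX711.h"

// Assumed wiring: HX711 DT and SCK connected to D5 and D6 of the NodeMCU.
const int LOADCELL_DOUT_PIN = D5;
const int LOADCELL_SCK_PIN = D6;
// Placeholder calibration factor; it must be determined with a known weight.
const float CALIBRATION_FACTOR = -7050.0f;

HX711 scale;

void setup() {
  Serial.begin(115200);
  scale.begin(LOADCELL_DOUT_PIN, LOADCELL_SCK_PIN);
  scale.set_scale(CALIBRATION_FACTOR); // apply the calibration factor
  scale.tare();                        // zero the scale with no load on the shoe
}

void loop() {
  // Average 10 readings to reduce noise; the result is the load in Kg
  // once the calibration factor is tuned for Kg units.
  float loadKg = scale.get_units(10);
  Serial.print("Load (Kg): ");
  Serial.println(loadKg);
  delay(500);
}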

The ultrasonic sensor consists of a transmitter and a receiver. The transmitter emits ultrasonic waves that reflect off an obstacle and are then captured by the receiver. The distance (L) between the sensor and an obstacle is calculated as in (1).

\(L=\frac{1}{2}\, T \cdot C\)       (1)

where T is the time between emission and reception, and C is the speed of sound. To calculate the distance (L) between an ultrasonic sensor and an obstacle, the speed of sound is converted from meters per second (m/s) into centimeters per microsecond (cm/µs). Thus, in the NodeMCU code, we express the speed of 340 m/s as 0.0340 cm/µs, and then (L) is calculated using (1) as follows:

T = pulseIn(echoPin1, HIGH);

L = 0.0340 * (T / 2);
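
For completeness, the two lines above can be placed in a minimal sketch such as the following, which assumes an HC-SR04-style sensor; the trigger and echo pin assignments are hypothetical and should match the actual wiring.

// Minimal distance-measurement sketch for an HC-SR04-style ultrasonic sensor.
const int trigPin1 = D7; // assumed trigger pin
const int echoPin1 = D8; // assumed echo pin

void setup() {
  Serial.begin(115200);
  pinMode(trigPin1, OUTPUT);
  pinMode(echoPin1, INPUT);
}

void loop() {
  // Send a 10 microsecond trigger pulse to start a measurement.
  digitalWrite(trigPin1, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin1, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin1, LOW);

  // T is the echo pulse width in microseconds; L follows Eq. (1)
  // with the sonic speed expressed as 0.0340 cm/us.
  unsigned long T = pulseIn(echoPin1, HIGH);
  float L = 0.0340 * (T / 2.0);

  Serial.print("Distance (cm): ");
  Serial.println(L);
  delay(200);
}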

The temperature sensor can sense and measure the temperature of the environment around the blind person's shoe. It is affixed to the front of the shoe. When the temperature exceeds 40° Celsius, the blind person receives an audio message as a warning.

The Flex sensor produces various values when it is bent at various angles. The Flex sensor has two pins. We connect one of these pins to the ground pin (the G pin on the NodeMCU board), and the other pin of the Flex sensor is connected to a 10K Ohm resistor. The 10K Ohm resistor is also connected to a 3-volt pin (the 3V pin on the NodeMCU board) on one side and to the analog pin A0 on the NodeMCU board on the other side.

The above sensors are tested while connected to the NodeMCU module before being affixed to the blind person's shoes, as described in Fig. 2.


Fig. 2. Testing the sensors before affixing them to the intelligent shoes

In addition, Fig. 3 shows the values recorded when the Flex sensor is bent at various angles while connected to the NodeMCU board.


Fig. 3. Various values recorded when the Flex sensor is bent at various angles.

Unfortunately, the NodeMCU board has only one analog pin, the "A0" pin, while connecting these sensors to the NodeMCU board requires more than one analog pin. Thus, in this paper, we use a 16-channel analog multiplexer (16-ch-Mul) to connect these sensors to the NodeMCU, as illustrated in Fig. 4-(C). In addition, in order to facilitate this task, we use the NodeMCU base board. The NodeMCU base board, shown in Fig. 4-(A), serves as a foundation for the NodeMCU. We connect the 16-ch-Mul to the NodeMCU as follows: the Z, S0, S1, S2, and S3 pins on the 16-ch-Mul are connected to the pins A0, D0, D1, D2, and D3 on the NodeMCU base board, respectively. In addition, the "EN", the "VCC", and the "GND" pins on the 16-ch-Mul are connected to the "G", the "5V", and the "GND" pins on the NodeMCU base board, respectively.
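
A minimal sketch of reading several analog sensors through the 16-ch-Mul is shown below. The pin mapping follows the wiring described above (Z to A0 and S0-S3 to D0-D3); the channel numbers assigned to each sensor are assumptions for illustration.

const int S0 = D0, S1 = D1, S2 = D2, S3 = D3; // multiplexer address pins
const int MUX_SIG = A0;                       // the multiplexer Z (signal) pin

// Hypothetical channel assignment on the 16-ch-Mul.
const int CH_FLEX = 0;
const int CH_LIGHT = 1;

void setup() {
  Serial.begin(115200);
  pinMode(S0, OUTPUT);
  pinMode(S1, OUTPUT);
  pinMode(S2, OUTPUT);
  pinMode(S3, OUTPUT);
}

// Select one of the 16 channels by driving S0-S3 with the channel's
// binary code, then read the shared analog pin A0.
int readMuxChannel(int channel) {
  digitalWrite(S0, (channel >> 0) & 1);
  digitalWrite(S1, (channel >> 1) & 1);
  digitalWrite(S2, (channel >> 2) & 1);
  digitalWrite(S3, (channel >> 3) & 1);
  delayMicroseconds(5);       // allow the multiplexer output to settle
  return analogRead(MUX_SIG); // 0-1023 on the NodeMCU ADC
}

void loop() {
  int flexValue = readMuxChannel(CH_FLEX);
  int lightValue = readMuxChannel(CH_LIGHT);
  Serial.printf("Flex: %d  Light: %d\n", flexValue, lightValue);
  delay(500);
}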


Fig. 4. (A) The NodeMCU board and the NodeMCU Base board. (B) The HX711 module. (C) The 16-channel-Multiplexer.

Fig. 5 shows the Int-Shoe before packaging.


Fig. 5. The proposed Int-Shoe before packaging.

Fig. 6 describes the Int-Shoe after packaging. Note that the NodeMCU module is connected to a nine-volt battery, as described in Fig. 6-(B).


Fig. 6. (A) The proposed Int-Shoe after packaging - top view. (B) The Int-Shoe - left-side view. (C) The Int-Shoe - right-side view.

Fig. 7 describes the Int-Shoe bottom view, where the 50Kg load scale sensor is affixed.


Fig. 7. The Int-Shoe - bottom view

3.4 Fall Detection - Integrating the Sensors' Functionalities

This section describes how the sensors' functionalities are integrated in order to detect falls.

1) The light sensor is affixed inside the shoe. This sensor is affected by the amount of light it receives. It is utilized as a criterion for deciding whether the blind person is wearing his shoes. This decision is important in order to precisely detect true falls. The Light Sensor (LS) value ranges from 0 to 1023. In this paper, the author sets the light sensor's threshold to 405.18. This value is the average of 50 values recorded by the light sensor when the blind person takes off his shoes (as described later in Section 4.1). Thus, if the LS value is less than or equal to 405.18, the light sensor is not receiving light; this happens when the blind person is wearing his/her shoes, and it supports classifying a detected fall as a true fall. Otherwise, if the LS value is greater than 405.18, the blind person is not wearing his/her shoes, which indicates a false fall.

2) The 50Kg load scale sensor is affixed to the bottom of the shoe's heel in order to measure the load on that shoe. It is utilized as a criterion for deciding whether the blind person is wearing his shoes. For example, suppose the blind person's weight is 45Kg; if the sensor value is 45Kg, we conclude that the blind person is wearing his shoes and is still standing (i.e., no fall has occurred). Otherwise, if the sensor value is, for example, zero, we conclude that the blind person is not wearing his shoes or that a fall has occurred. This decision is important in order to precisely confirm true falls.

3) The Flex sensor is affixed to the right side of the right shoe and to the left side of the left shoe. When a fall occurs, the Flex sensor is bent, and thus its value helps confirm a true fall.

4) The ultrasonic sensor is affixed to the front of the blind person's shoe. Its main role is to discover obstacles in front of the blind person. If the distance between the blind person and an obstacle is less than or equal to 30cm, then the blind person receives an audio message. Note that this sensor is able to discover obstacles near the ground surface.

Note that the above sensors (1-3) cannot decide whether a fall is a true or false fall unless their functionality is integrated. For example, we cannot conclude that a fall is a true fall if the Flex sensor is bent without checking the other sensors' values. The following are examples of true and false falls. If the Flex sensor is bent, the value of the 50Kg load scale sensor is greater than 20Kg, and the value of the light sensor is less than or equal to 405.18, then this fall is a true fall. This is because (1) the Flex sensor is bent and (2) the blind person is wearing his/her shoes. However, if the Flex sensor is bent, the value of the 50Kg load scale sensor is ≈ 0, and the value of the light sensor is greater than 405.18, then this fall is a false fall. To the best of our knowledge, no previous work proposed intelligent shoes that integrate the functionality of three sensors, namely, the Flex sensor, the light sensor, and the 50Kg load scale sensor, in order to distinguish between false falls and true falls.

3.5 Software and Algorithm Description of the Proposed System

In this section, the author describes the algorithm of the proposed IoT system, which is developed to assist visually impaired people. Table 1 lists the abbreviations and the sensor thresholds used to describe this algorithm.

Table 1. The sensors' abbreviations and thresholds


In Table 1, we determine the thresholds of the sensors affixed to the Int-Shoe as follows: the ultrasonic sensor threshold is 30cm. Our Android application sends an audio message to the blind person if the distance between the ultrasonic sensor and an obstacle is less than 30cm. The light sensor threshold is 405.18. If the value of the light sensor retrieved from the Firebase database is less than or equal to 405.18, then the Android application recognizes that the blind person is wearing his or her shoes. The temperature sensor threshold is 40℃. If the temperature is greater than 40℃, then the blind person receives an audio message asking him to move one step backward. The threshold of the Flex sensor is 957, as described in Fig. 8. Fig. 8 shows the initial state of the Flex sensor, i.e., the state where the Flex sensor is bent to fit in its container (the pink container in Fig. 8). In this state, the value of the Flex sensor is 957. Note that if a fall occurs, this value will increase since the Flex sensor will be exposed to pressure caused by the weight of the blind person's foot when he/she is lying on the ground. This is how the proposed system detects falls.


Fig. 8. The initial state of the Flex sensor

Fig. 9 describes the pseudo code of the proposed IoT system. In the bootstrap phase, the blind person's weight is measured in order to determine the threshold of the 50Kg load scale sensor. In this paper, the author set the maximum threshold value of the 50Kg load scale sensor (MaxLCTH) = 40Kg and the minimum threshold value of the 50Kg load scale sensor (MinLCTH) ≈ 0Kg. The thresholds of the other sensors are also determined in this phase. The Int-Shoe is able to discover various obstacles in front of the blind person using an ultrasonic sensor, which is affixed to the front of the blind person's shoe. If an obstacle is detected (note that the threshold of the ultrasonic sensor USSTH = 30cm), then the blind person receives an audio message asking him to move backward one step; otherwise, he takes a stride forward. The Int-Shoe also checks whether there is a fire along the blind person's path. For example, if the value of the temperature sensor is less than the TSTH value, then the blind person receives an audio message asking him to move a stride forward. Fig. 9 describes the navigation algorithm using the ultrasonic sensors. The above thresholds are determined as follows: the ultrasonic sensor threshold is chosen to be 30cm since we measured the step distance of the person who tested the system and found that it is 30cm; thus, the system warns the blind person before he takes a stride forward. The 50Kg load scale sensor's maximum threshold is chosen to be 40Kg since we measured the weight of the person who tested the system and found it to be about 40Kg. However, this value can be changed with respect to the weight of the blind person (different individuals have different weights). In addition, zero is chosen as the minimum threshold for the 50Kg load scale sensor since, when a fall occurs, the pressure on the shoes equals zero. The threshold of the Flex sensor is determined based on the state of this sensor when it is affixed to its base (the pink box), as described in Fig. 8. The threshold of the Flex sensor can be changed to another value, but it should be greater than 0. The principle here is that we bend the Flex sensor to a certain level with respect to its holder (the pink box) such that the value of the sensor is greater than 0, measure the value of the sensor at that position, and consider it the threshold. After that, we monitor any changes in the value of the Flex sensor; if this value becomes greater than the threshold, this is an indicator that a fall has occurred. Fig. 9 also describes the fall detection process. First, the proposed system gathers the required information from the sensors affixed to the Int-Shoe. It reads the LS, LC, and FS values and writes them to the Firebase database on the cloud. Our Android application then reads these values from the Firebase database and checks whether the condition ((FS > FSTH) && (LC == 0) && (LS < LSTH)) holds. If so, this indicates a true fall, because the load measured for the blind person is zero (i.e., LC == 0) while he is wearing his shoes (i.e., LS < LSTH) and the FS is exposed to pressure (i.e., the Flex sensor is bent).


Fig. 9. The pseudo code for the Int-Shoe - navigation and fall detection
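
To make the decision in Fig. 9 concrete, the following minimal sketch expresses the fall-detection check in C-style code, using the thresholds given in Table 1 and in the text; in the actual system, this check is performed by the Android application after reading the sensor values from the Firebase database.

const float FSTH = 957.0f;   // Flex sensor threshold (initial bent state)
const float LSTH = 405.18f;  // light sensor threshold (below it, the shoe is worn)
const float MinLCTH = 0.0f;  // minimum load threshold (about 0 Kg)

// Returns true when a true fall is detected: the Flex sensor is bent beyond
// its threshold, no load is measured on the shoe, and the shoe is worn.
bool isTrueFall(float FS, float LC, float LS) {
  return (FS > FSTH) && (LC <= MinLCTH) && (LS < LSTH); // Fig. 9 uses LC == 0
}

For example, a reading with the Flex sensor above 957, a load of about 0Kg, and a light value of 9 is classified as a true fall, while the same reading with a load of 40Kg is not.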

Fig. 10 describes a sketch of the Firebase database's structure.


Fig. 10. The Firebase database’s structure

As described in Fig. 10, the Firebase database consists of a root node called "LeftShoes" and five sub-nodes. The sub-node "LS-FOD" stands for "Left Shoe Forward Obstacle Detector"; it stores the value read by the ultrasonic sensor that is affixed to the front of the Int-Shoe. The sub-node "LS-Flex" stands for "Left Shoe Flex sensor" and stores the value read by the Flex sensor. The sub-node "LS-Light" stands for "Left Shoe Light sensor" and stores the value read by the light sensor. The sub-node "LS-Scale" stands for "Left Shoe load Scale sensor" and stores the value read by the 50Kg load scale sensor. The sub-node "LS-Temp" stands for "Left Shoe Temperature sensor" and stores the value read by the temperature sensor that is affixed to the left shoe.
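
As an illustration of how the NodeMCU might write these five sub-nodes, the sketch below assumes the widely used FirebaseESP8266 client library with its legacy host/secret initialization (newer library versions use configuration and authentication objects instead); the Wi-Fi and Firebase credentials are placeholders.

#include <ESP8266WiFi.h>
#include <FirebaseESP8266.h> // assumed: the FirebaseESP8266 client library

// Placeholder credentials; replace them with the real project values.
#define WIFI_SSID "your-ssid"
#define WIFI_PASSWORD "your-password"
#define FIREBASE_HOST "your-project.firebaseio.com"
#define FIREBASE_AUTH "your-database-secret"

FirebaseData fbdo;

void setup() {
  Serial.begin(115200);
  WiFi.begin(WIFI_SSID, WIFI_PASSWORD);
  while (WiFi.status() != WL_CONNECTED) delay(500);
  Firebase.begin(FIREBASE_HOST, FIREBASE_AUTH); // legacy-style initialization
}

void loop() {
  // The readings would come from the sensor code shown earlier;
  // fixed values are used here only for illustration.
  float fod = 45.0, flex = 957.0, light = 9.0, scaleKg = 40.0, temp = 25.0;

  // Write each value under the "LeftShoes" root node (Fig. 10).
  Firebase.setFloat(fbdo, "/LeftShoes/LS-FOD", fod);
  Firebase.setFloat(fbdo, "/LeftShoes/LS-Flex", flex);
  Firebase.setFloat(fbdo, "/LeftShoes/LS-Light", light);
  Firebase.setFloat(fbdo, "/LeftShoes/LS-Scale", scaleKg);
  Firebase.setFloat(fbdo, "/LeftShoes/LS-Temp", temp);

  delay(1000);
}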

4. System Evaluation

In this section, we describe the evaluation of the proposed system. In Section 4.1, we evaluate the sensors we use. In Section 4.2, we evaluate the proposed system when all sensors are integrated together. The evaluation is carried out using machine learning.

4.1 Sensors Evaluation

We evaluate the sensors we use in our experiments by calculating the Average Error Rate (AER) and the Accuracy (ACC) of the sensors as described in (2) and (3).

\(AER=\frac{\left|\text{Measured Distance}-\text{Actual Distance}\right|}{\text{Actual Distance}} \times 100\)       (2)

ACC = 100 − AER       (3)
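
As a small illustration of (2) and (3), the helper functions below compute the error rate and accuracy for a single measurement; the numbers in the closing comment are hypothetical.

#include <math.h>

// Average Error Rate (Eq. 2) and Accuracy (Eq. 3) for one measurement.
float errorRate(float measured, float actual) {
  return fabsf(measured - actual) / actual * 100.0f;
}

float accuracy(float measured, float actual) {
  return 100.0f - errorRate(measured, actual);
}

// Hypothetical example: an actual distance of 30cm measured as 29.4cm
// gives errorRate = 2% and accuracy = 98%.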

Table 2 describes the AER and the ACC values of the ultrasonic sensor when the actual distance between an obstacle and the ultrasonic sensor is 5, 10, 15, 20, 25, 30, 35, 40, 45, and 50cm. In Table 2, each value in the "AER" and "ACC" columns is the average over 50 trials, as described in Fig. 11-(A) and (B).

Table 2. The accuracy and the average error rate of the ultrasonic sensor after 50 trials for each distance



Fig. 11. (A) The distance measured by an ultrasonic sensor (B) The accuracy of ultrasonic sensor

In addition, we calculate the AER and the ACC for the load scale sensor, as described in Table 3.

Table 3. The accuracy and the average error rate of the load scale sensor after 10 trials for each weight


Fig. 12-(A) describes the weights measured by the load scale sensor, while Fig. 12-(B) describes the accuracy of this sensor. Note that the accuracy in this figure approaches 1 (i.e., 100%).


Fig. 12. (A) The weights measured by the load scale sensor (B) The accuracy of the load scale sensor

In addition, we measure the amount of light captured by the light sensor when the blind person wears his shoes and when he takes them off in an indoor environment (i.e., inside a room). The average of 50 values recorded by the light sensor when the blind person takes off his shoes is 405.18, while the average of 50 values recorded when he wears his shoes is 9. Therefore, in this paper, we assume that the shoes are taken off when the value of the light sensor is 405.18 or higher (since 405.18 is the average recorded with the shoes off). Fig. 13 describes the threshold line of the light sensor.


Fig. 13. The values measured by the light sensor when a blind person wears and takes off her or his shoes.

In this paper, if the light sensor value is ≥ 405.18, the blind person has taken off his shoes, and if the light sensor value is ≤ 9, the blind person is wearing his shoes. In addition, we evaluate the Flex sensor for various positions. As mentioned earlier in Section 3.5, we affix the Flex sensor to its base (the pink plastic base), and the value of the Flex sensor at that position is 957. We test the Int-Shoe when a fall occurs for 50 trials. For each trial, the Flex sensor is bent at a specific angle, and the average of the 50 values recorded by the Flex sensor is 973.6. This is an indicator that when the Flex sensor value is greater than 957, a fall has occurred, as described in Fig. 14. These sensors are also tested later in this section when they are integrated together in order to determine the type of fall: various values of these sensors are recorded, and machine learning is used to determine whether a fall is true or false.


Fig. 14. The values measured by the Flex sensor for fall and non-fall cases

4.2 System Evaluation

In this section, we show that the proposed IoT system efficiently detects true and false falls using three sensors, namely, the light sensor, the 50Kg load scale sensor, and the Flex sensor. Machine learning techniques are also used in the evaluation process. Fig. 15 describes the Firebase database when the proposed system is tested for true and false falls. Fig. 15-(A) describes a false fall. The Flex sensor value is greater than the Flex sensor threshold (FSTH), which indicates that a fall occurred since the Flex sensor is bent.


Fig. 15. The Firebase database when (A) a false fall is recorded (B) a true fall is recorded.

In addition, the value of the light sensor is less than the light sensor threshold (LSTH), which indicates that the blind person is wearing his shoes. However, this case is considered a false fall, and no message is sent to the caretakers. This is because LS-Scale = 40Kg, which means that the blind person is still standing. Fig. 15-(B) describes a true fall. The value of the Flex sensor is greater than the Flex sensor threshold (FSTH), which indicates that a fall occurred since the Flex sensor is bent. In addition, the value of the light sensor is less than the light sensor threshold (LSTH), which indicates that the blind person is wearing his shoes. Moreover, the value of the 50Kg load scale sensor (LS-Scale) is zero, which indicates that the blind person exerts no pressure on the shoes, and thus a true fall has occurred. The sensor values that appear in Fig. 15-(B) are saved in the Firebase database after a fall occurs, as described in Fig. 16.


Fig. 16. A True fall

Fig. 17 shows our Android application, which is developed to assist blind people. The values displayed on the tablet in Fig. 17 are the sensor values recorded when a true fall occurs, as described in Fig. 15-(B). In this case, the proposed IoT system sends a text message to the blind person's caretakers informing them about the fall.


Fig. 17. The sensor values as they appear on the blind person's tablet when a true fall occurs.

Table 4 describes the evaluation of various scenarios of falls, which are tested by the Int-Shoe.

Table 4. Various cases of falls recorded by the Int-Shoe


In Table 4, there are eight cases (C1-C8) detected by the proposed system that are considered false falls. The explanations for these cases are given in Table 4. The proposed system detects one case (case 9) and considers it a true fall. In addition, machine learning techniques are used for evaluating the proposed system. To do so, a sample of 400 cases of false and true falls (recorded by the Int-Shoe) is used. Each entry in the sample consists of three sensor values: LS, LC, and FS (see Table 1). Each entry is classified as a true or false fall. The sample is divided into a training set and a testing set (80% of the sample for training and 20% for testing). In the experiment, four classification models are used, namely, Logistic Regression (LR), K-Nearest Neighbors (KNN), Support Vector Machine (SVM), and Random Forest (RF). Table 5 describes a comparison of the above models based on accuracy.

Table 5. The evaluation of LS, LC, and FS sensors for various scenarios of falls tested by various machine learning models.


As described in Table 5, the accuracy is very high (e.g., F1 = 0.96 in Table 5) for all models. Fig. 18 describes the above results.


Fig. 18. A comparison of four classification models for values recorded by three sensors, namely, LS, LC, and FS based on accuracy.

The results show that the proposed system (Int-Shoe) efficiently distinguishes between true and false falls. Alternatively, in order to analyze individual walking and falling actions, the above-collected data (as described in Table 4) can be stored in Cloud Storage instead of the Firebase database. Once these data are in Cloud Storage, they can be quickly connected to Google Cloud's tools to construct a data warehouse with "BigQuery", conduct open-source analytics with "Dataproc", or build and deploy Machine Learning (ML) models using Vertex AI [41]. The Int-Shoe may also be enhanced with multiple Flex sensors affixed to various positions of the Int-Shoe, as described in Fig. 19. There are some cases in which the Flex sensor may accidentally bend when the blind person's foot is lifted off the ground while walking, as described in Fig. 20. In this case, and in order to avoid false alarms, an ultrasonic sensor (RFS) is affixed to the bottom of the right shoe and another ultrasonic sensor (LFS) is affixed to the bottom of the left shoe. While walking, one of the shoes touches the ground while the other is lifted off it. When a shoe touches the ground, the value of its ultrasonic sensor is greater than or equal to 2000. However, when a shoe is lifted off the ground, the value of its ultrasonic sensor is in the range of 15-17cm.


Fig. 19. An Int-Shoe with multiple Flex sensors affixed to various positions.


Fig. 20. A Flex sensor may accidentally bend when the blind person's foot is lifted off the ground

Now, if the Flex sensor is bent accidentally when the blind person's foot is lifted off the ground, the proposed system checks the values of both sensors, namely, the RFS and the LFS, and makes a decision as follows: if, for example, the LFS value is greater than or equal to 2000 and the RFS value is in the range of 15-17cm, then this case is a false fall, and vice versa. However, if both values (the RFS and the LFS values) are less than 2000, this is an indicator that a true fall has occurred, since the blind person's feet are lifted off the ground. Another interesting case of fall to study is when the feet remain on the ground after the action of falling. Fig. 21 describes this case of a true fall.


Fig. 21. A true fall - the feet remain on the ground after the action of falling.

In this case, the light sensor value is ≤ 9, the Flex sensor value equals its threshold (957), and the load scale sensor value is less than 50 and greater than 0. In this case, a fall has occurred. However, this is a special case, and it can be treated as follows: since there are two ultrasonic sensors (RFS and LFS) affixed to the shoe's bottom, both ultrasonic sensors touch the ground surface in this case, and thus the value of both sensors is greater than or equal to 2000. The proposed system treats this case as follows: if ((FS == FSTH) && (LC > 0) && (LS < LSTH) && (RFS ≥ 2000) && (LFS ≥ 2000)), then this case is considered a true fall. In addition, when a true fall occurs, we may include the user's location in the message sent to the caretakers in order to facilitate finding his position. This is possible by using the "LILYGO®TTGO SIM7600E-H module ESP32-WROVER-B chip Wi-Fi Bluetooth 18650 battery holder solar charger development board". This board provides the following:

1) A microcontroller (the TTGO ESP8266 from LilyGO is a microcontroller based on an ESP8266 (NodeMCU)) [42].

2) Data collected by the Global Positioning System (GPS), such as latitude and longitude.

3) SMS messages that are sent via the cellular network.

We need to perform the following steps before using the above board:

1) Insert a Nano SIM card into the micro SD card slot of the LILYGO board (this step is necessary for sending SMS messages).

2) Connect the antennas (the LTE and GPS antennas) to the LILYGO board. In addition, using the above board requires installing some libraries for the Arduino code, including the TinyGSM library and StreamDebugger [43].
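
The following sketch outlines how such an SMS, including the GPS position, might be sent, assuming the TinyGSM library and a SIM7600-class modem; the serial port, phone number, and initialization details are placeholders, and the exact calls should be checked against the TinyGSM examples for this board [43].

#define TINY_GSM_MODEM_SIM7600
#include <TinyGsmClient.h>

#define SerialAT Serial1 // placeholder: the serial port wired to the modem
TinyGsm modem(SerialAT);

const char CARETAKER_NUMBER[] = "+0000000000"; // placeholder phone number

void sendFallAlert() {
  float lat = 0, lon = 0;
  modem.enableGPS();
  if (modem.getGPS(&lat, &lon)) {
    String msg = String("A fall was detected. Location: ") +
                 String(lat, 6) + ", " + String(lon, 6);
    modem.sendSMS(CARETAKER_NUMBER, msg); // SMS via the cellular network
  }
}

void setup() {
  SerialAT.begin(115200);
  modem.restart(); // initialize the modem (a Nano SIM card must be inserted)
}

void loop() {
  // sendFallAlert() would be called when a true fall is confirmed.
}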

5. Conclusion

One of the most important issues to be taken into consideration when designing an IoT system for blind people is how to precisely detect true falls. When a fall occurs, a message should be sent to the caretakers. One of the gaps in the previous works is that they were unable to distinguish between true falls and false falls; in other words, they did not distinguish between the fall of a wearable device and the fall of the blind person. Therefore, in the previous systems, the caretakers may receive false alarms, because the fall of a wearable device is considered a true fall. Sending false alarms to the caretakers makes them unhappy with the IoT system, and thus they may reject it. As an attempt to close the above gap, this paper proposed an intelligent shoe (Int-Shoe) that is able to precisely distinguish between false falls and true falls, and thus avoids sending false alarms to the caretakers. The design of the Int-Shoe is based on three sensors, namely, the 50Kg load scale sensor, the light sensor, and the Flex sensor. These sensors were tested and validated, and their accuracy and thresholds were measured. The proposed Int-Shoe was evaluated in an indoor environment using four machine learning models, and the results showed that the Int-Shoe is able to precisely distinguish between true falls and false falls with an accuracy of 0.96 (96%). The NodeMCU module, which uses the Wi-Fi protocol, and the Firebase database were used in the proposed IoT system. To the best of our knowledge, no previous work proposed an intelligent shoe that integrates the functionality of three sensors, namely, the Flex sensor, the light sensor, and the 50Kg load scale sensor, in order to distinguish between false falls and true falls. In future work, we plan to extend the proposed system to work in an outdoor environment by using the Global System for Mobile Communications (GSM). To achieve this goal, we may use the "LILYGO®TTGO SIM7600E-H module ESP32-WROVER-B chip Wi-Fi Bluetooth 18650 battery holder solar charger development board". In addition, artificial intelligence can be used for detecting other objects such as traffic lights, as described in [44], in order to protect the blind person from various hazards.

Acknowledgement

The author would like to thank Al-Zaytoonah University of Jordan for their encouragement and for their financial support.

References

  1. BrailleWorks "Famous People with Visual Impairments," North America, Accessed on. 30-01-2022. [Online] Available: https://brailleworks.com/braille-resources/famous-people-with-visualimpairments/
  2. M. Rajeswari, G.G. Lakshmi, B.S. Rubavathy, and G. Sharmila, "Developing an assistive aid for blind people using IOT," International Journal of Innovative Research in Science, Engineering and Technology IJRASET, vol. 7, no. 2, pp. 124-131, 2018.
  3. K. Vasanth, M. Macharla and R.A. Varatharajan, "A Self Assistive Device for Deaf & Blind People Using IOT," J Med Syst., vol. 43, no. 88, pp. 1-8, 2019. https://doi.org/10.1007/s10916-018-1115-2
  4. K. Bineeth, S. Raju and E.S. Frode, "Tools and Technologies for Blind and Visually Impaired Navigation Support: A Review," IETE Technical Review, vol. 39, no. 1, pp. 3-18, 2022. https://doi.org/10.1080/02564602.2020.1819893
  5. X. Liu and O. Baiocchi, "A comparison of the definitions for smart sensors, smart objects and Things in IoT," in Proc. of 2016 IEEE 7th Annual Information Technology, Electronics and Mobile Communication Conference (IEMCON), Vancouver, BC, pp. 1-4, Oct 2016.
  6. S. Madakam, R. Ramaswamy, S. Tripathi, "Internet of Things (IoT): A literature review," J. Computer and Communication, vol. 3, no. 5, pp. 164-173, 2015. https://doi.org/10.4236/jcc.2015.35021
  7. S. Chen, H. Xu, D. Liu, B. Hu and H. Wang, "A vision of IoT: applications, challenges, and opportunities with china perspective," IEEE Internet of Things Journal (IoT-J), vol. 1, no. 4, pp. 349-359, 2014. https://doi.org/10.1109/JIOT.2014.2337336
  8. N. Sahoo, H.W. Lin, and Y.H. Chang, "Design and Implementation of a Walking Stick Aid for Visually Challenged People," Sensors (Basel, Switzerland), vol. 19, no. 1, pp. 1-17, 2019. https://doi.org/10.3390/s19010130
  9. O.O Olakanmi, "A Multidimensional Walking Aid for Visually Impaired Using Ultrasonic Sensors Network with Voice Guidance," I.J. Intelligent Systems and Applications(IJISA), vol. 6, no. 8, pp. 53-59, 2014. https://doi.org/10.5815/ijisa.2014.08.06
  10. L. Chen, J. Su, M. Chen, W. Chang, C. Yang and C. Sie, "An Implementation of an Intelligent Assistance System for Visually Impaired/Blind People," in Proc. of 2019 IEEE International Conference on Consumer Electronics (ICCE), Las Vegas, NV, USA, pp. 1-2, Jan 2019.
  11. B. Jiang, J. Yang, Z. Lv and H. Song, "Wearable Vision Assistance System Based on Binocular Sensors for Visually Impaired Users," IEEE Internet of Things Journal(IoT-J), vol. 6, no. 2, pp. 1375-1383, 2019. https://doi.org/10.1109/JIOT.2018.2842229
  12. M. Zhou, W. Li and B. Zhou, "An IoT System Design for Blind," in Proc. of The 14th Web Information Systems and Applications Conference (WISA), Liuzhou, China, pp. 90-92, Nov 2017.
  13. V. Bhatnagar, R. Chandra and V. Jain, "IoT Based Alert System for Visually Impaired Persons," in Proc. of Emerging Technologies in Computer Engineering: Microservices in Big Data Analytics. ICETCE 2019, pp. 216-223, May 2019.
  14. LA Metro, "Wayfindr LA Metro Trial Report,". [Online]. Available: http://www.wayfindr.net/wp-content/uploads/2020/01/Wayfindr-LA-Metro-Trial-Report.pdf, Accessed on 20-05-2021, 2019.
  15. Md. Adil, T. Rafa, J. Ferdoush, A. Mahmud and A. Pathak, "An IoT based Voice Controlled Blind Stick to Guide Blind People," International journal of engineering inventions(IJEI), vol. 9, no. 1, pp. 9-14, 2020.
  16. M.S. Abdul, J. Ebin, P.M. Shibil, and A. Akmal, "Effective Fast Response Smart Stick for Blind People," International journal of engineering research & technology (IJERT) RTESIT - 2019, vol. 7, no. 8, pp 1-7, 2019.
  17. M. Dimpal, P. Meshram, S. Bhelkar, S. Padhye, S. Kotangale and R. Deshmukh, "Blind's Mate - A Navigation System for Blind People," International Journal of Scientific Research in Science and Technology(IJSRST), vol. 6, no. 2, pp. 230-235, 2019. https://doi.org/10.32628/IJSRST196244
  18. M. P. Agrawal and A. R. Gupta, "Smart Stick for the Blind and Visually Impaired People," in Proc. of 2018 Second International Conference on Inventive Communication and Computational Technologies (ICICCT), Coimbatore, Coimbatore, India, pp. 542-545, Sep 2018.
  19. N. Loganathan, K. Lakshmi, N. Chandrasekaran, S. R. Cibisakaravarthi, R. H. Priyanga and K. H. Varthini, "Smart Stick for Blind People," in Proc. of the 6th International Conference on Advanced Computing and Communication Systems (ICACCS), Coimbatore, India, pp. 65-67, Apr 2020.
  20. S. Subbiah, S. Ramya, G. Parvathy Krishna and S. Nayagam, "Smart Cane For Visually Impaired Based On IOT," in Proc. of the 3rd International Conference on Computing and Communications Technologies (ICCCT), Chennai, India, pp. 50-53, Feb 2019.
  21. L.T. Choong, and M.K.V. Vardhana Reddy, "Smart Assisting Visually Impaired Stick (SAVIS)," International Journal of Engineering Research & Technology (IJERT), vol. 8, no. 12, pp.78-84, 2019. https://doi.org/10.17577/IJERTV8IS120045
  22. N. S. Khan, A. A. Gaud, H. R. Khanvilkar and R. N. Shelar, "Design and modification of smart stick for visually impaired with stick finder," International Journal of Advance Research, Ideas and Innovations in Technology(IJARIIT). vol. 6, no. 2, pp. 326-328, 2020.
  23. Md. Wahidur, R. Islam and M. Harun-ar-Rashid, "IoT based Blind Person's Stick," International Journal of Computer Applications (IJCA), vol. 182, no. 16, pp. 19-21, 2018. https://doi.org/10.5120/ijca2018917824
  24. H. Sharma, M. Tripathi, A. Kumar and M. S. Gaur, "Embedded Assistive Stick for Visually Impaired Persons," in Proc. of the 9th International Conference on Computing, Communication and Networking Technologies (ICCCNT), Bangalore, pp. 1-6, Oct 2018.
  25. W. J. Chang, L. B. Chen, M. C. Chen, J. P. Su, C. Y. Sie and C. H. Yang, "Design and Implementation of an Intelligent Assistive System for Visually Impaired People for Aerial Obstacle Avoidance and Fall Detection," IEEE Sensors Journal, vol. 20, no. 17, pp. 10199-10210, 2020. https://doi.org/10.1109/JSEN.2020.2990609
  26. M. Ramamurthy, V. Naveen, K. Mithun, and R. Nivas, "IOT Based Assistive Technologies for Visually Impaired Persons," International Journal of Pure and Applied Mathematics (IJPAM), vol. 119, no. 12, pp 647-652, 2018.
  27. A. Ashraf, S. Noor, M. Arslan Farooq, A. Ali, and A. Hasham, "Iot Empowered Smart Stick Assistance for Visually Impaired People," International Journal of Scientific and Technology Research (IJSTR), vol. 9, no. 10, pp 356-360, 2020.
  28. K. Niveditha, P D. Kavya, and P. Nivedha, "Virtual Eye for Blind using IOT," International Journal of Engineering Research & Technology (IJERT), vol. 8, no. 11, pp. 116-121, 2020. https://doi.org/10.22214/ijraset.2020.30831
  29. P. S. Ranaweera, S. H. R. Madhuranga, H. F. A. S. Fonseka and D. M. L. D. Karunathilaka, "Electronic travel aid system for visually impaired people," in Proc. of the 5th International Conference on Information and Communication Technology (ICoIC7), Melaka, Malaysia, pp. 1-6, May 2017.
  30. N. A. Kumar, Y. H. Thangal and K. S. Beevi, "IoT Enabled Navigation System for Blind," in Proc. of 2019 IEEE R10 Humanitarian Technology Conference (R10-HTC)(47129), Depok, West Java, Indonesia, pp. 186-189, Nov 2019.
  31. H. S. Al-Khalifa, "Utilizing QR Code and Mobile Phones for Blinds and Visually Impaired People," in Proc. of LNCS 5105, pp. 1065-1069, 2008.
  32. Y. Zhao, E. Kupferstein, B.V. Castro, S. Feiner, and S. Azenkot, "Designing AR Visualizations to Facilitate Stair Navigation for People with Low Vision," in Proc. of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST '19), New York, NY, USA, pp. 387-402, Oct 2019.
  33. S. Agrawal, S. Vaval, V. Chawla, Ms. N. Agrawal, and Ms. K. Namdev, "Smart Blind Helping Stick Using IoT and Android," International Research Journal of Modernization in Engineering Technology and Science(IRJMETS), vol. 2, no 4, pp. 1045-1049, 2020.
  34. M.M. Rahman, M.M. Islam, S. Ahmmed, and A.K., Saeed, "Obstacle and Fall Detection to Guide the Visually Impaired People with Real Time Monitoring," SN COMPUT. SCI., vol. 1, no. 219, pp 1-10, 2020. https://doi.org/10.1007/s42979-019-0007-y
  35. AJ. Ramadhan, "Wearable Smart System for Visually Impaired People," Sensors (Basel), vol. 18, no. 3. pp 1-13, 2018. https://doi.org/10.3390/s18030843
  36. A. Anmol, Gayatri Sakya, and Suyash Verma, "Internet of Things Care Device for Visually Impaired and Old People," Journal of Information Technology Management((JITM)), Special Issue: Security and Resource Management challenges for Internet of Things, Vol. 14, pp. 132-146, 2022.
  37. A. Abusukhon, "IOT Bracelets for Guiding Blind People in an Indoor Environment," Journal of Communications Software and Systems(JCOMS), vol. 19, no. 2, pp. 114-125, 2023.  https://doi.org/10.24138/jcomss-2022-0160
  38. W. Chen, Z. Jiang, H. Guo, and X. Ni, "Fall Detection Based on Key Points of Human-Skeleton Using OpenPose," Symmetry, vol. 12, no. 5, pp. 744, 2020.
  39. S. Hosseini, "Getting Started w/ NodeMCU ESP8266 on Arduino IDE," Accessed on. 09-09-2021. [Online]. Available: https://electropeak.com/learn/nodemcu-esp8266-on-arduino-ide/
  40. J.Hrisko, "Maker Portal- Capacitive Soil Moisture Sensor Calibration with Arduino," 2019, Accessed on. 10-10-2021. [Online]. Available: https://makersportal.com/blog/2019/5/12/arduino-weighing-scale-with-load-cell-and-hx711
  41. "Google Cloud," Accessed on. 16-07-2023. [Online]
  42. "Tiny Tronics," Accessed on. 22-07-2023. [Online]. Available: https://www.tinytronics.nl/shop/en/lilygo-ttgo-esp8266-with-0.91-inch-oled-display#:~:text=Description,micro%20USB%20cable%20not%20included
  43. Random Nerd Tutorials, "Getting Started with LILYGO T-SIM7000G ESP32 (LTE, GPRS, and GPS)," Accessed on. 22-07-2023. [Online]. Available: https://randomnerdtutorials.com/lilygo-t-sim7000g-esp32-lte-gprs-gps/
  44. A. Abdel Qader, "A New Novel Hybrid Dynamic Color Segmentation Model for Road Signs in Noisy Conditions," International Journal of Software Innovation (IJSI), vol. 9, no. 3, pp. 1-22, 2021. https://doi.org/10.4018/IJSI.2021070101