Heliyon. 2024 Feb 29;10(5):e26987. doi: 10.1016/j.heliyon.2024.e26987

Feedback control of automatic navigation for cyborg cockroach without external motion capture system

Mochammad Ariyanto a,b, Chowdhury Mohammad Masum Refat a, Kotaro Yamamoto a, Keisuke Morishima a
PMCID: PMC10915385  PMID: 38449606

Abstract

Due to their size and locomotion ability, cockroaches are favorable as hybrid robot platforms in search and rescue (SAR) missions. However, cockroaches tend to approach corner areas and stay there for an uncertain time. This natural behavior hinders the utilization of cyborg cockroaches in SAR missions under rubble and in unstructured, unknown areas. Therefore, we proposed onboard automatic obstacle avoidance and human detection that can run on the wireless backpack stimulator without an external motion capture system. A low-power, small-size Time-of-Flight (ToF) sensor was selected for distance measurement, while a low-resolution thermopile array sensor was applied for human presence detection. The implemented feedback control based on the IMU and ToF sensors successfully navigated the cyborg cockroach to avoid obstacles and escape from sharp corners in an unstructured laboratory area without stopping or being trapped. It could also recognize human presence in real time when a person was in front of it. Owing to its performance, the random forest classifier was implemented as the embedded human detection system. It achieved the highest accuracy at a distance of around 25 cm (92.5%) and the lowest accuracy at about 100 cm (70%).

Keywords: Cyborg insect, Obstacle avoidance, Human detection, Embedded machine learning

1. Introduction

Insects have been combined with mechatronic systems to create insect-computer hybrid robots, or cyborg insects, primarily for urban search and rescue (SAR). These hybrid cm-scale robots have excellent locomotion and agility in unknown environments. Cockroaches have been selected as the main platform of hybrid robots due to their size and locomotion [[1], [2], [3], [4], [5], [6], [7], [8], [9], [10]]. If overturned, they can self-right on the ground [11]. They can also avoid or climb over obstacles and follow obstacle walls. These abilities make cockroaches an ideal cyborg insect platform for SAR missions under rubble and in unknown environments.

Besides the advantages of cockroaches in terms of agility and locomotion, some of their natural characteristics pose a challenge for real-time utilization as hybrid robots in SAR missions. Cockroaches are attracted to dark/sheltered areas and stay still after reaching them [12]. They also tend to walk into narrow spaces or corner areas. After reaching a dark place, narrow space, or corner area, they will stay still and not move for some period [[13], [14], [15]], as depicted in Fig. 1(a). They prefer to walk along walls or obstacles and mostly stop for some time at corners. Their movement also becomes sluggish at lower temperatures (below 70 °F, about 21 °C) [16]. To overcome these issues, researchers have studied how to optimize cockroach hybrid robot motion by implementing certain algorithms with an inertial measurement unit (IMU) sensor. The IMU is the most widely used sensor on insect hybrid robots for capturing their kinematics [1,2,4,9,[17], [18], [19], [20], [21], [22], [23], [24]]. A previous study applied predictive feedback control to a cockroach hybrid robot based on the linear and angular velocities obtained from the IMU sensor. With this strategy, the cockroach could be prevented from stopping near the walls of tall obstacles [9]. A machine learning technique incorporating the IMU sensor has also been implemented to optimize the movement of cyborg cockroaches in a circular bounded space by applying automatic stimulation to the cerci. That study demonstrated that the cyborg cockroach could autonomously avoid and follow obstacles/walls in the bounded space, and that its movement could be increased while the stimulation time was reduced to prevent fatigue [1].

Fig. 1. Natural cockroach and hybrid robot movements in unstructured environments: (a) A natural cockroach walks from the green starting area toward the wall of the obstacle/rocks. The red dashed arrow indicates the cockroach path. After reaching the wall, it follows it, approaches the corner, and stays still for an uncertain time. This natural behavior will hinder the utilization of cyborg cockroaches in SAR missions. (b) A wireless backpack with IMU and ToF distance sensors is mounted on the cockroach. By implementing the feedback loop from the sensors, the cyborg cockroach can avoid obstacles and escape from sharp corners.

However, research on cyborg insect obstacle avoidance in corner areas has not been extensively explored. The cockroach will likely go to a corner and stay still for a certain period. Previous studies used simple manual obstacle avoidance, turning left/right via antenna stimulation to avoid the obstacle in front of the cyborg [2,25]. A camera/motion capture system has been implemented as a feedback position sensor for autonomous navigation of the cyborg cockroach along a desired trajectory in free space [10,20]. However, implementing a camera for position feedback and obstacle detection in front of the cyborg will be difficult in SAR missions. LiDAR (Light Detection and Ranging) cannot be attached to the cyborg cockroach due to its large size and high power requirement. Therefore, integrating a small, low-power distance sensor on the cyborg is a suitable solution for onboard automatic obstacle avoidance in corner areas, as shown in Fig. 1(b).

Human detection is one of the most critical factors for implementing cyborg insects in SAR missions. A wireless camera has been developed and implemented for the cyborg cockroach [8]. It streamed live video via Wi-Fi to a PC. This approach suits SAR because the cyborg insect can be steered manually and a human can be recognized from the PC without a human detection algorithm. However, the system consumes considerable power (80 mA–260 mA), and it is not suitable in dark areas or without light [8,26]. Iyer et al. developed a low-power mini camera mounted on a beetle whose live video could be sent via Bluetooth; however, manual operation to guide the beetle along a desired trajectory/path was not studied [27]. Latif et al. implemented directional and omnidirectional microphones on a cyborg cockroach as sound localization sensors for SAR missions. This system was able to steer the cockroach toward the sound source, but it will be challenging to embed in an onboard wireless backpack due to its heavy computation and high power consumption (63 mW for the microphone array) [6,28]. Tran-Ngoc et al. developed onboard human detection mounted on the cyborg cockroach using an infrared (IR) thermopile array sensor. This sensor requires 21.45 mW on average and outputs a 32 × 32 thermal image with a 90° × 90° field of view. A Histogram of Oriented Gradients (HOG) and a support vector machine (SVM) were selected for feature extraction and classification of the obtained thermal image, and the onboard human detection algorithm achieved 87% accuracy [9]. However, HOG generates large feature vectors from an input image, and its computation is quite complex; moreover, the steering decision for the cyborg was calculated based on an external motion tracking system.

Developing an intelligent cyborg cockroach that can detect human presence and avoid sharp corners will be vital in SAR applications. This study presents the first demonstration of a cyborg cockroach capable of onboard obstacle avoidance, embedded human detection, and wireless communication without external tracking or computer vision feedback. An intelligent low-power wireless backpack was equipped with cockroach stimulation control, an IMU sensor, a Time-of-Flight (ToF) distance sensor, and a mini 8 × 8 thermopile array sensor. An interactive user interface (UI) was developed to ease cyborg insect operation. Eight features and a random forest classifier were applied to process the obtained 8 × 8 thermal image, and this human detection algorithm was embedded into the 32-bit microcontroller on the wireless backpack. Simple feedback control for onboard obstacle avoidance was implemented to automatically stimulate the cockroach's antenna for turning motion based on the measured obstacle distance. Automatic feedback stimulation of the cerci was provided based on the linear acceleration obtained from the IMU sensor to prevent the cockroach from stopping or being trapped in corner/narrow areas. The proposed hybrid robot with its compact intelligent backpack was designed to automatically avoid obstacles, especially corners and narrow areas. Moreover, the cyborg insect was applied to recognize human presence and send the obtained data/recognition result to the remote PC in real time.

2. Material and methods

2.1. Cyborg insect

One of the main challenges for the cyborg insect is the wireless stimulator backpack. Wireless communication is vital for sending sensor data to the PC and receiving commands/data from the remote PC. In this study, the SMD version of the nRF24L01 was selected as the wireless module due to its simplicity and low power consumption. A GY-955 9-axis IMU consisting of an accelerometer, gyroscope, and magnetometer was chosen to measure the kinematics of the cyborg cockroach. A small 32-bit microcontroller (48 MHz, 32 KB of SRAM, and 256 KB of flash memory) from Seeed Studio (21 × 17.5 mm) was applied as the central computation unit of the proposed intelligent backpack. A Time-of-Flight (ToF) laser-ranging module (VL53L0X) from STMicroelectronics was selected to measure the obstacle distance in front of the cyborg insect. This tiny sensor (4.4 mm × 2.4 mm × 1.0 mm) provides accurate distance measurement and consumes 20 mW, making it suitable for onboard obstacle avoidance. Human detection was performed using a low-resolution thermopile array sensor (AMG8833) from Panasonic. It measures 8 × 8 pixel thermal images with a viewing angle of 60° and a power consumption of 14.85 mW. The thermal image sensor measures 12.6 mm × 9.0 mm × 4.7 mm and was joined vertically to the wireless backpack. The obstacle distance and thermal image sensors must be mounted vertically on the backpack with a printed circuit board (PCB). Minimal electronic components for these two sensors were used to develop small PCBs (Fig. 1, Supplementary Information) that can be vertically connected to the backpack. Other custom PCBs were developed to join the wireless module, stimulation terminals, and IMU sensor (Fig. 2, Supplementary Information). The final assembly of the wireless backpack that can be mounted on the cyborg cockroach is depicted in Fig. 2. The total weight and size of the backpack, excluding the battery, are 6.92 g and 36 mm × 12 mm × 7 mm, respectively. Based on our observation, the cockroach can easily carry the proposed backpack without difficulty. A 30 mAh lithium polymer (LiPo) battery with a size of 11.5 mm × 19.6 mm × 3.9 mm and a weight of 1.32 g was applied to power the backpack. The battery can provide continuous power for around 25 min.

Fig. 2. Developed cyborg insect: (a) Wireless intelligent backpack. The obstacle avoidance algorithm employs feedback measurement of obstacle distance from the ToF distance sensor and longitudinal acceleration from the IMU sensor. Human presence detection applies embedded machine learning using thermal images measured by a low-resolution thermopile array sensor. (b) Cyborg cockroach with the backpack. The low-power wireless intelligent backpack, incorporating human presence detection and obstacle avoidance, is mounted on the cockroach as the cyborg insect platform.

The Madagascar hissing cockroach (Gromphadorhina portentosa), with a length of around 6 cm, was selected as the hybrid robot platform. Platinum wire (A-M Systems) with a diameter of 0.127 mm was implemented as the electrode material and inserted into the antennae, thorax, and cerci. We used the same electrode implantation procedure as in previous studies [2,24]. Copper wire (32 AWG) connected the platinum electrodes to the female header pins. A square waveform signal of 3.3 V with a 50% duty cycle and a frequency of 50 Hz was generated by the microcontroller and transmitted to the stimulation pins. The ground/reference electrode was implanted in the thorax at a depth of about 5 mm. Monophasic stimulation signals were given to the antennae and cerci. Stimulation of the right antenna causes the cockroach to turn left and vice versa, while forward movement is achieved by stimulating the cerci.
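For illustration only, the sketch below reproduces the described stimulation waveform host-side; the variable names and the 100 ms window are hypothetical, and the actual signal is generated by the microcontroller firmware.

```python
import numpy as np

AMPLITUDE_V = 3.3    # stimulation amplitude, as described in the text
FREQ_HZ = 50.0       # stimulation frequency
DUTY = 0.5           # 50% duty cycle
FS = 10_000          # host-side sampling rate for visualization (assumed)

t = np.arange(0.0, 0.1, 1.0 / FS)                     # 100 ms window
phase = (t * FREQ_HZ) % 1.0                           # position within each period
waveform = np.where(phase < DUTY, AMPLITUDE_V, 0.0)   # monophasic square wave
```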

The cyborg cockroach was tested for onboard obstacle avoidance and human presence detection in an unstructured environment consisting of corners and tall obstacles. The system was developed without an external camera or motion tracking system, and the human presence detection algorithm was embedded into the low-power 32-bit microcontroller. The overall cyborg system is shown in Fig. 3. An interactive user interface (Fig. 3, Supplementary Information) was developed in Python for easy, user-friendly operation of the proposed cyborg insect system. Measurements from the IMU and ToF sensors, as well as the calculated features from the thermal image sensor, were sent to the UI via the wireless receiver. The operation mode (automatic or manual) of the cyborg insect could be selected through the developed UI.

Fig. 3. Overall system overview. The system consists of a cyborg cockroach with a wireless intelligent backpack, a user interface, a wireless transmitter-receiver, and an experimental field. Measured sensor data and human classification results are sent wirelessly to the PC via the wireless receiver. Manual or automatic operation of the cyborg cockroach can be selected on the UI. The experimental field includes corners and obstacles. The cyborg cockroach is navigated by a feedback control loop to avoid obstacles and escape from the corners. When it escapes from the experimental field, it recognizes a human presence in front of it. The black dashed line indicates the cockroach trajectory from the starting area (green circle).

2.2. Embedded human recognition

A human detection system is essential for finding survivors under the rubble of a structural collapse after an earthquake. An infrared (IR) thermopile array sensor is suitable for this mission since it can work in lightless conditions or at night; it measures the temperature difference between the human body and the surroundings. Continuously sending temperature images from the backpack to the PC via a wireless communication device would drain a lot of power; therefore, onboard human detection running on the wireless backpack is preferred [9]. Developing onboard human detection on a low-power wireless backpack is challenging because the algorithm must fit the limited size, low power budget, and limited computational capability of the electronic backpack. Most previous studies on human detection or human activity recognition utilized large IR thermopile sensor modules, and the algorithms ran on single-board computers (SBCs) or personal computers (PCs) [[29], [30], [31], [32], [33], [34], [35], [36], [37], [38], [39]]. Computationally demanding approaches such as machine learning with advanced features and deep learning can run smoothly on SBCs and PCs without concerns about size, power, or computational complexity.

In this study, onboard human detection running on the electronic backpack was developed using an extremely low-resolution IR thermopile array sensor and machine learning. A low-power 32-bit microcontroller (48 MHz, 32 KB of SRAM, and 256 KB of flash memory) was applied as the computation center of onboard human detection. The human detection algorithm needs to be efficient and computationally small to be embedded in the microcontroller. The obtained 8 × 8 pixel thermal images are processed with time-domain features to extract meaningful information. The 2D array of 8 × 8 pixels (Fig. 4(a)) obtained from the thermopile array sensor was converted into a 1D array (Fig. 4(b)) of sensor data. The obtained 1D array of the thermal image (Xt) is expressed in Equation (1), where n is the number of pixels in the thermal image. Eight time-domain features were chosen for feature extraction from the 1D thermal image array (Xt). These time-domain features combined with machine learning have been utilized to classify finger/hand motions from electromyography signals with high accuracy [40,41]. After calculating the eight time-domain features, embedded machine learning is applied to classify between human and environment/no human. The flow diagram of the proposed onboard human detection is presented in Fig. 4(c).

$X_t = [x_1 \; x_2 \; \cdots \; x_{n-1} \; x_n]$ (1)
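As a minimal host-side illustration (assuming a NumPy array stands in for the AMG8833 frame; the variable names are hypothetical), the 2D-to-1D conversion of Equation (1) amounts to a row-major flatten:

```python
import numpy as np

# Placeholder 8x8 thermal frame in degrees Celsius; the real frame
# comes from the AMG8833 thermopile array sensor.
frame_2d = np.random.uniform(24.0, 36.0, size=(8, 8))

# Row-major flattening yields the 1D array X_t = [x_1, ..., x_64] of Equation (1).
x_t = frame_2d.flatten()
assert x_t.shape == (64,)
```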

Fig. 4. Human detection method: (a) 2D array of 8 × 8 pixels. (b) Converted 1D array. (c) Onboard human recognition. A thermal image of a human face (64 pixels) was measured at a distance of 0.5 cm. The 2D array of pixels was converted into a 1D array for feature extraction and calculation purposes. The machine learning classifier model was embedded in the low-power electronic backpack for human presence detection. The calculation of the eight features and the classification by machine learning ran on the wireless backpack in real time.

All computations for obtaining the 2D/1D thermal images, calculating the features, and classifying human presence by machine learning must be able to run on the 32-bit microcontroller with its limited memory and computational resources. For the feature calculation, eight time-domain features were selected to process the 64-pixel 1D thermal image array (Xt) at once on the microcontroller. The eight selected time-domain features are summarized in Table 1. Variance, mean absolute deviation, coefficient of variance, average amplitude change, Wilson amplitude, log coefficient of variation, log difference absolute standard deviation value, and wavelength were extracted directly from the thermopile array sensor data based on the equations in Table 1. These eight features were used directly as the input vector of the machine learning model for human detection. For data collection, the input features were sent wirelessly from the backpack to the computer via the developed UI (Fig. 3, Supplementary Information).

Table 1. Selected eight features for human presence detection. These eight features were embedded in the wireless backpack stimulator to obtain significant feature information from the 1D array of the measured thermal image.

Feature | Equation
Variance (VAR) | $\mathrm{VAR} = \frac{1}{n-1}\sum_{i=1}^{n}(x_i - \mu)^2$, where $\mu = \frac{1}{n}\sum_{i=1}^{n} x_i$ (2)
Mean absolute deviation (MAD) | $\mathrm{MAD} = \frac{1}{n}\sum_{i=1}^{n}\lvert x_i - \mu\rvert$ (3)
Coefficient of variance (CoV) | $\mathrm{CoV} = \frac{1}{\mu}\sqrt{\frac{1}{n-1}\sum_{i=1}^{n}(x_i - \mu)^2}$ (4)
Average amplitude change (AAC) | $\mathrm{AAC} = \frac{1}{n}\sum_{i=1}^{n-1}\lvert x_{i+1} - x_i\rvert$ (5)
Wilson amplitude (WA) | $\mathrm{WA} = \sum_{i=1}^{n-1} f(x_i)$, $f(x_i) = \begin{cases} 1, & \text{if } \lvert x_i - x_{i+1}\rvert \ge 1.25 \\ 0, & \text{otherwise} \end{cases}$ (6)
Log coefficient of variation (LCoV) | $\mathrm{LCoV} = \log(\mathrm{CoV})$ (7)
Log difference absolute standard deviation value (LDASDV) | $\mathrm{LDASDV} = \log\!\left(\sqrt{\frac{1}{n-1}\sum_{i=1}^{n-1}(x_{i+1} - x_i)^2}\right)$ (8)
Wavelength (WL) | $\mathrm{WL} = \sum_{i=2}^{n}\lvert x_i - x_{i-1}\rvert$ (9)
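A host-side sketch of the eight features of Table 1 follows (NumPy; the natural logarithm is assumed for the log-based features, and the function name is illustrative):

```python
import numpy as np

def extract_features(x):
    """Compute the eight time-domain features of Table 1 from a 1D thermal array."""
    x = np.asarray(x, dtype=float)
    n = x.size
    mu = x.mean()
    diff = np.diff(x)                                       # x_{i+1} - x_i

    var = np.sum((x - mu) ** 2) / (n - 1)                   # Eq. (2)
    mad = np.mean(np.abs(x - mu))                           # Eq. (3)
    cov = np.sqrt(var) / mu                                 # Eq. (4)
    aac = np.sum(np.abs(diff)) / n                          # Eq. (5)
    wa = np.sum(np.abs(diff) >= 1.25)                       # Eq. (6), threshold 1.25
    lcov = np.log(cov)                                      # Eq. (7)
    ldasdv = np.log(np.sqrt(np.sum(diff ** 2) / (n - 1)))   # Eq. (8)
    wl = np.sum(np.abs(diff))                               # Eq. (9)

    return np.array([var, mad, cov, aac, wa, lcov, ldasdv, wl])
```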

Before deciding on the most suitable embedded machine learning model for onboard human detection, the eight calculated features were used to train a decision tree (DT), a random forest (RF), K-nearest neighbors (KNN), and a support vector machine (SVM). The four classifiers were implemented for offline human detection in Python, as sketched below. The DT breaks the training feature dataset down into several subsets; the Gini index was chosen as the splitting criterion. For the RF classifier, the forest was built with a small number of ten decision trees; the number of estimators was kept at around ten due to the limited computational resources of the 32-bit microcontroller. KNN and SVM have been utilized extensively in classification problems. KNN is fast to train; in this study, it was applied with the Minkowski distance. An SVM with a linear kernel was implemented for the offline human detection classification due to its fast computation.
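The following is a minimal sketch of this offline comparison with scikit-learn; the synthetic X and y stand in for the collected (12,062 × 8) feature dataset and its binary labels:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Placeholder data; the real features were collected via the UI (12,062 x 8).
X = np.random.rand(12062, 8)
y = np.random.randint(0, 2, size=12062)   # 1 = human, 0 = no human

# 70%/30% train/test split, as described in Section 3.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

classifiers = {
    "DT": DecisionTreeClassifier(criterion="gini"),      # Gini split criterion
    "RF": RandomForestClassifier(n_estimators=10),       # ten trees for the MCU budget
    "KNN": KNeighborsClassifier(metric="minkowski"),
    "SVM": SVC(kernel="linear"),
}
for name, clf in classifiers.items():
    clf.fit(X_train, y_train)
    print(name, clf.score(X_test, y_test))
```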

The binary classification performance of the four classifiers was evaluated using accuracy, precision, recall, and F1 score. These four metrics were obtained from the testing portion of the collected thermal dataset and were computed from the true positives (TP), true negatives (TN), false positives (FP), and false negatives (FN), as expressed in Equations (10) to (13), respectively. The classifier with the highest performance was used as the embedded human detector running on the low-power wireless electronic backpack.

$\text{Accuracy} = \frac{\mathrm{TP} + \mathrm{TN}}{\mathrm{TP} + \mathrm{TN} + \mathrm{FP} + \mathrm{FN}}$ (10)
$\text{Precision} = \frac{\mathrm{TP}}{\mathrm{TP} + \mathrm{FP}}$ (11)
$\text{Recall} = \frac{\mathrm{TP}}{\mathrm{TP} + \mathrm{FN}}$ (12)
$\text{F1 score} = \frac{2(\text{Recall} \times \text{Precision})}{\text{Recall} + \text{Precision}}$ (13)
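A small sketch of Equations (10)-(13) computed from a confusion matrix; the y_true/y_pred vectors are illustrative placeholders for the 30% test split:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Illustrative labels and predictions; in the study these came from the test split.
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])
y_pred = np.array([1, 0, 0, 1, 0, 1, 1, 0])

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
accuracy = (tp + tn) / (tp + tn + fp + fn)              # Eq. (10)
precision = tp / (tp + fp)                              # Eq. (11)
recall = tp / (tp + fn)                                 # Eq. (12)
f1 = 2 * recall * precision / (recall + precision)      # Eq. (13)
print(accuracy, precision, recall, f1)
```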

2.3. Obstacle avoidance

Cockroaches can avoid obstacles using their antennae as tactile sensors without commands from a remote operator. This ability leads the cockroach to follow obstacle walls. However, cockroaches tend to approach and stay in narrow paths, dark areas, or corner areas for some time. Based on our previous study [2], cyborg cockroaches ignored some stimulation commands given to the antenna for turning motion if their head touched the obstacle wall. Therefore, incorporating onboard obstacle avoidance into the cyborg cockroach is crucial for utilizing this hybrid robot in SAR missions. In this study, the cockroaches' response when placed in front of a sharp corner was observed in a simulated experimental test with multiple corners and obstacles, as shown in Fig. 5. The cockroaches were placed on a green circle in front of a corner with an angle of 55°. The distance between the edge of the green circle (diameter = 12.5 cm) and the corner tip is 11 cm. Seven cockroaches were placed at the center of the green circle facing the center of the corner, over 100 trials, to observe their response in front of the sharp corner (N = 7 cockroaches, n = 100 trials).

Fig. 5. Experimental test for obstacle avoidance. Left: The cyborg cockroach was placed in the middle of the green circle area, about 18 cm from the corner tip. Right: The experimental arena consisted of six corners, nine large obstacles/walls, and five tall square blocks. The cyborg cockroach was navigated to escape from the sharp corners and avoid obstacles without stopping or being trapped.

To avoid obstacles, especially at sharp corners, a small, low-power Time-of-Flight (ToF) laser-ranging sensor was applied for onboard automatic obstacle avoidance. For better resolution, the sensor range was set with a minimum distance of 2 cm and a maximum distance of 120 cm. The measured distance (dm) was processed with a low-pass filter embedded in the microcontroller. For obstacle avoidance, if the measured distance fell within the threshold range (ds) of 3 cm–17 cm, the system automatically stimulated one of the antennae, leading the cockroach to turn left/right. If the obstacle distance was greater than 17 cm or there was no obstacle in front of the cyborg (a reading of 1.2 m, the sensor's maximum), the system stopped stimulating the antenna. This distance threshold range (ds) was chosen because the cockroach sometimes ignored repeated stimulation of one of its antennae; within this range, the cockroach has enough distance to avoid and escape from the sharp corner without touching or hitting the corner wall.
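A minimal sketch of this decision rule follows; a first-order exponential low-pass filter is assumed for dm, and ALPHA and the function names are hypothetical, not the authors' firmware:

```python
ALPHA = 0.2                       # low-pass filter coefficient (assumed)
D_MIN_CM, D_MAX_CM = 3.0, 17.0    # stimulation window (ds) from the text
D_NO_OBSTACLE_CM = 120.0          # sensor maximum: no obstacle in front

def control_step(d_raw_cm, d_filtered_cm):
    """One control step: filter the ToF reading, then decide antenna stimulation."""
    d_filtered_cm = ALPHA * d_raw_cm + (1.0 - ALPHA) * d_filtered_cm
    stimulate_antenna = D_MIN_CM <= d_filtered_cm <= D_MAX_CM  # turn left/right
    return d_filtered_cm, stimulate_antenna
```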

Based on our previous study [2], we found that stimulating the antennae alone was sufficient for steering the cockroach but sometimes insufficient for keeping it in motion. Automatic stimulation of the cerci was therefore applied in a feedback loop based on the IMU reading to prevent stopping, as presented in Fig. 6. The longitudinal acceleration (ax) obtained from the IMU sensor was used to detect the longitudinal movement of the cockroach. This longitudinal acceleration is noisy and difficult to use directly to distinguish between stopping and walking [1]. In this study, a simple parameter (adiffsum), expressed in Equation (14), was implemented to activate automatic stimulation of the cerci. Based on the experiments, the threshold value (athreshold) was set to 0.1 m/s² for the automatic cerci stimulation feedback that prevents the cyborg cockroach from stopping.

$a_{\mathrm{diffsum}} = \sum_{i=0}^{9} \lvert a_x(i+1) - a_x(i) \rvert$ (14)
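A sketch of the anti-stopping feedback of Equation (14); the sliding-window handling is an assumption consistent with the ten differences in the sum:

```python
from collections import deque

A_THRESHOLD = 0.1            # m/s^2, threshold value from the text
window = deque(maxlen=11)    # a_x(0) ... a_x(10) -> ten consecutive differences

def cerci_needed(ax_sample):
    """Append a new longitudinal-acceleration sample and check Equation (14)."""
    window.append(ax_sample)
    if len(window) < window.maxlen:
        return False          # not enough samples yet
    a_diffsum = sum(abs(window[i + 1] - window[i]) for i in range(len(window) - 1))
    return a_diffsum < A_THRESHOLD   # below threshold -> stimulate the cerci
```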

Fig. 6. Proposed feedback loop for the obstacle avoidance system. The obstacle avoidance system employs the measured obstacle distance from the ToF sensor. If the measured obstacle distance (dm) falls within the threshold range (ds) of 3 cm–17 cm, the cockroach avoids the obstacle by turning left/right until there is no close obstacle in front of it. To prevent the cockroach from stopping, automatic stimuli on the cerci are applied using measured longitudinal acceleration feedback (ax) from the IMU sensor. With this control strategy, the cyborg cockroach keeps walking, avoids obstacles, and escapes from sharp corners.

3. Results and discussion

For the data collection for human detection with the four classifiers (RF, DT, SVM, KNN), the eight features were calculated by the microcontroller from the thermal image and sent to the remote PC wirelessly. The feature datasets were collected in the experimental setup shown in Fig. 5. A person sat in front of the wireless backpack at distances from 25 cm to 1 m for the human class, while the environment or "no human" class was acquired around the experimental arena. The temperature during data collection was kept between 24 °C and 26.5 °C. The feature dataset (12,062 × 8) was used for offline machine learning human detection and split into 70% for training and 30% for testing.

The four classifiers were trained using the hyperparameters presented in Section 2.2, and their testing performance was calculated on the 30% test split. The confusion matrix for each classifier is presented in Fig. 7. It shows that the accuracy for the "no human" class is always higher than that for the human class. We believe this is caused by noise when the thermal image sensor measures a human: the measured human thermal signature decreased as the distance between the human and the thermopile array sensor increased. The classifier performance results are summarized in Table 2. All classifiers achieved accuracy, precision, recall, and F1 score above 90%, except the linear SVM. Based on these results, the RF classifier was selected for embedded human detection with machine learning running on the wireless backpack.

Fig. 7. Confusion matrices for human recognition. In the offline classification test, the RF model classified the binary classes with the highest accuracy (96.79% for the no-human class and 95.16% for the human class). The test was calculated using 30% of the dataset, and the offline classification results were computed on the PC side. The last column gives the precision for each label, and the last row gives the recall. The diagonal cells in each confusion matrix indicate correctly identified labels, and the bottom-right cell reports the overall accuracy and the total number of supports for each classifier.

Table 2. Performance of the classifiers. RF generated the highest performance; therefore, it was chosen as the embedded classifier model on the wireless backpack.

Classifiers RF DT SVM KNN
Accuracy 0.9615 0.9473 0.8697 0.9234
Precision 0.9614 0.9474 0.8779 0.9265
Recall 0.9615 0.9473 0.8697 0.9234
F1-score 0.9615 0.9473 0.8708 0.9239

Not all machine learning models can be embedded in the microcontroller due to its limited resources, especially RAM and flash memory. The MicroML library was applied to port the RF classifier to the 32-bit microcontroller; this library is aimed at running embedded machine-learning algorithms even on 8-bit microcontrollers [42]. Due to the limited memory on the microcontroller, the RF classifier was implemented with ten trees. Using MicroML, the RF classifier was successfully ported to plain C and embedded in the microcontroller with a size of 1.65 KB. This embedded RF ran smoothly on the proposed backpack with 32 KB of SRAM and 256 KB of flash memory. The average time required from feature extraction to the random forest output for onboard human detection was 7.5 ms. The overall embedded program (sensor reading, feature extraction, embedded machine learning, wireless data transmission, and cyborg cockroach stimulation) occupied 188,284 bytes (71%) of the 262,144 bytes of program storage/flash memory. The onboard RF classifier was evaluated in the experimental test at a temperature of 24 °C–26 °C. A person sat in front of the backpack at distances ranging from 25 ± 10 cm to 100 ± 10 cm. The onboard classification results are summarized in Table 3. Based on the results, the closer the distance between the backpack and the human, the higher the performance of the onboard human detection.
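The porting step can be sketched with micromlgen [42]; the placeholder data and file name are illustrative:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from micromlgen import port

# Placeholder training data standing in for the collected feature dataset.
X = np.random.rand(1000, 8)
y = np.random.randint(0, 2, size=1000)

clf = RandomForestClassifier(n_estimators=10).fit(X, y)   # ten trees, as in the text

c_code = port(clf)                 # micromlgen returns the classifier as plain C/C++
with open("rf_classifier.h", "w") as f:
    f.write(c_code)                # header to include in the backpack firmware
```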

Table 3. Performance of the embedded classifier (RF) running on the backpack at various distances. The closest distance between the human and the cyborg insect generates the highest accuracy because the thermal image measurement has only a small bias, while a longer distance decreases the true value of the thermal image measurement.

Distance (cm) 25 ± 10 50 ± 10 75 ± 10 100 ± 10
Accuracy 0.9250 0.8875 0.8250 0.7000
Precision 0.9261 0.8897 0.8385 0.7198
Recall 0.9250 0.8875 0.8250 0.7000
F1-score 0.9250 0.8873 0.8232 0.6931

For the real-time onboard human detection test, the developed user interface was used to operate the cyborg cockroach manually. In manual operation, an operator can steer the cockroach through the user interface (UI): pressing the left or right arrow key stimulates the right or left antenna, respectively, to turn the cyborg left or right, and pressing the up arrow key stimulates the cerci for forward motion. The cyborg cockroach was placed in the green circle area facing the corner, and the operator was tasked with steering it to avoid the sharp corner and escape from the experimental arena. A human sat in front of the arena. A video demonstration of the UI for manual operation and real-time onboard human detection is available at this link: https://bit.ly/3OiO2Rm. The operator successfully steered the cockroach to escape the corner and exit the experimental area. When the cockroach faced the human sitting in front of it, the intelligent backpack successfully detected the human presence, as shown in Fig. 8(a) and (b). The manual operation and the stimulation commands can be seen in Fig. 8(c). Stimulation of the antenna was provided to steer the cockroach, while cerci stimulation was given to make it move forward.
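A sketch of the key-to-stimulus mapping described above; the command names and send function are hypothetical placeholders, not the authors' protocol:

```python
# Left arrow stimulates the right antenna (turn left), and vice versa;
# the up arrow stimulates the cerci for forward motion.
KEY_TO_STIMULUS = {
    "left": "STIM_RIGHT_ANTENNA",
    "right": "STIM_LEFT_ANTENNA",
    "up": "STIM_CERCI",
}

def on_key_press(key_name, radio):
    """Translate a pressed arrow key into a wireless stimulation command."""
    command = KEY_TO_STIMULUS.get(key_name)
    if command is not None:
        radio.send(command)   # transmit via the nRF24L01 link (placeholder API)
```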

Fig. 8. Embedded human recognition (N = 1 cockroach, n = 1 trial). (a) Screenshot of manual operation using the developed UI. (b) Screenshot of the moment the cyborg cockroach recognized the human presence in front of it. (c) Stimulation input for the antennae and cerci vs. IMU and ToF sensor outputs. An operator could easily operate and navigate the cyborg cockroach through the UI by pressing specific keys on the keyboard. When the human was in front of the cyborg cockroach, it recognized the human presence and sent the recognition result to the UI.

To observe the cockroach response in front of a sharp corner, seven cockroaches were placed at the center of the green circle facing the center of the corner (N = 7 cockroaches, n = 100 trials). The cockroaches would stop and stay still for an uncertain time; in this observation, a cockroach that stopped for more than 3 min was considered to be in a stopped or motionless state. We observed that a cockroach could stay in the sharp corner area for around 9 h. Based on the observation results, the cockroach responses can be classified into four main types: (a) stop in the green area, (b) stop at the wall, (c) stop in the corner, and (d) escape from the corner or experimental area, as shown in Fig. 9(a)–(d).

Fig. 9. Common cockroach responses in front of the sharp corner (N = 7 cockroaches, n = 100 trials). (a) Stop in the green/starting area. (b) Stop at the wall. (c) Stop in the corner for an uncertain time. (d) Escape from the experimental arena. (e) Percentages of cockroach responses. (f) Escape time (time needed for a cockroach to exit the experimental arena from the green area) and corner time (stopping time in the corner). The responses of natural cockroaches are dominated by wall-following behavior. After being placed in front of the sharp corner, a cockroach would approach the wall and follow along it; once in the corner, it would stay still for a moment or an uncertain time. Based on our observation, a cockroach could stay in the corner for around 9 h.

Fig. 9(e) depicts the percentages of cockroach responses over the one hundred trials. In 46% of trials, cockroaches approached the sharp corner when placed in front of it and remained stationary for an indefinite amount of time (more than 3 min); most cockroaches responded by approaching and stopping at the nearest corner. In 11% of trials, the cockroach would rotate, walk a very short distance, and stay in the green circle area (response (a)). On some occasions (9% of trials), after being placed in the green area, cockroaches approached the wall and remained still for an indefinite time (more than 3 min) (response (b)). Responses (a) and (b) occurred when the cockroaches were inactive.

The final response was that the cockroaches walked and escaped from the experimental area without stopping. Most of the responses in Fig. 9(b)–(d) reflect wall-following behavior: if a wall is near the cockroaches, they will likely approach it and walk along the obstacle wall, as found in a previous study [1]. The summarized cockroach responses can be seen in this video: https://bit.ly/46EaYRN. In the escape response, when cockroaches exhibited wall-following behavior, they stopped at the corner for a certain amount of time and then continued walking, either following the wall or performing a random walk if they were not near a wall. The cockroaches' escape time and stop time at the corner are shown in Fig. 9(f). Stopping behavior, especially at corners, dominated the cockroach responses in the simulated unstructured experimental area. This natural behavior will hinder the utilization of cyborg cockroaches in SAR missions because many sharp corners are found under rubble after earthquakes and other natural disasters. Therefore, augmenting the cyborg cockroach with onboard obstacle avoidance plays a crucial role in SAR missions by preventing the cockroach from being trapped in a sharp corner or dark area.

A demonstration of the automatic obstacle avoidance applied to the cyborg can be seen in this video: https://bit.ly/3XFUGUR. The cyborg cockroach could avoid an obstacle suddenly placed in front of it. However, the cyborg cockroach could stop walking even though stimulation was given to trigger walking motion; no movement occurred when the cockroach was not eager to move. Therefore, automatic stimulation was added to prevent sudden stopping or a motionless state.

The cyborg cockroach was placed at the center of the green circle facing the center of the sharp corner with an angle of 55°. The test was conducted for 37 s, and the results of the obstacle avoidance and stop-walking prevention are presented in Fig. 10. This test implemented obstacle avoidance by automatically stimulating the right antenna. The cockroach trajectory is depicted in blue, while the antenna stimulation from the obstacle avoidance is shown in red. Automatic stimulation of the right antenna was activated with a waveform signal with a frequency of 1 Hz and a duty cycle of 50%. As shown in Fig. 10(a), the cockroach successfully avoided and escaped from the cornered area by turning left. The cockroach started walking randomly once automatic stimulation ceased because there was no obstacle in front of it. To prevent the cockroach from stopping, stimulation was provided to the cerci through the feedback loop whenever the measured longitudinal acceleration fell below the threshold value. The attitude angles of the cockroach measured by the IMU and the obstacle distance obtained from the ToF sensor are presented in Fig. 10(b) and (c), respectively. The figures show that when the distance between the obstacle and the cyborg was within the threshold range, the cyborg avoided the obstacle by turning left, and the cockroach kept walking without stopping thanks to the automatic cerci stimulation.

Fig. 10. Cockroach response for automatic obstacle avoidance employing fixed right-antenna stimulation (N = 1 cockroach, n = 1 trial). (a) Cyborg cockroach trajectory. (b) Stimulation vs. attitude angles. (c) Stimulation vs. measured obstacle distance and linear acceleration (adiffsum); the light green band shows the threshold limits for distance and linear acceleration. The red segments mark antenna stimulation from the onboard obstacle avoidance, while the blue segments depict the cockroach trajectory without antenna stimulation. The cockroach successfully escaped from the sharp corner and avoided obstacles without stopping or becoming trapped in the experimental arena.

This simple test shows that the onboard automatic obstacle avoidance and automatic cerci stimulation successfully steered the cockroach to escape the corner area/wall and walk continuously without stopping. The proposed feedback loop based on the IMU and ToF sensors reduces the wall-following behavior of the cockroach. By decreasing the wall-following behavior, the explored area becomes larger, which suits unstructured and unknown areas where many cornered or dark areas make natural cockroaches stop or become trapped for an uncertain time.

The proposed automatic obstacle avoidance was tested using seven cyborg cockroaches and repeated 34 times without changing the initial position. The test was divided into two fixed-stimulation groups: 1) escaping from the sharp corner/unstructured area using fixed stimulation of the right antenna (N = 7 cockroaches, n = 17 trials); 2) escaping from the unstructured simulated area using fixed stimulation of the left antenna (N = 7 cockroaches, n = 17 trials). The test was performed for 150 s at room temperature in the laboratory. The cockroaches were placed at the center of the green circle facing the corner. The cockroaches' response trajectories are presented in Fig. 11.

Fig. 11. Automatic obstacle avoidance results using fixed stimulation of the right and left antennae (N = 7 cockroaches, n = 34 trials). (a) Response of the cyborg to fixed right-antenna stimulation (N = 7 cockroaches, n = 17 trials). (b) Response of the cyborg to fixed left-antenna stimulation (N = 7 cockroaches, n = 17 trials). In both tests, the cyborg cockroaches successfully escaped from the sharp corner area and avoided the obstacles without hitting them or stopping near the obstacle walls. The cyborg cockroach could detect human presence after exiting the experimental arena. The green area indicates the starting area; the blue and red lines are the cyborg cockroach trajectory and the given antenna stimulation, respectively.

For the fixed stimulation of the right antenna, as shown in Fig. 11(a), the cockroach avoided and escaped the sharp corners by turning left. When there was no obstacle in front of the cockroach, it walked freely (random walk), and stimulation of the cerci was given to trigger walking if the cockroach stopped. All cyborg cockroaches were successfully navigated to avoid the sharp angle/corner in front of them. The proposed system navigated the cyborg cockroaches to avoid the obstacles/sharp corners and leave the experimental area with a success rate of 82.3% (in 3 of 17 trials, the cyborg avoided the sharp corner and obstacles but could not leave the experimental area within 150 s).

With fixed feedback stimulation of the left antenna, the cockroach turned right in front of the sharp corner/initial position. After turning right and reaching an obstacle/wall distance of more than 17 cm, feedback stimulation was provided to the cerci to trigger forward walking. The cockroach then approached the wall, and feedback stimulation of the antenna was given to avoid the second corner. The cockroach could avoid the two corners and escape the simulated unstructured experimental arena, as shown in Fig. 11(b). It avoided and escaped from the nearest sharp corner by turning right and walking away; at the second obstacle, it turned left instead of right to avoid the obstacle wall and walked away; it then avoided the last obstacle and successfully escaped from the experimental area. By applying the proposed feedback loop to the left antenna and cerci, the cockroach avoided the obstacles and kept walking without stopping or being trapped in a corner. Based on the test results for fixed stimulation of the left antenna, the cyborg cockroaches successfully avoided all obstacles/corners and left the experimental area within 150 s with a success rate of 94.1% (in 1 of 17 trials, the cyborg cockroach avoided the sharp corner and obstacles but could not leave the experimental area within 150 s). All results for both tests (fixed stimulation of the right and left antennae; N = 7 cockroaches, n = 34 trials) can be seen in this video: https://bit.ly/3FoA37y.

For the final test, fixed stimulation of the left antenna was implemented together with automatic obstacle avoidance and onboard human detection. The embedded machine learning algorithm running on the backpack in real time successfully detected human presence at a short distance (less than 110 cm). The test results demonstrate that the proposed onboard obstacle avoidance incorporating embedded human recognition is a potential approach for utilizing cyborg insects in SAR missions without external cameras/computer vision. The intelligent backpack successfully recognized human presence in real time. A video demonstration of onboard human detection and obstacle avoidance is available online (https://bit.ly/3OkkW44).

Based on the final test of automatic obstacle avoidance and onboard human detection, the real-time embedded machine learning and obstacle avoidance algorithms running on the backpack effectively navigated the cyborg insect to avoid obstacles and detect human presence at a short distance at a controlled temperature (24 °C–26 °C). However, the applicability of the proposed system has not yet been tested in real cases under more complex environments or at low temperatures (e.g., in winter).

4. Conclusions

Cockroaches approach sharp corners and stay there for an uncertain time, and most cockroach movement in the simulated unstructured area was wall-following behavior. To overcome this issue, we proposed onboard obstacle avoidance to prevent the cockroach from being trapped in corner areas, especially sharp corners. By implementing the feedback loop from the ToF laser-ranging distance and IMU sensors, the cyborg insect could avoid the sharp corner area without stopping or becoming trapped. We demonstrated that the cyborg insect successfully avoided the obstacles/walls, escaped from the sharp corner areas, and detected human presence at a short distance. The backpack was augmented with embedded machine learning that could run on the 32-bit microcontroller to detect human presence in real time. The chosen RF classifier with ten trees was applied as the embedded human detection classifier. The highest accuracy of the embedded human detection was 92.5% at 25 ± 10 cm, while the lowest accuracy was 70% at a distance of 100 ± 10 cm. These capabilities of intelligent human recognition augmented with onboard obstacle avoidance can potentially be implemented in SAR missions under rubble and in unstructured environments.

In future research, two ToF laser-ranging sensors will be mounted on the left and right sides of the wireless backpack. Adding these sensors will enhance the obstacle avoidance algorithm's ability to determine a suitable direction and avoid obstacles efficiently. A higher-resolution thermopile array sensor will be applied to improve the accuracy of human detection, especially at longer distances. Thermal data from other heat-emitting objects will also be collected to build a more robust human recognition algorithm.

Funding statement

This work was partly supported by JST (Moonshot R&D) grant number JPMJMS223A.

Data availability statement

Data will be made available on request.

Additional information

Images of the custom small-size PCBs for the obstacle distance and thermal image sensors, and of the proposed user interface (UI), are provided in the Supplementary Information.

Ethics declarations

The experimental procedures for cyborg insects in this study were approved by the animal experiment committee at Osaka University (approval number 2023-5-0).

CRediT authorship contribution statement

Mochammad Ariyanto: Writing – review & editing, Writing – original draft, Visualization, Software, Methodology, Formal analysis, Data curation, Conceptualization. Chowdhury Mohammad Masum Refat: Writing – review & editing, Writing – original draft, Methodology, Investigation, Data curation. Kotaro Yamamoto: Writing – review & editing, Writing – original draft, Methodology, Investigation. Keisuke Morishima: Writing – review & editing, Writing – original draft, Supervision, Project administration, Methodology, Funding acquisition, Conceptualization.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgment

We would like to thank Mr. Kazuyoshi Hirao for his assistance in designing the custom PCBs for the utilized sensors. The authors thank Earth Corporation (Ritsu Ariyoshi) for providing the Madagascar hissing cockroaches used in this study. This work was partly supported by JST (Moonshot R&D) grant number JPMJMS223A. The first author would like to thank Diponegoro University and the Ministry of Education and Culture of the Republic of Indonesia for their support through the Doctoral Degree Scholarship Program (No. 466/UN7.P/HK/2020).

Appendix A. Supplementary data

Supplementary data to this article can be found online at https://doi.org/10.1016/j.heliyon.2024.e26987.

The following is the supplementary data to this article:

Multimedia component 1
mmc1.docx (1.5MB, docx)

References

1. Ariyanto M., Refat C.M.M., Hirao K., Morishima K. Movement optimization for a cyborg cockroach in a bounded space incorporating machine learning. Cyborg Bionic Syst. 2023;4:12. doi: 10.34133/cbsystems.0012.
2. Ariyanto M., Refat C.M.M., Zheng X., Hirao K., Wang Y., Morishima K. Teleoperated locomotion for biobot between Japan and Bangladesh. Computation. 2022;10(10):179. doi: 10.3390/computation10100179.
3. Dirafzoon A., Latif T., Gong F., Sichitiu M., Bozkurt A., Lobaton E. Biobotic motion and behavior analysis in response to directional neurostimulation. In: 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP); 2017. pp. 2457–2461.
4. Dutta A. Cyborg insects could someday save your life. IEEE Pulse. 2019;10(3):24–25. doi: 10.1109/MPULS.2019.2911818.
5. Kakei Y., et al. Integration of body-mounted ultrasoft organic solar cell on cyborg insects with intact mobility. npj Flex. Electron. 2022;6(1). doi: 10.1038/s41528-022-00207-2.
6. Latif T., Whitmire E., Novak T., Bozkurt A. Sound localization sensors for search and rescue biobots. IEEE Sens. J. 2016;16(10):3444–3453. doi: 10.1109/JSEN.2015.2477443.
7. Latif T., Bozkurt A. Roach biobots: toward reliability and optimization of control. IEEE Pulse. 2017;8(5):27–30. doi: 10.1109/MPUL.2017.2729413.
8. Rasakatla S., Tenma W., Suzuki T., Indurkhya B., Mizuuchi I. CameraRoach: a WiFi- and camera-enabled cyborg cockroach for search and rescue. J. Robot. Mechatron. 2022;34(1):149–158. doi: 10.20965/jrm.2022.p0149.
9. Tran-Ngoc P.T., et al. Intelligent insect-computer hybrid robot: installing innate obstacle negotiation and onboard human detection onto cyborg insect. Adv. Intell. Syst. 2023;5(5):2200319. doi: 10.1002/aisy.202200319.
10. Whitmire E., Latif T., Bozkurt A. Kinect-based system for automated control of terrestrial insect biobots. In: 2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC); 2013. pp. 1470–1473.
11. Li C., Wöhrl T., Lam H.K., Full R.J. Cockroaches use diverse strategies to self-right on the ground. J. Exp. Biol. 2019;222(15):jeb186080. doi: 10.1242/jeb.186080.
12. Daltorio K.A., et al. A model of exploration and goal-searching in the cockroach, Blaberus discoidalis. Adapt. Behav. 2013;21(5):404–420. doi: 10.1177/1059712313491615.
13. Bell W.J., Roth L.M., Nalepa C.A. Cockroaches: Ecology, Behavior, and Natural History. Baltimore: Johns Hopkins University Press; 2007.
14. Clark A.J., Triblehorn J.D. Mechanical properties of the cuticles of three cockroach species that differ in their wind-evoked escape behavior. PeerJ. 2014;2:e501. doi: 10.7717/peerj.501.
15. Crall J.D., et al. Social context modulates idiosyncrasy of behaviour in the gregarious cockroach Blaberus discoidalis. Anim. Behav. 2016;111:297–305. doi: 10.1016/j.anbehav.2015.10.032.
16. Madagascar Hissing Cockroaches: Information and Care. Oklahoma State University; 2017. https://extension.okstate.edu/fact-sheets/madagascar-hissing-cockroaches-information-and-care.html
17. Vo-Doan T.T., Dung V.T., Sato H. A cyborg insect reveals a function of a muscle in free flight. Cyborg Bionic Syst. 2022. https://spj.science.org/doi/10.34133/2022/9780504
18. Dutta A. Cyborgs: neuromuscular control of insects. In: 2019 9th International IEEE/EMBS Conference on Neural Engineering (NER); 2019. pp. 682–685.
19. Li Y., Hu Y., Umezu S., Sato H. Flying cyborg: a new approach for the study of Coleoptera's flight pitching. In: 2019 IEEE International Conference on Cyborg and Bionic Systems (CBS); 2019. pp. 159–164.
20. Nguyen H.D., Sato H., Vo-Doan T.T. Burst stimulation for enhanced locomotion control of terrestrial cyborg insects. In: 2023 IEEE International Conference on Robotics and Automation (ICRA); 2023. pp. 1170–1176.
21. Fu F., Li Y., Wang H., Li B., Sato H. The function of pitching in beetle's flight revealed by insect-wearable backpack. Biosens. Bioelectron. 2022;198:113818. doi: 10.1016/j.bios.2021.113818.
22. Dirafzoon A., Bozkurt A., Lobaton E. A framework for mapping with biobotic insect networks: from local to global maps. Robot. Auton. Syst. 2017;88:79–96. doi: 10.1016/j.robot.2016.11.004.
23. Cole J., Mohammadzadeh F., Bollinger C., Latif T., Bozkurt A., Lobaton E. A study on motion mode identification for cyborg roaches. In: 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP); 2017. pp. 2652–2656.
24. Refat C.M.M., Ariyanto M., Hirao K., Morishima K. Experimental user interface and control system of teleoperated biobots explorer between Bangladesh and Japan. In: 2022 IEEE International Conference on Mechatronics and Automation (ICMA); 2022. pp. 520–525.
25. Tsukuda Y., Tagami D., Sadasue M., Suzuki S., Lu J.-L., Ochiai Y. Calmbots: exploring possibilities of multiple insects with on-hand devices and flexible controls as creation interfaces. In: Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems (CHI EA '22). New York, NY, USA: Association for Computing Machinery; 2022. pp. 1–13.
26. Rasakatla S., Suzuki T., Tenma W., Mizuuchi I., Indurkhya B. CameraRoach: various electronic backpacks for search and rescue. In: 2021 IEEE International Conference on Robotics and Biomimetics (ROBIO); 2021. pp. 1300–1303.
27. Iyer V., Najafi A., James J., Fuller S., Gollakota S. Wireless steerable vision for live insects and insect-scale robots. Sci. Robot. 2020;5(44):eabb0839. doi: 10.1126/scirobotics.abb0839.
28. Whitmire E., Latif T., Bozkurt A. Acoustic sensors for biobotic search and rescue. In: 2014 IEEE SENSORS; 2014. pp. 2195–2198.
29. Chen Z., Wang Y., Liu H. Unobtrusive sensor-based occupancy facing direction detection and tracking using advanced machine learning algorithms. IEEE Sens. J. 2018;18(15):6360–6368. doi: 10.1109/JSEN.2018.2844252.
30. Shubha B., Shastrimath V.V.D. Real-time occupancy detection system using low-resolution thermopile array sensor for indoor environment. IEEE Access. 2022;10:130981–130995. doi: 10.1109/ACCESS.2022.3229895.
31. Chidurala V., Li X. Occupancy estimation using thermal imaging sensors and machine learning algorithms. IEEE Sens. J. 2021;21(6):8627–8638. doi: 10.1109/JSEN.2021.3049311.
32. Gochoo M., et al. Novel IoT-based privacy-preserving yoga posture recognition system using low-resolution infrared sensors and deep learning. IEEE Internet Things J. 2019;6(4):7192–7200. doi: 10.1109/JIOT.2019.2915095.
33. Faulkner N., Konings D., Alam F., Legg M., Demidenko S. Machine learning techniques for device-free localization using low-resolution thermopiles. IEEE Internet Things J. 2022;9(19):18681–18694. doi: 10.1109/JIOT.2022.3161646.
34. Tan T.-H., Kuo T.-Y., Liu H. Intelligent lecturer tracking and capturing system based on face detection and wireless sensing technology. Sensors. 2019;19(19):4193. doi: 10.3390/s19194193.
35. Gochoo M., Tan T.-H., Batjargal T., Seredin O., Huang S.-C. Device-free non-privacy invasive indoor human posture recognition using low-resolution infrared sensor-based wireless sensor networks and DCNN. In: 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC); 2018. pp. 2311–2316.
36. Yin C., Chen J., Miao X., Jiang H., Chen D. Device-free human activity recognition with low-resolution infrared array sensor using long short-term memory neural network. Sensors. 2021;21(10):3551. doi: 10.3390/s21103551.
37. Rezaei A., Stevens M.C., Argha A., Mascheroni A., Puiatti A., Lovell N.H. An unobtrusive human activity recognition system using low resolution thermal sensors, machine and deep learning. IEEE Trans. Biomed. Eng. 2023;70(1):115–124. doi: 10.1109/TBME.2022.3186313.
38. Muthukumar K.A., Bouazizi M., Ohtsuki T. A novel hybrid deep learning model for activity detection using wide-angle low-resolution infrared array sensor. IEEE Access. 2021;9:82563–82576. doi: 10.1109/ACCESS.2021.3084926.
39. Mashiyama S., Hong J., Ohtsuki T. Activity recognition using low resolution infrared array sensor. In: 2015 IEEE International Conference on Communications (ICC); 2015. pp. 495–500.
40. Phinyomark A., Phukpattaranont P., Limsakul C. Feature reduction and selection for EMG signal classification. Expert Syst. Appl. 2012;39(8):7420–7431. doi: 10.1016/j.eswa.2012.01.102.
41. Arozi M., Caesarendra W., Ariyanto M., Munadi M., Setiawan J.D., Glowacz A. Pattern recognition of single-channel sEMG signal using PCA and ANN method to classify nine hand movements. Symmetry. 2020;12(4):541. doi: 10.3390/sym12040541.
42. eloquentarduino. Introducing MicroML. 2023. https://github.com/eloquentarduino/micromlgen (accessed Jul. 18, 2023).


