Significance
Artificial landmarks are widely used for autonomous navigation of robots and driverless cars. Landmarks can be simple traffic signs, radar retroreflectors, infrared lights, or barcodes, depending on the sensors of the autonomous device. However, for in-air sonar, one of the most widely applied sensor types, artificial landmarks have never been used. Yet, in nature we find perfect examples of sonar landmarks. Bat-pollinated flowers guide and attract bats with acoustically conspicuous floral reflectors that are detectable even in cluttered surroundings. We show how landmarks inspired by these floral forms can be used as guiding beacons and even as local sources of information. Bioinspired landmarks can be very efficient tools, opening the door to new applications of sonar sensors and to safer autonomous navigation.
Keywords: bioinspired sonar, robotics, sonar landmarks, echolocation, autonomous navigation
Abstract
Sonar sensors are universally applied in autonomous vehicles such as robots and driverless cars as they are inexpensive, energy-efficient, and provide accurate range measurements; however, they have some limitations. Their measurements can lead to ambiguous estimates and echo clutter can hamper target detection. In nature, echolocating bats experience similar problems when searching for food, especially if their food source is close to vegetation, as is the case for gleaning bats and nectar-feeding bats. However, nature has come up with solutions to overcome clutter problems and acoustically guide bats. Several bat-pollinated plants have evolved specially shaped floral parts that act as sonar reflectors, making the plants acoustically conspicuous. Here we show that artificial sonar beacons inspired by floral shapes streamline the navigation efficacy of sonar-guided robot systems. We developed floral-inspired reflector forms and demonstrate their functionality in 2 proof-of-principle experiments. First we show that the reflectors are easily recognized among dense clutter, and second we show that it is possible to discern different reflector shapes and use this identification to guide a robot through an unfamiliar environment. Bioinspired sonar reflectors could have a wide range of applications that could significantly advance sonar-guided systems.
Reliable and recognizable landmarks are a crucial prerequisite for autonomous navigation as they ensure stable and accurate positioning (1–3). A landmark can be either a prominent structure in the environment that can easily be detected and recognized or an artificial landmark such as a signal or a sign installed as a local source of information. Such artificial landmarks obviously must be suited to the respective modalities of the exteroceptive sensors on the autonomous vehicle. For visual sensors, objects can be recognized using salient cues (e.g., simple signs like colored cylinders; refs. 4 and 5). Using computer vision algorithms based on convolutional neural networks, such sensors can even read traffic signs made for humans (6). Active infrared beacons are widely used for floor-cleaning robots (7). For laser sensors, retroreflective strips were successfully employed as beacons for underground mining vehicles (8), and active laser diodes were tested as beacons for planetary rovers and inspection vehicles at disaster sites (9). For radar sensors, multiple trihedral reflectors proved useful in autonomous straddle-carrier systems (10, 11), improving their localization efficacy. To aid underwater robot navigation, sonar landmarks (12), information-encoding sonar markers (13, 14), and even commercially available sonar targets exist (15). Surprisingly, airborne sonar systems have not used synthetic markers so far, even though nearly every autonomous vehicle is equipped with sonar sensors and several studies have shown that such sensors are capable of 3-dimensional (3D) localization and classification of objects in complex environments (16–21). Reflector designs used for underwater sonar cannot be transferred to air because they are usually multilayered: Layers of different materials with different acoustic impedances create a recognizable reflection pattern. In air, the large difference between the acoustic impedances of air and solid materials precludes such designs, as virtually all sound waves would be reflected by the first layer. However, nature has found a simple yet efficient solution to the problem of airborne sonar landmarks. In tropical South and Central America, some 400 plant species have adopted a rather rare pollination system: They open and produce nectar at night and lure glossophagine bats for pollination services. In the absence of light, these plants cannot attract their pollinators with conspicuous colors. Instead, they have evolved floral forms that act as sonar signals, reflecting the ultrasound calls of bats in characteristic ways and thereby making the flowers acoustically conspicuous. This has been shown for a range of bell-shaped flowers, such as the calabash tree flower (Crescentia cujete; Fig. 1A), which has waxy petals that reflect a prolonged, high-amplitude echo (22). There are also more sophisticated examples of specialized acoustic beacons, including the guiding petals of Mucuna holtonii (23) (Fig. 1B) or the dish-shaped inflorescence-associated leaves of the Cuban vine Marcgravia evenia (24) (Fig. 1C). The dish-shaped leaves exhibit striking spatially invariant temporal and spectral reflective characteristics, which are generated by an interference mechanism arising from the dish's shape, transforming them into acoustic beacons that attract pollinating bats (24).
Their key feature is their saliency even in highly cluttered surroundings, as demonstrated by behavioral experiments with nectar-feeding bats. These experiments showed that such sonar signs can reduce bats' search time for flowers hidden in clutter by 50% (24). As airborne sonar in man-made environments also has to deal with large amounts of clutter, the features of these natural reflectors can serve as a source of inspiration for the design of artificial sonar landmarks that can be reliably detected among other objects and surfaces. We hypothesized that forms similar to the Marcgravia reflectors can be detected among echo clutter with a technical sonar system and that they can additionally be used as navigational sonar markers for autonomous agents to streamline their navigation efficacy. To accomplish this task, we added functionality to the bioinspired reflectors and used them as an information-encoding source. As we know that bats accurately perceive these spherical forms and are capable of detecting small differences in form (25, 26), we proposed that sets of reflectors with slightly different sizes and noticeable differences in their echoes can be employed to encode different symbols, thereby transmitting useful information from the environment to the sensor system. These size differences are clearly visible in the temporal/spectral echo signature (25) while detectability in clutter is preserved. A sonar sensor that works with a broadband frequency-modulated signal, just like that of a nectar-feeding bat, will be able to extract this metainformation from the environment and can be reliably guided through an unfamiliar environment using only sonar information.
Fig. 1.
Examples of bat-pollinated flowers, natural flower reflectors, and their spectral directional echo patterns. (A) Illustration of Pallas's long-tongued bat (Glossophaga soricina) approaching a flower of the calabash tree (C. cujete). Its waxy petals are highlighted in brighter green. (B) Commissaris's long-tongued bat (Glossophaga commissarisi) inspecting a ripe flower of the vine M. holtonii. Ripe, unvisited flowers of this plant show a concave-shaped petal (vexillum) that acts as an acoustic guide (highlighted in brighter green). (C) Illustration of a Leach's single leaf bat (Monophyllus redmani) visiting an M. evenia inflorescence. This plant has dish-shaped leaves (highlighted in brighter green) above its inflorescences that act as acoustic beacons (illustrations by Matthew Twombly). (D–F) Spectral directional echo patterns of the different flowers and flower reflectors highlighted in the illustrations above. (D) C. cujete flower. (E) Vexillum of M. holtonii. (F) Dish-shaped leaf of M. evenia. A–C: Image courtesy of Matthew Twombly/National Geographic Creative.
Results
We approached the development and application of bioinspired reflectors in 3 steps. First, we analyzed the geometry and acoustic properties of dish-shaped Marcgravia leaves (Fig. 2A and SI Appendix, Fig. S2) and tried to mimic their acoustic features with synthetic reflector dishes. In a second step, we tested whether it was possible to detect these bioinspired reflector dishes among echo clutter using a broadband sonar system. We developed an algorithm using the envelope of the environment's impulse response (IR) derived from a bat-like broadband call. For most experiments we used a linear frequency sweep from 160 kHz to 30 kHz. Calls of nectar-feeding bats are somewhat higher in frequency but have a comparable bandwidth (27–29). A broad bandwidth is not only important for bats to resolve structures in front of vegetation (30) but was also essential for our IR-based reflector recognition (SI Appendix, Fig. S8). Third, we introduced different reflector forms (sizes) and adjusted the algorithm to detect several reflector dishes at the same time. We tested the detection of multiple reflectors with an autonomously controlled linear guide and a mobile robot and steered these systems with commands associated with the different reflector forms. We also developed a specialized sonar array that could quickly acquire sonar information over a large volume using a noncoherent electronic scan.
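The IR envelope used throughout can be obtained by pulse-compressing the recorded echoes with the emitted sweep. The following minimal sketch illustrates this step in Python; the sweep parameters follow the values given in the text and in Materials and Methods, while the sample rate, function names, and the toy echo are illustrative assumptions rather than the authors' implementation (which was written in LabVIEW).

```python
import numpy as np
from scipy.signal import chirp, hilbert

FS = 500_000   # sample rate in Hz (the rate used for the robot recordings)
C = 343.0      # speed of sound in air (m/s)

def make_sweep(f_start=160e3, f_end=30e3, duration=5e-3, fs=FS):
    """Linear downward frequency sweep, similar to a bat-like broadband call."""
    t = np.arange(int(duration * fs)) / fs
    return chirp(t, f0=f_start, f1=f_end, t1=duration, method="linear")

def ir_envelope(recording, sweep):
    """Envelope of the environment's impulse response.

    Cross-correlating the recording with the emitted sweep acts as a matched
    filter (pulse compression); the Hilbert envelope of the result approximates
    the envelope of the impulse response.
    """
    xcorr = np.correlate(recording, sweep, mode="full")[len(sweep) - 1:]
    return np.abs(hilbert(xcorr))

# Toy example: a single echo delayed by 2 ms (about 34 cm of range).
sweep = make_sweep()
recording = np.zeros(int(0.02 * FS))            # 20-ms recording window
delay = int(2e-3 * FS)
recording[delay:delay + len(sweep)] += 0.3 * sweep
envelope = ir_envelope(recording, sweep)
print(f"estimated range: {envelope.argmax() / FS * C / 2:.2f} m")
```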
Fig. 2.
Echo features of a natural reflector and a series of synthetic reflectors. From left to right: photograph of the reflector (the scale bar represents 1 cm), spectral directional echo pattern (see Fig. 1 for the target strength axis), directional pattern of the IRs, and an example of a single IR from a frontal ensonification angle. The gray inlay in the single IRs illustrates the shape of the edge of the reflector. The yellow arrows indicate the peaks originating from the edges, and the orange and red arrows indicate the 2 main peaks originating from the inner concave surface of the reflector. The highlighted parts in the directional IR plots indicate the area in which the 2 main peaks (peak1–2) are clearly detectable. (A) Echo acoustic properties of a natural dish-shaped leaf of M. evenia. (B) Spherical reflector dish with a radius of 35 mm, a depth of 30 mm, and a flat edge. (C) Spherical reflector dish with a radius of 35 mm, a depth of 20 mm, and rounded edges. (D) Spherical reflector dish with a radius of 35 mm, a depth of 25 mm, and tapered edges. The natural reflector and the bioinspired reflector are each highlighted by a gray box. Note that the directional plots are asymmetric because they were measured with a monaural setup.
Design of Bioinspired Sonar Beacons.
Many bat-pollinated flowers are bell-shaped and have waxy petals, features that are thought to increase their detectability and help bats find the flowers. However, as with many other complex objects, the spatiospectral features of the flower's echo signal are quite stochastic (ref. 22 and Fig. 1D). For the acoustic guiding petal of M. holtonii, this is different; these petals show spatially more stable echo features (Fig. 1E), especially around 40°. At this angle, there are high-amplitude full-spectrum reflections that should be very useful in helping a bat to coordinate its approach. However, the most striking and constant echo pattern, which is described as an echo signature, is found for the dish-shaped leaves of M. evenia (Fig. 1F). The leaves have 2 main morphological adaptations that give rise to this characteristic echo pattern. Unlike other leaves of this plant, they have a spherical, concave shape, and the leaf edge is bent backward (Fig. 2A, inlay in the IR plot and SI Appendix, Fig. S2). Both adaptations influence the acoustic properties: The bent leaf edges minimize reflections originating from the edge surface, and the concave shape of the leaves results in an IR that is dominated by 2 steep amplitude peaks (peak1–2; see red and orange arrows in the IR plots of Fig. 2A) with an almost constant time separation independent of the incidence angle (see highlighted areas in the directional IRs of Fig. 2A). Interference of these 2 reflections results in a spectral directional pattern of enhanced frequency bands separated by bands of frequency notches. We searched for a reflector shape that would respond like a dish-shaped Marcgravia leaf and mimic both features: the reduced edge reflection and, most importantly, the viewpoint-independent echo with 2 steep, clear amplitude peaks (peak1–2). The basic geometry of the leaves is a section of a sphere, or a spherical dish. We produced such synthetic spherical dishes using 3D printing and analyzed the spatiospectral features of the reflector-induced echoes in ensonification measurements. We were particularly interested in the spacing between the first and second reflection (peak1–2), as this spacing was a very constant echo feature for Marcgravia leaves as well as for synthetic spherical reflectors, and we wanted to use it to reliably detect reflectors in the environment. At first sight, a full hemisphere seems optimal because it has a broad directionality over which peak1–2 can be detected (Fig. 2B, spectral pattern and highlighted areas in directional IRs; see also SI Appendix, Fig. S6). However, the peaks in its IR are less clear because multiple reflections within the concave surface prolong the echo (Fig. 2B and inlay in SI Appendix, Figs. S5 and S7), which makes it more difficult to detect peak1–2. By contrast, a very shallow dish has a short IR with clearly pronounced peaks, consisting mainly of peak1–2 (single IR of Fig. 2C), but a limited directionality (see directional spectra and highlighted area in the directional IRs of Fig. 2C). We performed a quantitative analysis of reflector-induced echoes for reflector dishes with different depths (SI Appendix, Figs. S6 and S7) and found that the design of an optimal reflector is a trade-off between a clean peak pattern and broad directionality. The depth of an optimal reflector should not be less than 60% of the radius, as below this the angular range in which peak1–2 can be detected is already halved.
On the other hand, the depth should also not exceed 70% of the radius, because then the echo becomes significantly longer and the peak pattern becomes unclear. As we wanted to use this viewpoint-independent peak1–2 pattern as a recognizable echo feature, we used reflectors with a depth between 60 and 70% (maximum 71.4%) of the radius throughout our experiments. These spherical dish reflectors, whose depth lies exactly in the range found for Marcgravia leaves (average 66% of the radius), also resembled the leaves most accurately across spectral, temporal, and directional parameters (compare Fig. 2 A and D and see SI Appendix, Supplementary Information Text and Figs. S1–S7 for more information on the reflector design). To further mimic the Marcgravia dish reflectors, we also rounded off the edges of the synthetic reflectors, which resulted in lower edge peaks (Fig. 2B, IRs, yellow arrows, inlay), comparable to those of Marcgravia leaves. As we wanted to optimize the reflectors, avoid any other interfering amplitude peaks, and generate only the 2 characteristic peaks, we used tapered edges, as they provide almost no reflective surface area (Fig. 2D, IRs, inlay). Note that from the start we chose synthetic reflectors of a larger size than Marcgravia leaves to increase the detection range. Therefore, the synthetic reflectors showed interference at longer wavelengths, more spectral bands visible in the directional spectra, and a larger spacing between the peaks in the IRs. Smaller synthetic reflectors would have the same number of spectral bands and the same peak spacing as the original leaves but a lower detection range than the reflectors we used.
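The peak1–2 spacing that makes each reflector size recognizable can be predicted from simple geometry: assuming, as for the hollow hemispheres in ref. 25, that the first main peak stems from the axial reflection off the concave surface and the second from the double-bounce (retroreflecting) path, the path difference is 2(√2 − 1)r. The sketch below is our own back-of-the-envelope check of this relation against the spacings reported in the text, not code from the study.

```python
import math

C = 343.0  # speed of sound in air (m/s)

def peak_spacing_us(radius_m):
    """Expected delay between the two main IR peaks of a spherical dish.

    Assumes peak 1 comes from the axial reflection off the concave surface and
    peak 2 from the double-bounce retroreflecting path, which is
    2 * (sqrt(2) - 1) * r longer.
    """
    return 2.0 * (math.sqrt(2.0) - 1.0) * radius_m / C * 1e6

for r_mm in (32, 35, 50, 70):
    print(f"r = {r_mm} mm -> peak1-2 spacing ~ {peak_spacing_us(r_mm / 1000):.0f} us")
# prints roughly 77, 85, 121, and 169 us, close to the reported 78, 82, 122, and 171 us
```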
Reflector Detection in Clutter.
The proposed mechanism of reflector recognition by bats assumes that the spatially invariant echo features of the dish-shaped floral reflector contrast with the vegetation echoes (23, 24). The latter are stochastic and highly variable and change whenever the bat changes its position (31–33). An echolocating bat passing by a flowering Marcgravia plant receives the same echo information from the dish-shaped floral reflector again and again, whereas all of the other reflections coming from the surrounding vegetation change with every call due to the stochastic nature of the background vegetation. As bats integrate information acquired from multiple calls (34), the constant echo features of the reflector should accumulate in the acoustic image and stand out among the multitude of changing clutter echoes. Although this mechanism would be straightforward for bats to exploit, it has never been proven to be effective through robotic experimentation or simulation studies. In a first experiment we tested whether it is possible to mimic this mechanism of reflector recognition with a technical sonar setup. We used a sonar head consisting of a broadband electromechanical film (EMFi) transducer (35, 36) and a .25-inch condenser microphone, which were fixed next to each other in an aluminum body. This sonar head was moved forward in 10-cm steps. At each position, the sonar head made scan movements and collected 51 echo sequences. We installed a reflector within artificial clutter (as described in ref. 24) on the right side of the movement path of the sonar head (Fig. 3A), ensonified this scene with a broadband sweep, and calculated the cross-correlation functions. In the directional plots of the cross-correlation functions the reflector pattern is clearly visible, but so are many other reflections from the clutter screen and from the surrounding tables and tripods (Fig. 3 B–E). To extract the relevant reflector information and remove the noise, we then analyzed all amplitude peaks of the reflected signal and searched for the peak1–2 pattern that is characteristic for this reflector (Fig. 3F). As we used a reflector with a radius of 32 mm, we searched for a peak1–2 spacing of 78 µs, Δt = 2(√2 − 1)r/c, where r is the radius and c is the speed of sound (25). We calculated the probability that a detected peak pattern was the reflector's pattern and plotted this probability at the respective position in a 3D voxel map (Fig. 3G; see SI Appendix for the detailed algorithm). Then, while moving the sonar head forward, more measurements were collected, always searching for the reflector-like peak pattern and updating the probability of the presence of a reflector in the voxel map (Fig. 3 G–J). To account for uncertainty in positioning, we smoothed the voxel map between measurements by convolving it with a 2-dimensional (2D) Gaussian filter kernel. With every measurement, the maximum at the reflector position in the voxel map grew in importance as it remained consistently at the same position. False positives would occur whenever peak patterns similar to the reflector peak pattern were recognized (Fig. 3I), but, as they occurred only in some measurements, they rapidly died down again. Only the peak pattern of the spherical reflector was received repeatedly. After 3 to 5 measurements, depending on the distance and the clutter present, the true maximum in the voxel map became clear and the reflector position could be determined.
This demonstrates that bioinspired sonar reflectors can indeed be recognized even within acoustically complex scenarios. It also shows that the perceptual mechanism proposed for bats, that is, building up acoustic images by summing peaks in the cross-correlation functions of the received echoes (37), is functional and can be simulated with a technical sonar setup.
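A compact sketch of the accumulation idea described above is given below; it loosely follows the description in the text (peak-pair matching, evidence accumulation in a voxel map, Gaussian smoothing between measurements), but the thresholds, tolerances, grid size, and evidence weighting are illustrative placeholders; the exact algorithm is given in SI Appendix.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.signal import find_peaks

C = 343.0        # speed of sound (m/s)
FS = 500_000     # sample rate (Hz)
CELL = 0.02      # edge length of a map cell (m), illustrative value

def reflector_candidates(ir_env, spacing_us=78.0, tol_us=8.0, threshold=0.1):
    """Echo delays (s) whose peak pairs match the reflector's peak1-2 spacing."""
    peaks, _ = find_peaks(ir_env, height=threshold)
    hits = []
    for i, p1 in enumerate(peaks):
        for p2 in peaks[i + 1:]:
            if abs((p2 - p1) / FS * 1e6 - spacing_us) <= tol_us:
                hits.append(p1 / FS)          # delay of the first peak of the pair
    return hits

def update_map(vox, sensor_xy, beam_angle, ir_env):
    """Accumulate reflector evidence into a 2D grid map, then smooth it."""
    for delay in reflector_candidates(ir_env):
        rng = delay * C / 2.0                 # two-way travel time -> range
        x = sensor_xy[0] + rng * np.cos(beam_angle)
        y = sensor_xy[1] + rng * np.sin(beam_angle)
        ix, iy = int(x / CELL), int(y / CELL)
        if 0 <= ix < vox.shape[0] and 0 <= iy < vox.shape[1]:
            vox[ix, iy] += 1.0                # evidence that a reflector sits here
    # Smoothing between measurements tolerates small positioning errors and
    # spreads out spurious detections, so only repeated hits keep growing.
    return gaussian_filter(vox, sigma=1.0)

vox_map = np.zeros((200, 200))                # 4 x 4 m area at 2-cm resolution
```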
Fig. 3.
Reflector recognition with a biomimetic sonar head in cluttered surroundings. (A) Sketch of the experimental setup. The sonar head was moved stepwise (10 cm) forward along the red line in the direction of the clutter screen. At each position, marked with a red dot, the sonar head scanned the environment from −90° to 90° in 3.6° steps, amounting to 51 single measurements per scan. (B–E) Directional IR pattern acquired from the scans at different distances from the clutter screen: (B) 110 cm, (C) 80 cm, (D) 50 cm, and (E) 20 cm. The peaks in the IRs are plotted in white, and the red arrow indicates the start of the peaks originating from the reflector. The blue line in D shows the position of the IR that is depicted in F. (F) Single IR, shown as an example for a distance of 50 cm from the reflector. The red arrow here also indicates the start of the peaks originating from the reflector surface. The red line depicts the peak detection threshold above which the peak positions were determined. (G–J) Three-dimensional representation of the voxel-based range maps, which are based on the probability of the presence of a certain peak separation that is characteristic for the reflector. With more and more measurements, the probability peak at the reflector position becomes higher and sharper, which makes the estimate of the reflector position more precise. Note that there are also some false assignments (small peaks) visible in graph I. The gray line indicates the path of the sonar head.
Robot Guidance.
To quickly acquire sonar information over a large area and search for the different reflectors without any scanning movement, we developed a sonar array with 2 omnidirectional microphones and 14 directional, broadband EMFi transducers (see SI Appendix, Figs. S9–S11 and description in SI Appendix, Supplementary Information Text). With this setup, it was possible to perform an electronic scan, instantaneously build up a sonar-based voxel map of the environment, and search for sonar reflectors. We equipped a robot (Pioneer P3-DX; Fig. 4 D and E, top image inlay) with the sonar array and guided it based only on reflector information. For capturing ground-truth data, the robot was also equipped with a 2D laser scanner (Hokuyo URG), which was used to build a map of the surroundings. The map was constructed using the FastSLAM 2.0 algorithm implemented in the Robot Operating System (ROS) software framework. In subsequent experiments, the robot was localized in the generated map using adaptive Monte Carlo localization, also implemented in ROS (Fig. 4 D and E). This map was used only for ground-truth capture; that is, the robot's control algorithm had no access to this information. We repeated the clutter detection experiment, but this time the autonomous robot had to search for a reflector within clutter and, once it was found, turn toward it (see trajectories in Fig. 4D and Movie S1). The reflector detection algorithm performed well even though the information from the sonar array was sparser: 14 measurements per scan compared with 51 measurements with the scanning/moving sonar head. However, for reliable reflector detection, movement and the integration of several measurements from different distances and directions remained crucial.
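As an illustration of the search behavior (finding the reflector and turning toward it), a minimal decision step might look as follows; the evidence threshold, cell size, and the way the voxel map is indexed are assumptions, and on the actual robot the resulting turn command was issued through ROS.

```python
import numpy as np

def reflector_bearing(vox_map, robot_xy, robot_yaw, cell=0.02, min_evidence=5.0):
    """Bearing (rad, relative to the robot's heading) of the strongest map cell.

    Returns None while no cell has accumulated enough evidence, so the robot
    keeps driving straight until a reflector has been confirmed repeatedly.
    """
    ix, iy = np.unravel_index(np.argmax(vox_map), vox_map.shape)
    if vox_map[ix, iy] < min_evidence:
        return None
    dx = ix * cell - robot_xy[0]
    dy = iy * cell - robot_xy[1]
    return np.arctan2(dy, dx) - robot_yaw

# e.g.: bearing = reflector_bearing(vox_map, (0.0, 2.0), 0.0)
#       if bearing is not None: turn toward it (on the robot, via a ROS velocity command)
```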
Fig. 4.
Robot guidance experiment with a set of bioinspired sonar reflectors and a specialized sonar array. The robot was guided with reflectors of different sizes, which transmitted different commands encoded by the characteristic peaks in their IRs. The peaks are indicated by orange and red arrows, and the green bar indicates the characteristic peak1–2 spacing. (A) IR of a green reflector (r = 35 mm, depth = 25 mm; light on). (B) IR of a blue reflector (r = 50 mm, depth = 30 mm; turn 30° away from the reflector). (C) IR of an orange reflector (r = 70 mm, depth = 49 mm; stop at 20-cm distance). (D) Image and laser-generated map for the reflector search experiment. The reflector was placed within a clutter screen, and the robot had to find it and turn toward it. The trajectories of 4 different runs are depicted in the laser range map. In the map, the positions of the reflectors are indicated by the green, blue, and orange shapes. For the blue trajectory, the reflector and clutter screen (also in blue) were placed on the left side. (E) Image and laser-generated map for the guidance experiment. The trajectories of 3 runs are depicted in the laser range map in different colors. (F–I) Examples of voxel maps during the approach of the 4 different reflectors as indicated in B. The light crosses indicate the actual position of the reflectors. The red line gives the path of the robot; as the turning of the robot was not incorporated into the voxel maps, this line appears to be straight.
Information-Encoding Reflectors and Simultaneous Reflector Detection.
In a second experiment we tested whether it is possible to detect different sonar reflectors presented simultaneously and to associate specific motion commands with reflectors of different sizes. The different sizes of the reflectors resulted in slightly different reflection pathways, which shifted the 2 main amplitude peaks further apart; nevertheless, the peak1–2 spacing remained constant for a given reflector over different angles of sound incidence. We used 3 different sizes of reflectors with radii of 35 mm, 50 mm, and 70 mm, which had peak1–2 spacings of 82 µs, 122 µs, and 171 µs, respectively (Fig. 4 A–C). The reflector-recognition algorithm could then be tuned to the peak patterns of the different reflector forms, which makes it possible to recognize multiple reflectors within the surroundings. We used the reflector shapes as a kind of traffic sign and guided the robot through a laboratory environment. A green reflector (r = 35 mm) transmitted the command “turn light on,” and as soon as it was detected a light-emitting diode on top of the array was switched on. The blue reflector (r = 50 mm) transmitted the command “turn 30° away,” which initiated a movement in the opposite direction of the reflector. The orange reflector (r = 70 mm) caused the robot to stop at a certain distance from the reflector. Note that the different colors were used only to help us distinguish the different reflectors and were not used for visual guidance. As an example, we plotted 3 voxel-based range maps for each reflector to show how the range maps built up while the robot moved forward through the laboratory (Fig. 4 F–I). As soon as a reflector was detected, a small peak appeared in the voxel map, which became more prominent with more and more measurements from different distances and positions. The positioning of the reflectors was accurate (compare light crosses and green and blue voxels in Fig. 4 F and G) but drifted slightly because of odometry errors after turns and longer travel distances (Fig. 4 H and I). However, the identified landmarks could be used in the classic landmark-based simultaneous localization and mapping (SLAM) formulation to reduce the buildup of positioning errors due to odometry (38). We also performed a quantitative assessment of our system with a linear guide steered by the set of reflectors described above and measured how many errors were made during runs at 3 different speeds (25 runs per speed). We found that from a speed of 19 mm/s upward the system ran without any errors (SI Appendix). In general, the speeds were not very high; however, the system could easily be sped up if the computation of the range maps were streamlined and coded sonar sequences were used, which would allow the simultaneous operation of all 14 loudspeakers (39).
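In effect, the reflectors form a small lookup table from peak1–2 spacing to motion command. A minimal sketch of such a mapping, with the spacings taken from the text and a tolerance window chosen only for illustration:

```python
# Peak1-2 spacings (in microseconds) of the three reflector sizes and the
# commands they encode (spacings as reported in the text).
REFLECTOR_COMMANDS = {
    82.0: "light_on",       # green reflector,  r = 35 mm
    122.0: "turn_away_30",  # blue reflector,   r = 50 mm
    171.0: "stop_at_20cm",  # orange reflector, r = 70 mm
}
TOLERANCE_US = 10.0  # matching window, an illustrative value

def command_for_spacing(measured_us):
    """Map a measured peak1-2 spacing to a motion command (None if no match)."""
    best = min(REFLECTOR_COMMANDS, key=lambda s: abs(s - measured_us))
    return REFLECTOR_COMMANDS[best] if abs(best - measured_us) <= TOLERANCE_US else None

print(command_for_spacing(125.4))  # -> turn_away_30
print(command_for_spacing(60.0))   # -> None
```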
Discussion
Our experiments show that sonar beacons inspired by natural flower reflectors can indeed be detected in cluttered surroundings using the envelope of the environment's IR derived from bat-like broadband calls. Such a representation is biologically feasible to generate from a cochlear representation followed by a dechirping operation [which equates to a semicoherent matched filter operation (40)]. We also demonstrate that an autonomous system can be reliably guided with a set of bioinspired reflectors, a specialized sonar array mimicking the scanning movements of bats, and an algorithm tuned to different reflector shapes. For the whole system/algorithm, movement is crucial, as it causes the ensonification angles of the objects in the surroundings to change, leading to a changed peak pattern for all objects except the reflectors. This makes the reflector detection process very selective, as shown by the fact that no ambiguous peaks are visible in the range maps (Fig. 4 F–I), although there were walls, tripods, and tables in the surroundings. Only the reflectors show up, because their peak pattern remains constant and can be detected repeatedly in the IRs of the environment. Comparing the reflector designs proposed here with the underwater reflector designs referred to above, we note, first, that for airborne sonar we cannot make use of multiple internal layers of materials with different reflective properties to generate a characteristic echo pattern (13, 15) because of the large mismatch between the acoustic impedances of air and solid materials. Virtually no sound will penetrate into the reflector, and thus the approach of layering materials with different reflective properties cannot be used for sonar landmarks on land. Second, none of the underwater studies show that their sonar landmarks can be perceived in clutter-rich surroundings, as they are always used in open water or on the seabed (12, 14). In our approach, by contrast, where we take inspiration from nature in designing sonar landmarks, we provide experimental evidence that passive in-air sonar landmarks can indeed be perceived in dense clutter. Possible applications of our proposed method can be found in level-4 autonomy: By augmenting the environment with our beacons we can implement infrastructure-to-vehicle (I2V) communication, which can facilitate applications such as automated parking, the steering of service robots, or the guidance of straddle-carrier systems and robots in mines. Sonar sensors communicating with bioinspired sonar landmarks could serve as important backup systems that complement visual sensors, as they also work under visually challenging conditions such as dark, glass-rich, foggy, or dusty environments. Next-generation reflectors could also be asymmetrical, mimicking even more closely the form of the original Marcgravia leaves. These would have a slightly changing yet predictable echo pattern, which would allow estimation of the mobile agent's orientation with respect to the reflector, making it easier to use the reflectors as waypoints. Furthermore, by transferring these ideas of reflector construction and detection to the electromagnetic domain, one could implement I2V communication for radar systems, which could be useful in autonomous driving as well as autonomous shipping.
Materials and Methods
Leaf and Reflector Ensonification.
The echoes of the original flowers and leaves (Fig. 1 D and E) were measured during an earlier project. We ensonified leaves with a sonar head from a distance of 20 cm with a continuously replayed maximum length sequence (MLS) signal of 16,383 samples. We recorded the reflected signal and obtained the IRs by deconvolution of the reflected echo with the original MLS. The leaves were measured from 91 different angular positions on the azimuthal plane around their opening in steps of 2°. We conducted the measurements of the bioinspired reflectors in a similar way, but we used a sweep signal (180 kHz to 30 kHz, 1 ms) instead of an MLS. We selected the sweep signal because it produced slight differences in the waveform of the IRs, that is, the peaks were higher and the signal-to-noise ratio was better, making it favorable to use the cross-correlation of the echo and the sweep. Therefore, broadband sweeps were used for the measurements and characterization of the reflected signal as well as for the reflector detection experiments. The ensonification of the reflectors was done with an automated setup in which the reflectors were turned with a stepping motor in 1.8° steps, amounting to 101 measurements. The spectral directional echo patterns in Figs. 1 and 2 were obtained by windowing the IRs (1,024 samples) and calculating the power spectral density (PSD). To obtain the spectral target strength independent of the frequency response of the loudspeaker, we calculated the difference between the PSD of the reflector and the PSD of an acrylic glass plate oriented perpendicular to the direction of sound propagation at exactly the same position as the reflector. This was done in a similar way for the measurements of the original flower reflectors.
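The target-strength calculation described above amounts to windowing the measured IR, computing its PSD, and subtracting the PSD of the reference plate measured at the same position (in dB). A rough sketch of this step is given below; the PSD estimator, segment length, sample rate, and the assumption that the echo onset is known are our own choices, not the original analysis code.

```python
import numpy as np
from scipy.signal import welch

FS = 500_000   # sample rate (Hz), assumed here; the robot recordings used this rate
NWIN = 1024    # IR window length, as used for the spectral plots

def spectral_target_strength(ir_reflector, ir_reference, onset):
    """Target-strength spectrum of a reflector relative to a flat reference plate.

    Both IRs are windowed (1,024 samples) starting at the echo onset; taking the
    difference of the PSDs in dB removes the loudspeaker's frequency response.
    `onset` is the sample index of the echo, e.g. taken from the envelope maximum.
    """
    seg_refl = ir_reflector[onset:onset + NWIN]
    seg_ref = ir_reference[onset:onset + NWIN]
    f, psd_refl = welch(seg_refl, fs=FS, nperseg=256)
    _, psd_ref = welch(seg_ref, fs=FS, nperseg=256)
    ts_db = 10.0 * np.log10((psd_refl + 1e-20) / (psd_ref + 1e-20))
    return f, ts_db
```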
Clutter Detection Experiment.
To determine whether the bioinspired reflectors could be recognized in clutter-rich surroundings, we used a custom-built sonar head consisting of a .25-inch condenser microphone (40BF, preamplifier 26AB, power module 12AA; G.R.A.S. Sound & Vibration) and a single-layered, custom-made EMFi loudspeaker (see SI Appendix, Fig. S9 for directionality and frequency response). The speaker and the microphone were embedded in an aluminum body, which was mounted on a stepping motor, which itself was mounted on a tripod. We started the first measurement at a distance of 130 cm from the reflector and moved the tripod manually forward in steps of 10 cm. At each position the sonar head scanned the surroundings by rotating through 180° in 3.6° steps, which amounted to 51 measurements. The artificial clutter consisted of 126 randomly inclined round plastic plates with a diameter of 38 mm. We ensonified this scene with a broadband 5-ms, downward-modulated sweep (160 to 30 kHz) and calculated the cross-correlation function of the recorded sequence. The reflector detection algorithm was the same as the one used for the array signal processing (discussed in the following paragraph).
Array Signal Processing and Voxel-Map Generation.
To be able to quickly acquire sonar information for different directions without any rotational movement, we developed an array of loudspeakers combined with 2 omnidirectional microphones (.25-inch free-field microphone 40BF, preamplifier 26AB, power module 12AA; G.R.A.S. Sound & Vibration). The sensor array consisted of 14 ultrasonic loudspeakers constructed around double-layered EMFi transducer materials (see SI Appendix, Fig. S9 for directionality and frequency response). Each transducer was oriented in a different direction, with successive transducers inclined by 5° relative to each other, together covering an angle of 70° (SI Appendix, Fig. S10). As the emitted signal, we used a linear frequency sweep from 160 kHz to 30 kHz with a duration of 6 ms. The 2 microphones each recorded an acoustic time-pressure signal with a duration of 20 ms. See SI Appendix for more information on array signal processing and the detailed algorithm for the voxel-map generation.
The Robot System.
The robot system was built around an Intel NUC, which ran a custom LabVIEW program to process the data, implementing the algorithm detailed in SI Appendix. An NI USB-6361 DAQ device generated the ultrasonic signal, which was sent to an amplifier (Ultrachecker, custom-built by Michael Günther, Chair of Sensor Technology, Erlangen University, Germany) that produced a signal of 200-V amplitude. This signal was fed to a custom-made high-voltage demultiplexer, which could route the amplified signal to any 1 of the 14 emitter channels. The routing was controlled by the Intel NUC through the digital outputs of the NI DAQ device (SI Appendix, Fig. S11).
The reflected sound was captured by 2 G.R.A.S. microphones (40BF with preamplifier 26AB; G.R.A.S. Sound & Vibration), whose signals were amplified by a low-noise amplifier (power module 12AA; G.R.A.S. Sound & Vibration). The amplified signals were digitized by the NI DAQ device at a sample rate of 500 kHz. The robot was a Pioneer P3-DX experimental robot platform, a nonholonomic robot with a front-wheel skid-steer drivetrain. The robot ran ROS. Ground-truth data about the robot's path were captured through the combination of the wheel encoders and the laser scanner (Hokuyo UBG-04LX-F01) in a SLAM application. We used the GMapping module available in the ROS software framework.
Data and Code Availability.
Data and code described in the paper are publicly available at Zenodo.
Acknowledgments
We thank Wouter Halfwerk, Susan McGrath, Estefania Velilla, and Andrew Cronin for valuable suggestions on the manuscript; Reinhard Lerch and Peter Ploss for stimulating discussions; Manuel Weiß for programming support; and Christian Hoffmann, Tayyab Saeed, Shu Ju, Girmi Schouten, and Dennis Laurijssen for help during several parts of the experiments. This work was supported by Volkswagen Foundation grant Az. 89 111.
Footnotes
The authors declare no competing interest.
This article is a PNAS Direct Submission.
Data deposition: Data and code described in the paper are publicly available at https://github.com/GitRaSimon/BioSoR.
This article contains supporting information online at https://www.pnas.org/lookup/suppl/doi:10.1073/pnas.1909890117/-/DCSupplemental.
References
- 1. Siciliano B., Khatib O., Springer Handbook of Robotics (Springer, 2016).
- 2. Thrun S., “Finding landmarks for mobile robot navigation” in Proceedings International Conference on Robotics and Automation (IEEE, Piscataway, NJ, 1998), pp. 958–963.
- 3. Magnago V., et al., “Optimal landmark placement for indoor positioning using context information and multi-sensor data” in 2018 IEEE International Instrumentation and Measurement Technology Conference (I2MTC) (IEEE, Piscataway, NJ, 2018), pp. 1–6.
- 4. Prasser D., Wyeth G., “Probabilistic visual recognition of artificial landmarks for simultaneous localization and mapping” in IEEE International Conference on Robotics and Automation (IEEE, Piscataway, NJ, 2003), pp. 1291–1296.
- 5. Kim D., Lee D., Myung H., Choi H.-T., Artificial landmark-based underwater localization for AUVs using weighted template matching. Intell. Serv. Robot. 7, 175–184 (2014).
- 6. Cireşan D., Meier U., Masci J., Schmidhuber J., Multi-column deep neural network for traffic sign classification. Neural Netw. 32, 333–338 (2012).
- 7. Gutmann J.-S., Culp K., Munich M. E., Pirjanian P., “The social impact of a systematic floor cleaner” in 2012 IEEE Workshop on Advanced Robotics and its Social Impacts (ARSO) (IEEE, Piscataway, NJ, 2012), pp. 50–53.
- 8. Scheding S., Dissanayake G., Nebot E. M., Durrant-Whyte H., An experiment in autonomous navigation of an underground mining vehicle. IEEE Trans. Robot. Autom. 15, 85–95 (1999).
- 9. Browne A. F., Vutetakis D., “Initial implementation of a novel laser localization system” in 2016 IEEE/ION Position, Location and Navigation Symposium (PLANS) (IEEE, Piscataway, NJ, 2016), pp. 938–941.
- 10. Durrant-Whyte H., An autonomous guided vehicle for cargo handling applications. Int. J. Robot. Res. 15, 407–440 (1996).
- 11. Durrant-Whyte H., Pagac D., Rogers B., Stevens M., Nelmes G., Field and service applications-an autonomous straddle carrier for movement of shipping containers-from research to operational autonomous systems. IEEE Robot. Autom. Mag. 14, 14–23 (2007).
- 12. Brignone L., Perrier M., Viala C., “A fully autonomous docking strategy for intervention AUVs” in OCEANS 2007-Europe (IEEE, Piscataway, NJ, 2007), pp. 1–6.
- 13. Satish A., Nichols B., Trivett D., Sabra K. G., Passive underwater acoustic markers for navigation and information encoding for high frequency sound navigation and ranging (SONAR) devices. J. Acoust. Soc. Am. 142, 2731 (2017).
- 14. Srivastava P., Nichols B., Sabra K. G., Passive underwater acoustic markers using Bragg backscattering. J. Acoust. Soc. Am. 142, EL573–EL578 (2017).
- 15. Islas-Cital A., Atkins P., Gardner S., Tiltman C., Performance of an enhanced passive sonar reflector SonarBell: A practical technology for underwater positioning. Underwat. Technol. 31, 113–122 (2013).
- 16. Barshan B., Kuc R., Differentiating sonar reflections from corners and planes by employing an intelligent sensor. IEEE Trans. Pattern Anal. Mach. Intell. 12, 560–569 (1990).
- 17. Steckel J., Boen A., Peremans H., Broadband 3-D sonar system using a sparse array for indoor navigation. IEEE Trans. Robot. 29, 161–171 (2013).
- 18. Steckel J., Peremans H., Acoustic flow-based control of a mobile platform using a 3D sonar sensor. IEEE Sens. J. 17, 3131–3141 (2017).
- 19. Kleeman L., Kuc R., Mobile robot sonar for target localization and classification. Int. J. Robot. Res. 14, 295–318 (1995).
- 20. Eliakim I., Cohen Z., Kosa G., Yovel Y., A fully autonomous terrestrial bat-like acoustic robot. PLOS Comput. Biol. 14, e1006406 (2018).
- 21. Kroh P. K., Simon R., Rupitsch S. J., Classification of sonar targets in air: A neural network approach. Sensors (Basel) 19, E1176 (2019).
- 22. von Helversen D., Holderied M. W., von Helversen O., Echoes of bat-pollinated bell-shaped flowers: Conspicuous for nectar-feeding bats? J. Exp. Biol. 206, 1025–1034 (2003).
- 23. von Helversen D., von Helversen O., Acoustic guide in bat-pollinated flower. Nature 398, 759–760 (1999).
- 24. Simon R., Holderied M. W., Koch C. U., von Helversen O., Floral acoustics: Conspicuous echoes of a dish-shaped leaf attract bat pollinators. Science 333, 631–633 (2011).
- 25. Simon R., Holderied M. W., von Helversen O., Size discrimination of hollow hemispheres by echolocation in a nectar feeding bat. J. Exp. Biol. 209, 3599–3609 (2006).
- 26. von Helversen D., Object classification by echolocation in nectar feeding bats: Size-independent generalization of shape. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 190, 515–521 (2004).
- 27. Goerlitz H. R., Geberl C., Wiegrebe L., Sonar detection of jittering real targets in a free-flying bat. J. Acoust. Soc. Am. 128, 1467–1475 (2010).
- 28. Knörnschild M., Glöckner V., von Helversen O., The vocal repertoire of two sympatric species of nectar-feeding bats (Glossophaga soricina and G. commissarisi). Acta Chiropt. 12, 205–215 (2010).
- 29. Simon R., et al., Biosonar resolving power: Echo-acoustic perception of surface structures in the submillimeter range. Front. Physiol. 5, 64 (2014).
- 30. Siemers B. M., Schnitzler H. U., Echolocation signals reflect niche differentiation in five sympatric congeneric bat species. Nature 429, 657–661 (2004).
- 31. von Helversen D., von Helversen O., Object recognition by echolocation: A nectar-feeding bat exploiting the flowers of a rain forest vine. J. Comp. Physiol. A Neuroethol. Sens. Neural Behav. Physiol. 189, 327–336 (2003).
- 32. Yovel Y., Franz M. O., Stilz P., Schnitzler H. U., Plant classification from bat-like echolocation signals. PLOS Comput. Biol. 4, e1000032 (2008).
- 33. Yovel Y., Stilz P., Franz M. O., Boonman A., Schnitzler H. U., What a plant sounds like: The statistics of vegetation echoes as received by echolocating bats. PLOS Comput. Biol. 5, e1000429 (2009).
- 34. Moss C. F., Surlykke A., Auditory scene analysis by echolocation in bats. J. Acoust. Soc. Am. 110, 2207–2226 (2001).
- 35. Rupitsch S. J., Lerch R., Strobel J., Streicher A., Ultrasound transducers based on ferroelectret materials. IEEE Trans. Dielectr. Electr. Insul. 18, 69–80 (2011).
- 36. Rupitsch S. J., Piezoelectric Sensors and Actuators: Fundamentals and Applications (Springer-Verlag, Berlin, 2018).
- 37. Simmons J. A., A view of the world through the bat’s ear: The formation of acoustic images in echolocation. Cognition 33, 155–199 (1989).
- 38. Thrun S., Burgard W., Fox D., Probabilistic Robotics (Intelligent Robotics and Autonomous Agents, The MIT Press, 2005).
- 39. Steckel J., Sonar system combining an emitter array with a sparse receiver array for air-coupled applications. IEEE Sens. J. 15, 3446–3452 (2015).
- 40. Wiegrebe L., An autocorrelation model of bat sonar. Biol. Cybern. 98, 587–595 (2008).