Microsystems & Nanoengineering
2026 Mar 11;12:91. doi: 10.1038/s41378-026-01190-8

Complementary visual localization and tactile mapping approach for robotic perception of millimeter-sized objects with irregular surfaces

Jaehwan Jang 1,#, Byeong-Sun Park 2,#, Kyeong Taek Oh 1, Seong-Jae Yoo 1, Seong-Min Im 1, Yasser Khan 3,4, Min-gu Kim 1,5
PMCID: PMC12979585  PMID: 41813644

Abstract

Humanoid robots and human-machine interaction technologies are essential for perceiving and manipulating millimeter-scale objects with irregular surfaces in extreme environments, such as outer space, radioactive zones, and hazardous sites with explosive ordnance, where human access is restricted. A vision-based perception approach provides spatial and positional information about objects but relying solely on it for robot manipulation poses challenges due to limitations in detectable object size, as well as sensitivity to external factors such as focusing issues, occlusion, and lighting conditions. In contrast, tactile perception offers valuable information about aspects that are difficult to discern visually, including an object’s shape, surface characteristics, and the forces involved during contact. This study presents a complementary visual localization and tactile mapping framework that allows robots to effectively perceive small objects with irregular surfaces in visually restricted environments. The proposed method draws inspiration from the sequential vision-tactile sensory processing observed in humans when handling small objects with irregular surfaces. It employs an RGB-Depth camera for visual perception and a soft pressure sensor array, made using inkjet printing, for tactile perception. We demonstrate the feasibility of implementing a sensory substitution to detect the size and location of objects through visual perception, as well as identify object surfaces and reconstruct their three-dimensional profiles using tactile scanning, particularly in environments where visual information is limited. This study provides a technological foundation for enhancing the autonomy and adaptability of humanoid robots in unpredictable and unstructured environments, particularly to support precise robot manipulation in such conditions.


Subject terms: Electrical and electronic engineering, Structural properties


Advancements in humanoid robotics and human–machine interaction have revolutionized dexterity and adaptability, enabling human-like performance1–3. Specifically, humanoid robots have attracted significant attention in extreme environments where human access is restricted, including space exploration, nuclear accident response, and explosive ordnance disposal4–6. Such robots are designed to perceive and explore their surroundings like humans and perform complex tasks such as tightening screws and pressing buttons. To achieve this, recent research in robotics increasingly draws on human sensory modalities such as vision and touch7–10. By leveraging such sensory information, robots can achieve a comprehensive understanding of their environment, accurately perceive the properties of objects, and ultimately perform precise manipulation tasks. To ensure reliable perception in unstructured and unpredictable environments, it is essential for robots to use visual and tactile information in a mutually complementary manner, so that each modality can substitute for the other depending on the situation, enabling robust and precise robotic manipulation.

Traditionally, humanoid robot technologies have predominantly relied on vision-based approaches11,12. Advances in deep learning and three-dimensional (3D) data processing have facilitated vision-based object recognition and segmentation, leading to the widespread adoption of cameras and other optical devices as primary tools in robot systems13,14. Vision-based technologies employ various sensor configurations, including red–green–blue (RGB), RGB-Depth, and light detection and ranging (LiDAR), to accurately capture object shape and position, offering significant advantages in shape analysis and recognition15–17. However, these technologies are susceptible to environmental factors such as focusing issues, occlusion, and lighting conditions, which can hinder precise object detection and pattern analysis18,19. These limitations can arise suddenly, especially in unstructured environments. As a result, it becomes difficult to recognize small objects with complex structural patterns or respond immediately to changes, thereby limiting the effectiveness of these robots in precision tasks for micromanipulation.

Touch is a primary sense that humans use to compensate for limited visual information or to obtain information that cannot be acquired through vision alone, making it highly suitable for application in humanoid robots. For example, tactile sensing can provide information regarding object shape or protrusions, in addition to visual information20–24. To meet these demands, tactile sensors should provide a wide sensing range, high spatial resolution, and resistance to mechanical deformation artifacts. However, many existing flexible tactile sensors do not yet fully satisfy these requirements, which poses challenges in tasks that demand fine tactile perception, particularly in distinguishing the geometric and structural details of millimeter-scale objects with irregular surfaces25–28. Under such performance limitations, humanoid robots equipped with these tactile sensors often obtain only limited information, such as simple contact detection or pressure localization, when interacting with objects29,30. As a result, it becomes difficult to acquire finely detailed shape information, including local geometry, microstructures, and small protrusions, which is essential for precise perception. Moreover, tactile sensing alone has inherent limitations in unstructured and unpredictable environments. Although tactile sensing is sensitive to surface properties, it is relatively limited in identifying a target’s global position, orientation, and the overall structure of the surrounding environment. Therefore, precise perception requires additional sensory support, such as vision, and more importantly, coordination between senses that can compensate for one another when visual information becomes restricted or incomplete. Nevertheless, current research still tends to focus primarily on single sensory modalities, with limited exploration into complementary multimodal strategies that can compensate or substitute when one modality is constrained31–38.

To overcome these limitations, this study proposes a complementary visual localization and tactile mapping approach, enabling robots to effectively perceive small objects with irregular surfaces, particularly in environments where visual information is limited. This approach mimics the human sequential visual–tactile perception process. Specifically, a single RGB-Depth camera, which imitates human vision, is used to recognize the spatial and positional information of objects. When visual information is restricted due to environmental factors, a soft tactile sensor array, fabricated using inkjet printing techniques with soft polymer-based materials, is used to detect fine shape and surface information of the object, thereby complementing or substituting the visual input (Fig. 1). In addition, by employing a tactile scanning-based fine surface reconstruction technique, this study provides a technical foundation for extending the sensory capabilities of humanoid robots to support precise robot manipulation.

Fig. 1. Complementary visual localization and tactile mapping approach for robotic manipulation of millimeter-sized objects with irregular surfaces.

Fig. 1

Conceptual illustration of the complementary visual localization and tactile mapping framework, inspired by the human sensory complementation process, enabling cooperative vision–tactile sensing for micromanipulation

Results and Discussion

Tactile perception for millimeter-sized objects with irregular surfaces

Figure 2a presents a schematic illustrating the deposition of deformable metallic materials using the inkjet printing technique employed in this study. Inkjet printing has emerged as a promising fabrication method for electronic devices, offering several advantages over conventional electrode-patterning techniques39–41. In contrast to traditional methods, this approach enables rapid, high-resolution electronic device patterning on large-area substrates without the need for cleanroom conditions or masking. Additionally, inkjet printing allows the deposition of soft, flexible polymer metal composite materials, while offering high design-fabrication adaptability, scalability to large areas, and the ability to tune electrode thickness and sensing performance through repeated additive electrode deposition (Table S1). The fabrication process is described in detail in the Methods section. Figure 2b demonstrates that the inkjet printing process, utilizing additive printing, enables the deposition of metallic materials with a thickness of up to 20% of the designed line width. Using the additive printing technique to deposit thick metallic layers reduces the electrical resistance of the electronic device (Figures S1 and S2). Furthermore, the results presented in Figures S3 and S4 indicate that additive printing lowers the electrode line resistance and mitigates leakage currents and signal interference, thereby enhancing overall device performance.
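The resistance benefit of additive printing follows directly from the trace geometry: for a rectangular trace, R = ρL/(w·t), so each printing pass that adds thickness t lowers R proportionally. The sketch below illustrates this relationship with illustrative values only (bulk silver resistivity and an assumed ~10 µm per pass), not the paper's measured parameters.

```python
# Sketch: why additive printing lowers trace resistance (R = rho * L / A).
# Values below are illustrative, not the paper's measured parameters.

RHO_AG = 1.59e-8  # bulk silver resistivity, ohm*m (printed ink is higher)

def trace_resistance(length_m, width_m, thickness_m, rho=RHO_AG):
    """DC resistance of a rectangular printed trace."""
    return rho * length_m / (width_m * thickness_m)

# 40 mm x 0.5 mm trace, thickness grows with each additive printing pass
for passes in (1, 2, 3):
    t = passes * 10e-6  # assume ~10 um added per additive pass (hypothetical)
    r = trace_resistance(40e-3, 0.5e-3, t)
    print(f"{passes} pass(es): R = {r * 1e3:.1f} mOhm")
```

Because resistance scales inversely with thickness, three passes would cut the line resistance to a third of the single-pass value under these assumptions, consistent with the trend reported in Figures S1 and S2.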

Fig. 2. Inkjet printing process and functional evaluation of deformable metallic materials for tactile sensing applications.

Fig. 2

a Schematic illustration of the deposition process for deformable metallic materials using inkjet printing. b Cross-sectional profile of the printed metallic traces. c Electrical resistance variations under applied pressure depending on the substrate type. d Deformation and height change of the electrode structure under vertical loading

The functionality of deformable metallic materials and substrates was evaluated by fabricating a deformable Ag trace of width 0.5 mm and length 40 mm. Figure 2c presents the resistance variations measured under applied pressure as a function of the substrate type. When deformable metallic material is deposited on a rigid substrate, the electrical properties exhibit minimal changes under external pressure. In contrast, when fabricated on a soft elastomeric substrate such as styrene–ethylene–butylene–styrene (SEBS), the electrical properties undergo significant variations due to the applied pressure and tensile stress induced by the mechanical deformation of the substrate. A 2 × 2 mm electrode was fabricated on the SEBS substrate to assess electrode deformation under applied pressure using both a deformable electrode material and substrate (Fig. 2d). Additionally, the height change of the electrode under vertical loading was measured, confirming that the resistance variations observed in previous experiments resulted from the deformation of the electrode’s cross-sectional area under applied mechanical force. Notably, this deformable electrode structure increased the cross-sectional area under vertical pressure, a crucial characteristic in capacitive sensor fabrication. When incorporated into tactile sensors, this property enhances sensor sensitivity under identical pressure conditions. Figure 3a presents the structure of the fabricated capacitive pressure sensor. The pressure sensor was designed to replicate the role of Merkel receptors, a type of human tactile mechanoreceptor specialized in detecting pressure and subtle surface details42–44. In this structure, two parallel electrodes are separated by a dielectric layer. The capacitance of the sensor is determined by the permittivity of free space between the parallel plates, the relative permittivity of the dielectric, the distance between the electrodes, and the overlapping area of each electrode.
When an external force compresses the sensor, the distance between the electrodes decreases, resulting in an increase in capacitance, thereby enabling tactile sensing. Recent studies have investigated the application of two-dimensional (2D) interdigitated capacitor (IDC) designs for tactile sensor integration45. However, such 2D capacitor structures present inherent limitations in sensor array scalability, including a low initial capacitance and reduced sensitivity (Figures S5 and S6). Therefore, in this study, a parallel-plate capacitor structure was adopted for the fabrication of soft tactile sensors.
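The parallel-plate sensing principle described above can be made concrete with the textbook relation C = ε₀εᵣA/d: compressing the gap d raises C. The sketch below uses illustrative dimensions (a 2 × 2 mm overlap, an assumed SEBS relative permittivity, and assumed gap values), not the fabricated sensor's actual geometry.

```python
# Sketch of the parallel-plate sensing principle: capacitance rises as
# pressure reduces the electrode gap. Dimensions are illustrative only.

EPS0 = 8.854e-12  # permittivity of free space, F/m

def capacitance(area_m2, gap_m, eps_r):
    """Ideal parallel-plate capacitance C = eps0 * eps_r * A / d."""
    return EPS0 * eps_r * area_m2 / gap_m

area = (2e-3) ** 2   # 2 x 2 mm electrode overlap
eps_r = 2.3          # assumed relative permittivity of the SEBS dielectric
c0 = capacitance(area, 50e-6, eps_r)   # unloaded gap (assumed 50 um)
c1 = capacitance(area, 40e-6, eps_r)   # gap compressed by applied pressure
print(f"dC/C0 = {(c1 - c0) / c0:.2f}")  # -> dC/C0 = 0.25
```

A 20% gap reduction yields a 25% relative capacitance increase, since ΔC/C₀ = d₀/d₁ − 1; this is the signal the readout electronics track.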

Fig. 3. Performance evaluation of the parallel-plate capacitive tactile sensor with an ISD layer.

Fig. 3

a Schematic of the parallel-plate tactile sensor structure. b Surface roughness measurement of the fabricated ISD layer. c Sensor performance variation based on the roughness of the ISD layer. d Performance evaluation of the soft tactile sensor based on additive printing cycles. e Detection of an ultralight load ( ~ 23 Pa) using the proposed soft tactile sensor, demonstrating its high sensitivity, thus mimicking human tactile perception. f Response and recovery time analysis of the tactile sensor. The sensor exhibits a response time of approximately 48 ms and a recovery time of approximately 204 ms, influenced by the high surface energy of the soft elastomer substrate, with a sampling rate of 60 Hz. g Evaluation of the sensor’s ability to detect pressure variations. The soft tactile sensor accurately distinguishes quantitative weight differences, similar to human tactile sensation. h Evaluation of the sensor’s durability under cyclic loading, showing stable performance after 5000 loading–unloading cycles under an applied pressure of 6.52 kPa

The proposed parallel-plate capacitive soft tactile sensor incorporates a microstructured dielectric layer rather than a uniform dielectric layer to achieve high sensitivity and a fast response time46–48. The fabricated microstructured dielectric layer was developed using sandpaper with an irregular surface, as opposed to conventional fabrication techniques such as 3D printing or semiconductor lithography. This approach enables a cost-effective and batch-process-compatible fabrication method49. Figure 3b and Figure S7 present the roughness (Ra) measurement results of the irregular surface dielectric (ISD) layer formed based on the sandpaper grit size. The roughness of the ISD is closely related to the sensor’s sensitivity (Fig. 3c). In this study, ISD 3 was selected as the final dielectric layer, as it exhibited the lowest roughness and demonstrated an excellent sensitivity of 0.3458 kPa−1 within a low-pressure range of 0–0.8 kPa.
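Sensitivity values such as the 0.3458 kPa⁻¹ quoted above are conventionally obtained as the slope of the relative capacitance change versus pressure in the linear low-pressure regime. The sketch below shows one common way to extract that slope by ordinary least squares; the data points are synthetic, not the paper's measurements.

```python
# Sketch: estimating sensitivity S = d(dC/C0)/dP from pressure-capacitance
# data via a least-squares line fit over the low-pressure range, as is
# commonly done for curves like Fig. 3c. Data points are synthetic.

def fit_sensitivity(pressures_kpa, rel_cap_change):
    """Slope of dC/C0 vs pressure (kPa^-1) by ordinary least squares."""
    n = len(pressures_kpa)
    mx = sum(pressures_kpa) / n
    my = sum(rel_cap_change) / n
    num = sum((x - mx) * (y - my)
              for x, y in zip(pressures_kpa, rel_cap_change))
    den = sum((x - mx) ** 2 for x in pressures_kpa)
    return num / den

# synthetic low-pressure data with a slope of ~0.35 kPa^-1
p = [0.0, 0.2, 0.4, 0.6, 0.8]
dc = [0.35 * x for x in p]
print(f"S = {fit_sensitivity(p, dc):.3f} kPa^-1")  # -> S = 0.350 kPa^-1
```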

Figure 3d and Figure S8 show the capacitance variation of the soft tactile sensor in response to applied pressure. The electrodes of the tactile sensor were classified based on the number of additive-printing cycles. The maximum measurable pressures for electrodes printed once, twice, and three times were 350, 450, and 600 kPa, respectively. These results indicate that increasing electrode height expands the range of cross-sectional area variation under pressure. The primary objective of this study was to develop a tactile sensor suitable for humanoid robots capable of replicating human activities. Accordingly, a single-printed electrode was selected, as it exhibited the most stable response and the lowest hysteresis within approximately 300 Pa, which corresponds to the typical pressure range perceived by human fingertips50. Experimental results shown in Figures S9 and S10 indicate that the deformable soft sensor with single-printed electrodes demonstrated approximately a 130% improvement in performance compared with a conventional flexible sensor with a parallel-plate capacitor structure under the same load conditions. These findings highlight additive printing technology as a powerful tool for fabricating tactile sensors tailored to various humanoid robot applications with varying pressure requirements.

The ability of the fabricated soft tactile sensor to replicate tactile perception was experimentally validated. To achieve human-level tactile sensitivity, a sensor must be capable of detecting minute loads. The proposed tactile sensor successfully detected an ultralight load of approximately 23 Pa (Fig. 3e). The distinctive properties of the sensor, resulting from the integration of a microstructured dielectric layer and a deformable electrode, were further analyzed. The experimental results for response and recovery times are presented in Fig. 3f. These measurements were acquired using a sampling rate of 60 Hz to ensure accurate temporal resolution. When external pressure was applied, the relative capacitance increased from its minimum to maximum value within approximately 48 ms. Conversely, upon pressure release, the recovery time was measured to be approximately 204 ms, which was slightly longer than the response time; this was attributed to the high surface energy of the soft elastomer substrate. Furthermore, the practical applicability of the fabricated sensor was assessed through contact detection performance tests. Initially, the sensor’s ability to detect pressure variations corresponding to quantitative weight changes was evaluated. The proposed soft tactile sensor effectively distinguished quantitative weight differences, mimicking the way humans perceive variations in contact force through tactile sensation (Fig. 3g). Finally, the response stability of the sensor was evaluated under 5000 loading cycles with an applied pressure of 6.52 kPa (Fig. 3h). The sensor maintained stable performance throughout the test, and cross-sectional inspection before and after cycling confirmed that no electrode detachment from the substrate occurred (Figure S13).
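Response and recovery times like those in Fig. 3f are typically read off a sampled capacitance trace as the interval between threshold crossings; at the 60 Hz sampling rate used here, each sample spans about 16.7 ms. The sketch below computes a 10–90% rise time on a synthetic trace; the trace values and thresholds are illustrative assumptions, not the authors' analysis pipeline.

```python
# Sketch: extracting response time from a capacitance trace sampled at
# 60 Hz, defined here as the 10-90% rise interval. Trace is synthetic.

def rise_time_ms(samples, fs_hz=60.0, lo=0.1, hi=0.9):
    """Time from crossing 10% to crossing 90% of the full signal swing."""
    c_min, c_max = min(samples), max(samples)
    span = c_max - c_min
    t_lo = next(i for i, c in enumerate(samples) if c >= c_min + lo * span)
    t_hi = next(i for i, c in enumerate(samples) if c >= c_min + hi * span)
    return (t_hi - t_lo) * 1000.0 / fs_hz

trace = [0.0, 0.0, 0.2, 0.6, 0.9, 1.0, 1.0]  # one press, normalized
print(f"response ~ {rise_time_ms(trace):.1f} ms")
```

The same function applied to the time-reversed release portion of a trace would give the recovery time, which the paper finds to be longer (≈204 ms vs. ≈48 ms) owing to the substrate's surface energy.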

Figure 4a illustrates the structure of the expanded soft tactile sensor array. The sensor array was designed in a passive matrix configuration, with 100-pixel electrodes arranged on both the top and bottom layers, separated by a microstructured dielectric layer. Pressure data collected from each pixel electrode were analyzed sequentially using a multiplexing technique (Figure S11). The soft tactile sensor array comprises 13.5 electrodes per cm² (Fig. 4b), which, although lower than the density of tactile receptors in the human fingertip, enables not only basic contact detection but also shape recognition through its electrode distribution, functionally resembling the spatial arrangement of human tactile receptors51–53. Furthermore, the sensor array exhibited a tensile strength of approximately 7.06 MPa, which is comparable to that of human finger skin (4.6–20 MPa)54,55. This similarity enables localized deformation, rendering the sensor well-suited for precise tactile-sensing applications. The soft tactile sensor array, fabricated using an inkjet printing process, demonstrates scalability in both large-area fabrication and miniaturization (Fig. 4c). This adaptability makes it suitable for a range of applications, including palm-sized (66.4 × 66.4 mm, 13.5 sensors/cm²) and finger-sized (9.45 × 9.45 mm, 71.7 sensors/cm²) implementations. Figure 4d presents a conceptual diagram of the ideal shape visualization when a ring-shaped structure, smaller than the sensor, is placed on its surface. Theoretically, the developed soft tactile sensor should enable precise shape detection of objects measuring several tens of millimeters. Its mechanical response was evaluated by performing finite element analysis (FEA) in COMSOL Multiphysics; the stress distribution of the developed soft tactile sensor substrate was compared with that of an existing flexible tactile sensor substrate based on their respective Young’s moduli (Fig. 4e).
The results indicate that, in contrast to the conventional flexible sensor, the soft tactile sensor exhibits localized deformation, thereby facilitating fine shape detection similar to that of human skin.
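The passive-matrix readout described above addresses one row/column pair at a time and assembles the per-pixel capacitance readings into a pressure frame. The sketch below mocks the hardware measurement with a dummy function (the real system reads an LCR meter through a multiplexer, per Figure S11); only the scanning structure is the point.

```python
# Sketch of passive-matrix multiplexed readout: each of the 10x10 pixels
# is addressed by selecting one row and one column at a time, yielding a
# pressure map. Hardware access is mocked with a dummy reading.

ROWS, COLS = 10, 10

def read_pixel(row, col):
    # Placeholder for the capacitance measurement at (row, col); returns
    # a dummy value so the scan loop is runnable (one contact at (4, 7)).
    return 1.0 if (row, col) == (4, 7) else 0.0

def scan_array():
    """Sequentially multiplex through all row/column pairs."""
    return [[read_pixel(r, c) for c in range(COLS)] for r in range(ROWS)]

frame = scan_array()
peak = max((v, r, c) for r, row in enumerate(frame)
           for c, v in enumerate(row))
print(f"contact at row {peak[1]}, col {peak[2]}")  # -> row 4, col 7
```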

Fig. 4. Structural design, scalability, and shape-sensing capabilities of the expanded soft tactile sensor array.

Fig. 4

a Structural design of the expanded soft tactile sensor array. b Electrode density and skin-like mechanical properties. c Scalability and adaptability of the soft tactile sensor array. d Theoretical shape visualization of a ring-shaped structure. e Stress distribution simulation based on substrate materials. f Comparison of average pressure data between contact and noncontact regions when a ring-shaped structure is placed on the sensor. The developed soft tactile sensor exhibits a significantly greater pressure difference than a conventional flexible sensor, demonstrating reduced noise propagation and improved fine shape detection

Based on the simulation results, the developed soft tactile sensor array was employed for shape-sensing visualization, demonstrating its applicability in micromanipulation. Figure 4f presents a graph depicting the average pressure data measured in both the contact and noncontact regions when a ring-shaped structure was placed on the sensor. The developed soft sensor exhibits less stress dispersion upon contact, resulting in significantly reduced noise propagation. As a result, it shows a much greater pressure difference between areas where the target structure makes contact and where it does not, thereby improving shape detection performance compared to a conventional flexible sensor. This reduced noise propagation is crucial for accurately detecting fine structures, such as ring-shaped geometries. Furthermore, Figure S12 demonstrates that the sensor can capture not only ring structures but also other complex geometries. These findings suggest that soft material-based high-precision shape-sensing tactile sensors can significantly enhance micromanipulation capabilities, particularly when integrated into humanoid robots.
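The contact/noncontact comparison in Fig. 4f reduces to a simple contrast statistic: the difference between the mean pressure inside and outside the contact footprint. The sketch below computes that statistic on a toy 5 × 5 ring-like frame; the values and mask are synthetic, not measured data.

```python
# Sketch: quantifying shape-detection contrast as the difference between
# mean pressure in contact vs noncontact regions. Frame data are synthetic.

def region_means(frame, mask):
    """Mean value over masked (contact) and unmasked (noncontact) pixels."""
    contact, noncontact = [], []
    for row_f, row_m in zip(frame, mask):
        for v, m in zip(row_f, row_m):
            (contact if m else noncontact).append(v)
    return sum(contact) / len(contact), sum(noncontact) / len(noncontact)

frame = [[0, 1, 1, 1, 0],
         [1, 9, 8, 9, 1],
         [1, 8, 0, 8, 1],
         [1, 9, 8, 9, 1],
         [0, 1, 1, 1, 0]]
mask = [[0, 0, 0, 0, 0],   # 1 marks pixels under the ring-shaped contact
        [0, 1, 1, 1, 0],
        [0, 1, 0, 1, 0],
        [0, 1, 1, 1, 0],
        [0, 0, 0, 0, 0]]
c_mean, n_mean = region_means(frame, mask)
print(f"contact/noncontact contrast = {c_mean - n_mean:.2f}")
```

A larger contrast means less stress dispersion into neighboring pixels, which is exactly the advantage the soft substrate shows over the conventional flexible one.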

Visual perception for millimeter-sized objects with irregular surfaces

The object-shape reconstruction capability of the RGB-Depth camera was then evaluated, thereby realizing the performance assessment of the vision-based part of the system. The vision system comprised an RGB camera and a stereo depth sensor equipped with an infrared (IR) projector. The evaluation experiment was performed by maintaining a fixed distance of 30 cm between the object and camera, reflecting the arm length of the robot while varying the illumination conditions. When only the RGB camera was utilized, object position information could be acquired; however, distinguishing height variations on the object’s surface proved challenging. Moreover, under low illuminance conditions below 10 lx, the Mean Average Precision at Intersection over Union (IoU) 0.5 (mAP50) decreased to 0.706, whereas under normal lighting above 100 lx, the model maintained a highly stable performance with an mAP50 of approximately 0.995. This degradation is a typical phenomenon observed in RGB-based vision systems, as insufficient lighting makes it difficult to capture reliable color, contrast, and contour information of the target object. Here, mAP50 is a standard evaluation metric for object detection models, measuring whether the predicted bounding box achieves at least 50% IoU with the ground truth. Thus, the reduced mAP50 in low-light conditions indicates that the model struggles to accurately estimate the object’s position and outline, demonstrating that RGB-based vision alone becomes less reliable for object recognition when illumination is unstable (Fig. 5a). This limitation can be partially mitigated through the incorporation of a depth module. The depth module, equipped with an IR projector, facilitated the recognition of both the overall shape and position of the object, irrespective of illumination conditions (Fig. 5b).
However, the depth module primarily provides macroscopic depth information, such as object boundaries; it exhibits limitations in capturing fine surface details within the object. By contrast, the developed soft tactile sensor effectively detected fine surface details, even under varying environmental conditions, by directly interacting with the target object. This tactile sensing capability complements the limitations of the vision system, enabling the acquisition of surface information that visual input alone could not provide (Fig. 5c).
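The IoU threshold underlying the mAP50 metric discussed above is a straightforward geometric computation on bounding boxes. The sketch below implements it for axis-aligned (x1, y1, x2, y2) boxes; the example boxes are illustrative.

```python
# Sketch: the IoU computation behind the mAP50 metric. A detection counts
# as correct at mAP50 when IoU >= 0.5 with the ground-truth box.

def iou(box_a, box_b):
    """Intersection over Union for (x1, y1, x2, y2) axis-aligned boxes."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

gt = (10, 10, 50, 50)    # ground-truth box
pred = (20, 20, 60, 60)  # predicted box, shifted by the detector
print(f"IoU = {iou(gt, pred):.3f}")  # -> IoU = 0.391
```

In this example the shifted prediction falls below the 0.5 threshold and would count as a miss at mAP50, which is how low-light localization errors translate into the lower 0.706 score reported above.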

Fig. 5. Experimental evaluation of object shape reconstruction using RGB-Depth camera under varying illumination conditions.

Fig. 5

a RGB camera-based object recognition showing degraded performance under low illumination. b Depth module-assisted shape detection. c Tactile sensor-based fine surface detection

Evaluation of complementary visual and tactile sensing for object recognition

The performance of the tactile sensor, as a means to complement or substitute the vision system, was evaluated through experiments assessing its ability to detect fine surface structures. Figure 6a illustrates the conceptual framework for integrating humanlike vision–tactile perception into a robotic system in a complementary manner. The robot initially acquires spatial and positional information about the target object from a distance using an RGB-Depth camera.

Fig. 6. Complementary vision-tactile sensing for object recognition.

Fig. 6

a Conceptual framework of sequential vision-tactile perception. Shape and surface detection for b a single protrusion, c multiple protrusions of the same height, and d multiple protrusions of different heights

In situations where visual perception is constrained by environmental factors, fine surface features are acquired through direct contact with the object via the soft tactile sensor. An experiment was conducted to examine the impact of a complementary vision–tactile sensing approach on recognizing objects with complex surface structures for humanoid robotics applications. The detection of a single object with microscopic surface protrusions (4.8 × 4.8 × 3 mm in width, length, and height) was independently investigated using the vision and tactile systems (Fig. 6b). When only the RGB camera was employed, the system determined the presence and position of the object based on pre-trained data within the camera’s field of view. Moreover, scanning the object’s surface with the tactile system verified the presence of surface protrusions and precisely captured height variations. The developed tactile system successfully detected not only a single protrusion but also multiple protrusions of varying sizes (2 × 2 mm, 4.8 × 4.8 mm, and 7.6 × 7.6 mm in width and length) simultaneously (Fig. 6c). Furthermore, the tactile system effectively identified height variations in surface protrusions of different heights (2 mm, 3 mm, and 4 mm) (Fig. 6d). These results indicate that vision and tactile sensors serve complementary functions. While each modality has its strengths, the tactile sensor was particularly effective in capturing surface shape information that is challenging to obtain through vision, suggesting its potential to support shape perception under visually constrained conditions. Finally, this study evaluated the potential of the tactile sensor to substitute for vision under visually constrained conditions. The experiment was designed to mimic the sequential perception process of humans, in which visual sensing is used first to recognize the object and estimate its position.
Then, under conditions where visual information becomes limited due to environmental changes, the tactile sensor was employed to assess whether it could compensate for the information that is difficult to obtain visually.

Figure 7a presents the results of the object recognition experiment conducted using a vision sensor. In this experiment, the YOLOv11 algorithm56 was employed for object detection in RGB images, with training data collected using an RGB-Depth camera. Various color correction and data augmentation techniques were applied to enhance the sensor’s performance under different environmental conditions. The trained vision system successfully measured the shape of a tablet blister pack (6.6 × 13 cm in width and length) and accurately recognized objects within the typical operational range of the robotic arm (~400 mm)57. However, even under bright conditions (above 1200 lx), object detection was hindered by focusing issues at close range and by occlusion. Additionally, detection failed entirely under very low-light conditions (below 50 lx). These limitations emphasize the need for an alternative sensing modality, suggesting that tactile sensing may not only serve as a complementary input but also as a substitute when visual perception is unreliable or fails.
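The vision-to-tactile handoff implied above can be sketched as a simple decision rule: if the detector returns no confident detection (as happens under low light, occlusion, or close-range defocus), the system falls back to tactile scanning. The sketch below mocks detections as (label, confidence) tuples; the YOLOv11 inference call itself is omitted, and the threshold value is an assumption, not the authors' parameter.

```python
# Sketch of the sensory-substitution fallback: use vision when a confident
# detection exists, otherwise hand off to tactile scanning. Detections are
# mocked (label, confidence) tuples; the detector call is omitted.

CONF_THRESHOLD = 0.5  # assumed confidence cutoff (hypothetical)

def choose_modality(detections, conf_threshold=CONF_THRESHOLD):
    """Return ('vision', best detection) or ('tactile', None) as fallback."""
    confident = [d for d in detections if d[1] >= conf_threshold]
    if confident:
        return "vision", max(confident, key=lambda d: d[1])
    return "tactile", None

# bright scene: reliable detections; dark scene: only low-confidence guesses
print(choose_modality([("blister_pack", 0.93), ("pill", 0.88)]))
print(choose_modality([("blister_pack", 0.21)]))
```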

Fig. 7. Evaluation of sensory substitution via tactile scanning under visually constrained conditions.

Fig. 7

a Object recognition experiment using a vision sensor with an RGB-Depth camera and YOLOv11-based detection under various environmental conditions. b Surface texture and geometric feature detection using a soft tactile sensor with a tactile scanning technique

Visual sensing limitations were addressed by investigating the potential of tactile sensing in conditions where visual sensing was infeasible. Figure 7b illustrates the experimental setup used to recognize surface textures and geometric features at the millimeter scale using a soft tactile sensor. As the tactile sensor was smaller than the target object, simultaneous scanning of the entire surface was not possible. Hence, a tactile scanning technique was implemented, segmenting the object’s surface into multiple regions and integrating the scanned data to reconstruct a complete surface profile. Relative surface height variations were introduced by randomly removing tablets from the tablet blister pack. The final visualization confirmed that the system successfully reproduced fine surface features at the millimeter level, accurately capturing the shapes of individual pills (7 × 18 mm in width and length). These results suggest that in situations where visual perception is restricted, tactile sensing can identify the shape and intricate surface patterns of objects through direct contact. This capability not only complements visual information but may also serve as a viable substitute for vision. This function extends beyond that of a simple auxiliary modality and suggests the possibility of incorporating a sensory substitution mechanism.
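The tactile scanning technique described above amounts to tiling: pressure maps acquired at known sensor offsets are placed into a larger grid to reconstruct the full surface profile. The sketch below stitches row-major tiles into one map; tile sizes and contents are synthetic, and real scans would additionally need offset registration and overlap handling.

```python
# Sketch of tactile scanning: the sensor is smaller than the object, so
# per-region pressure tiles acquired at known offsets are stitched into
# one surface map. Tile contents are synthetic.

def stitch(tiles, tile_h, tile_w, grid_rows, grid_cols):
    """Assemble row-major tiles into one (grid_rows*tile_h) x
    (grid_cols*tile_w) map."""
    full = [[0.0] * (grid_cols * tile_w)
            for _ in range(grid_rows * tile_h)]
    for idx, tile in enumerate(tiles):
        r0 = (idx // grid_cols) * tile_h  # tile's row offset in the map
        c0 = (idx % grid_cols) * tile_w   # tile's column offset
        for r in range(tile_h):
            for c in range(tile_w):
                full[r0 + r][c0 + c] = tile[r][c]
    return full

# four 2x2 tiles scanned over a 2x2 grid -> 4x4 surface map
tiles = [[[1, 1], [1, 1]], [[0, 0], [0, 0]],
         [[0, 0], [0, 0]], [[2, 2], [2, 2]]]
surface = stitch(tiles, 2, 2, 2, 2)
print(surface[0][0], surface[3][3])  # -> 1 2
```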

Conclusion

This study proposed a complementary visual localization and tactile mapping approach that enables stable object perception across diverse environments, ultimately providing a technical foundation for precise robot manipulation. The proposed method utilizes a single RGB-Depth camera to gather spatial and positional information about the target object. In visually constrained conditions due to environmental changes, it also incorporates a soft, inkjet-printed tactile sensor as a complementary modality. Subsequently, the developed tactile scanning system successfully mapped the shape and millimeter-scale surface height variations of the object. These results suggest that the tactile sensor not only complements visual information but can also serve as an alternative sensing modality when visual perception is limited.

The findings of this study provide a foundation for designing sensory substitution mechanisms that enhance the autonomy and adaptability of robots in unstructured environments through a simple system inspired by the human sequential sensory perception process, ultimately enabling reliable perception and control for precise robot manipulation. Future work will focus on developing algorithms that integrate sequential visual and tactile inputs in a complementary manner to improve real-time applicability and extend system performance under diverse environmental and object conditions. Additionally, efforts will be directed toward improving the resolution of the tactile sensor to further increase the precision of manipulation tasks.

Methods

Fabrication of pressure sensor

The soft substrate was prepared by drop-casting a solution of styrene–ethylene–butylene–styrene (SEBS) H1062 (Asahi Kasei, Japan) and toluene (1:10) onto a trichlorosilane (99%, Sigma-Aldrich)-treated glass surface. Subsequently, silver (Ag 50–90%, benzyl alcohol 1–50%, NanoPaint) electrodes were inkjet-printed onto the SEBS H1062-coated glass using a Nova (Voltera) printer. The inkjet printing parameters were optimized prior to sensor fabrication. The nozzle diameter was selected to match the designed line width, and the printing speed was adjusted to obtain continuous and uniform line formation. The printing height, defined as the distance between the nozzle and the substrate, was subsequently tuned to ensure stable deposition without line discontinuity. The final parameter set was determined by comparing the printed patterns with the designed geometries to ensure high fidelity and stable electrode formation (Figure S14). The printed electrodes were then cured at 120 °C for 20 min to remove the solvent, resulting in the formation of solid Ag electrodes. An ISD thin film was fabricated by spin-coating a SEBS H1062/toluene solution (1:10 ratio, identical to the substrate) onto a trichlorosilane-treated irregularly structured replica mold. The coated film was subsequently heated at 120 °C for 1 min to eliminate any residual solvent. The structured layer, composed of SEBS H1062, exhibited elasticity across a broad temperature range. Adhesion was enhanced by spin-coating a layer of sticky and highly deformable ISD SEBS H1221 under identical conditions. Due to the extreme thinness of the ISD layer, a water-soluble tape was applied to facilitate its transfer onto the SEBS substrate with Ag electrodes.

Electrical & physical characterization

Pressure was applied by simultaneously connecting and operating a force gauge (Series 7, Mark-10) and a motorized stage (ESM303, Mark-10). Capacitance measurements of the tactile sensor under applied pressure were conducted using an LCR meter (E4980A, Keysight) at 1 V and 2 MHz, with data acquisition via a LabVIEW program (National Instruments). Electrical resistance variations due to electrode and substrate deformation were measured by applying 1 V DC using a source meter (Model 2400, Keithley). The surface roughness (Ra) of the ISD was measured using a DektakXT profilometer (DXT-A). The equation for roughness measurement is as follows:

$R_a = \frac{1}{L}\int_{0}^{L} \lvert r(x) \rvert \, dx$

where $r(x)$ is the profile height deviation from the mean line and $L$ is the evaluation length.
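The arithmetic-mean roughness defined above can be checked numerically by discretizing the integral over a sampled profile. The sinusoidal profile below is synthetic and purely illustrative:

```python
import numpy as np

# Discretize Ra = (1/L) * integral_0^L |r(x)| dx over a sampled profile
L = 1.0                                    # evaluation length (arbitrary units)
x = np.linspace(0.0, L, 10001)
r = 2.0 * np.sin(2 * np.pi * 5 * x / L)    # mean-line deviation, amplitude 2

# Trapezoidal rule on |r(x)|, then divide by the evaluation length
dx = x[1] - x[0]
Ra = np.sum((np.abs(r[:-1]) + np.abs(r[1:])) * dx / 2) / L
print(round(Ra, 3))  # 1.273, i.e. 2*A/pi for a sinusoid of amplitude A = 2
```

For a pure sinusoid the closed-form value is $2A/\pi$, which the discretized integral reproduces to three decimal places.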

Stress distribution was simulated using COMSOL Multiphysics (v6.2) with a Solid Mechanics module. Material properties, such as Young’s modulus, were defined, and a pressure load of 0.216 N was applied. The von Mises stress was analyzed to evaluate mechanical performance under fixed boundary conditions. Surface and slice plots were used to visualize the stress distribution in the model.

System configuration

A Z-axis conductive tape (ACF, 3M) was affixed to the contact pad of the tactile sensor array, and a flexible flat cable was attached to connect the sensor. A multiplexer (CD74HC4067, SparkFun) was employed to sequentially measure pressure from each electrode in the sensor array. The acquired pressure data were transmitted to a microcontroller unit (MCU) (Arduino Uno) via serial communication, where they were converted into digital values. The processed data were visualized in real time using MATLAB plots, reshaped into a 10 × 10 matrix, and displayed using an interpolated color map. To establish a stable baseline after sensor integration, the initial capacitance was calibrated in the attached state. The MCU first collected a 10 × 10 set of readings, discarded the first nine points to remove transients, and stored the tenth value as the baseline C0. Subsequent measurements were processed in real time as ΔC/C0. This procedure ensured stable reference values even when the sensor was mounted on slightly curved surfaces. RGB and depth images were captured using an RGB-Depth camera (Intel RealSense D455).
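The baseline calibration and ΔC/C0 normalization described above can be sketched as follows. The function names and the simulated readout are illustrative assumptions, not the authors' firmware (which runs on the MCU with MATLAB-side visualization):

```python
import numpy as np

def calibrate_baseline(read_frame, n_discard=9):
    """Discard the first transient readings and keep the next
    frame as the baseline C0, as described in the text."""
    for _ in range(n_discard):
        read_frame()            # settle: throw away transient frames
    return read_frame()         # the tenth reading becomes C0

def to_relative(frame, c0):
    """Convert raw capacitances to the normalized response dC/C0."""
    return (frame - c0) / c0

# Simulated 10 x 10 sensor frames: 10 pF baseline on every taxel
def fake_read_frame():
    return np.full((10, 10), 10.0)

c0 = calibrate_baseline(fake_read_frame)
pressed = fake_read_frame()
pressed[4, 7] = 12.0            # a touch raises one taxel's capacitance
dC = to_relative(pressed, c0)
print(dC[4, 7])  # 0.2
```

Normalizing by a per-taxel baseline in this way cancels mounting-induced offsets, which is why the reference stays stable on slightly curved surfaces.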

Object detection using YOLO algorithm

The YOLOv11 algorithm was employed to detect objects in the RGB images, with the training data collected using an RGB-Depth camera. A customized training pipeline was designed to enhance the robustness and performance of the model. The model was trained for 100 epochs with a batch size of 16 and an input image size of 224 × 224. Generalization and robustness to variations in the data were improved by applying several augmentation techniques: hue shift (hsv_h = 0.015), saturation gain (hsv_s = 0.7), and value gain (hsv_v = 0.4) to account for color variations, along with vertical and horizontal flipping (probability = 0.5 each) to introduce spatial diversity. Mosaic augmentation was used to create more complex training samples by combining multiple images, and random rotations within 30° further diversified the dataset. These augmentations, combined with the optimized hyperparameters, were designed to maximize the detection performance of the model for the given dataset.
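The hyperparameters listed above map directly onto the Ultralytics training interface. A hedged sketch follows; the dataset configuration file and weights file names are placeholders, not the authors' actual files:

```python
# Training hyperparameters from the text, collected as keyword
# arguments for the Ultralytics trainer (dataset path is a placeholder).
train_args = dict(
    data="blister_pack.yaml",  # placeholder dataset config
    epochs=100,
    batch=16,
    imgsz=224,
    hsv_h=0.015,   # hue shift
    hsv_s=0.7,     # saturation gain
    hsv_v=0.4,     # value gain
    flipud=0.5,    # vertical flip probability
    fliplr=0.5,    # horizontal flip probability
    degrees=30,    # random rotation range
    mosaic=1.0,    # mosaic augmentation
)

# With the ultralytics package installed, training would be launched
# roughly as follows (commented out, since it downloads weights):
# from ultralytics import YOLO
# model = YOLO("yolo11n.pt")
# model.train(**train_args)
print(train_args["imgsz"])  # 224
```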

Acknowledgements

This work was initiated by the Samsung Electronics Future Technology Development Center (Grant No. SRFC-TD2103-01). This study was supported by the National Research Foundation Grant funded by the Korean government (MSIT) (No. RS-2024-00338772, No. RS-2024-00461583, No. RS-2024-00411007). This research was also supported by a faculty research grant of Yonsei University College of Medicine (6-2024-0023, 6-2025-0023).

Data availability

The data that support the findings of this study are available from the corresponding author upon reasonable request.

Conflict of interest

The authors declare no competing interests.

Footnotes

These authors contributed equally: Jaehwan Jang, Byeong-Sun Park

Supplementary information

The online version contains supplementary material available at 10.1038/s41378-026-01190-8.

