Data in Brief. 2020 Feb 26;30:105335. doi: 10.1016/j.dib.2020.105335

Specifications Table

Subject: Artificial Intelligence
Specific subject area: 3D shape modeling and completion
Type of data: Matrices of 3D point coordinates as .mat files, and object scans as .obj files
How data were acquired: The data were acquired using two robot setups. The first robot is composed of a 6-degree-of-freedom KUKA arm, a three-finger Schunk Dextrous Hand (7 degrees of freedom) equipped with tactile sensing arrays, and a Kinect stereo vision camera. The second robot is a PR2. ROS was used to program robot motions, handle communication, and record data. The data were acquired by letting the robot hands touch the experiment objects at predefined locations and recording tactile and visual measurements from the finger-mounted tactile sensors and the Kinect cameras as 3D point clouds, which were later mapped to a common reference frame based on camera calibration and registration to the initial frame. We also provide 3D scans of the objects, so that shape approximations from real sensory data can be compared to ground truth.
Data format: Raw
Parameters for data collection: The explorative touch locations were discretized given a fixed object pose, i.e., a fixed number of approach directions and heights was used to touch the objects.
Description of data collection: The robots touch the experiment objects in a predefined manner at various exploration configurations and gather visual and tactile points in the same coordinate frame, based on calibration between the robots and the cameras and on registration to the initial frame.
Data source location: KTH Royal Institute of Technology, Stockholm, Sweden
Data accessibility:
Repository name: Visual and Tactile 3D Point Cloud Data from Real Robots for Shape Modeling and Completion
Data identification number: DOI: https://doi.org/10.17632/ztkctgvgw6.1
Direct URL to data: https://data.mendeley.com/datasets/ztkctgvgw6
Related research article: G. Zarzar Gandler, C. H. Ek, M. Björkman, R. Stolkin, Y. Bekiroglu, Object shape estimation and modeling, based on sparse Gaussian process implicit surfaces, combining visual data and tactile exploration, Robotics and Autonomous Systems, 2020, doi: https://doi.org/10.1016/j.robot.2020.103433.
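The table lists two file types: .mat matrices of 3D point coordinates and .obj object scans. As a minimal sketch of how such files might be read, the snippet below shows a parser for the vertex lines of a Wavefront .obj scan; the commented `loadmat` call and the variable name `"points"` are assumptions, since the dataset's actual .mat field names are not specified here.

```python
import numpy as np
# For the .mat point matrices one would typically use SciPy, e.g.:
#   from scipy.io import loadmat
#   pts = loadmat("object.mat")["points"]   # "points" is a hypothetical key
# (loadmat handles MATLAB v5/v7 files; v7.3 files need an HDF5 reader instead.)

def load_obj_vertices(lines):
    """Parse the vertex ('v x y z') lines of a Wavefront .obj file
    into an (N, 3) NumPy array; other records (faces, normals) are skipped."""
    verts = [list(map(float, ln.split()[1:4]))
             for ln in lines if ln.startswith("v ")]
    return np.asarray(verts)

# Tiny inline example standing in for a real scan file:
sample = ["v 0.0 0.0 0.0", "v 1.0 0.0 0.0", "f 1 2 1"]
pts = load_obj_vertices(sample)
print(pts.shape)  # (2, 3)
```

In practice one would pass `open("scan.obj")` as `lines`; the inline list is only for illustration.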
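The table states that tactile and visual point clouds were mapped to a common reference frame via calibration and registration. The mechanics of such a mapping can be sketched with a 4x4 homogeneous transform; the transform values below are a hypothetical calibration result, not taken from the dataset.

```python
import numpy as np

def to_common_frame(points, T):
    """Map an (N, 3) point cloud into a common reference frame
    using a 4x4 homogeneous transform T (rotation + translation)."""
    homo = np.hstack([points, np.ones((len(points), 1))])  # (N, 4) homogeneous coords
    return (T @ homo.T).T[:, :3]                           # back to (N, 3)

# Example: a pure translation of +0.1 m along x, standing in for a
# camera-to-robot calibration.
T = np.eye(4)
T[0, 3] = 0.1
pts = np.array([[0.0, 0.0, 0.0], [0.2, 0.1, 0.05]])
print(to_common_frame(pts, T))
```

With per-sensor transforms of this form, tactile and Kinect points can be expressed in one frame before fusion, which is the precondition for the shape-completion methods in the related article.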