*(Animated previews of the four benchmark environments: TactileMNIST-v0, Starstruck-v0, Toolbox-v0, and TactileMNISTVolume-v0.)*
Tactile MNIST is a benchmark for learning-based active perception algorithms. It introduces four simulated tactile perception tasks, ranging from classification and counting to pose and volume estimation. Each task comes with a unique set of challenges, so Tactile MNIST requires adaptive algorithms and clever exploration strategies. The aim of Tactile MNIST is to provide an extensible framework for a fair comparison of active tactile perception methods.
Tactile MNIST implements each task as an ap_gym environment and is thus straightforward to set up and use. In each task, the agent controls a single simulated GelSight Mini above a platform holding task-specific objects. The agent's objective is to predict some property of the objects it is exploring, such as their class, count, or pose.
In addition to the simulated benchmark tasks, this package provides access to a large dataset of real tactile images collected from 3D printed MNIST digits and a couple of synthetic datasets.
Further details can be found on our project page, which also links to the paper.
Install Tactile MNIST via pip:

```bash
pip install tactile-mnist[OPTIONS]
```

where `OPTIONS` can be any number of the following (comma separated):
- `examples`: installs dependencies for the examples.
- `torch`, `jax`, and `jax-cpu`: install dependencies for the interactive Tactile MNIST environment with PyTorch, JAX, or JAX without CUDA support, respectively.

Without any of these options, the interactive environment cannot be used, but the static datasets will still work.
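For example, to install the example dependencies together with PyTorch support (quoting the argument so shells like zsh do not interpret the brackets):

```bash
pip install "tactile-mnist[examples,torch]"
```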
If you encounter problems during the installation or execution, check our troubleshooting section.
This package provides ap_gym environments for four simulated active tactile perception benchmark tasks and access to the Tactile MNIST datasets. The ap_gym environments can be used to train and evaluate agents on active tactile perception problems on simulated data and are described further in the Benchmark section. The Tactile MNIST datasets comprise two datasets of 3D CAD models, MNIST 3D and Starstruck, as well as several datasets of simulated and real tactile images. This package provides an easy way of loading and working with these datasets, as described further in the Datasets section.
This package provides ap_gym environments for four active tactile perception environments: TactileMNIST, Starstruck, Toolbox, and TactileMNISTVolume. In all environments, the agent must solve a perception problem by actively controlling a GelSight Mini tactile sensor in a simulated environment.
The TactileMNIST environment challenges the agent to find and classify a 3D MNIST model as quickly as possible. Aside from finding the object, the main challenge in the TactileMNIST environment is to learn contour-following strategies to classify the object efficiently once it has been found.
In the Starstruck environment, the agent must count the number of stars in a scene cluttered with other objects. Since all stars look alike, distinguishing them from the other objects is rather straightforward. Instead, the main challenge in this environment is to learn an effective search strategy that systematically covers as much space as possible.
The Toolbox environment challenges the agent to locate a wrench placed randomly on a platform and to estimate its precise 2D position and 1D orientation. Unlike the previous classification tasks, Toolbox poses a regression problem that requires combining multiple touch observations to resolve ambiguities inherent in the wrench's shape. For example, touching the handle may reveal the wrench's lateral placement but not its longitudinal position or orientation, so the agent must explore strategically and seek out one of the wrench's ends to determine its pose accurately. Overall, Toolbox tests the agent's ability to both find and precisely localize an object through sequential tactile exploration.
Finally, the TactileMNISTVolume environment poses another regression problem: here, the agent must determine the exact volume of the 3D MNIST model it is given. Unlike in the TactileMNIST environment, where a couple of touches might already suffice for classification, in TactileMNISTVolume the agent has to explore the entire object.
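To illustrate the quantity being estimated (not the benchmark's internal ground-truth computation, which is not shown here), the volume of a watertight mesh can be computed with a library such as trimesh; the file path below is a placeholder:

```python
import trimesh

# Load a 3D digit mesh; the path is a placeholder for any MNIST 3D model.
mesh = trimesh.load("digit_3d.stl")

# For watertight meshes, trimesh computes the enclosed volume directly.
if mesh.is_watertight:
    print(f"Ground-truth volume: {mesh.volume:.2f} (mesh units cubed)")
else:
    print("Mesh is not watertight; the volume would be unreliable.")
```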
A detailed description of the environments can be found here.
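As a quick orientation, the sketch below runs a random agent in the TactileMNIST environment. It assumes that ap_gym mirrors Gymnasium's `make()`/`reset()`/`step()` interface and that importing `tactile_mnist` registers the environments; see the environment documentation for the authoritative API.

```python
import ap_gym
import tactile_mnist  # noqa: F401 -- assumed to register the environments

env = ap_gym.make("TactileMNIST-v0")
obs, info = env.reset(seed=0)
for _ in range(16):
    # Sample a random action; in ap_gym, actions are assumed to combine
    # a sensor motion command with the agent's current prediction.
    action = env.action_space.sample()
    obs, reward, terminated, truncated, info = env.step(action)
    if terminated or truncated:
        obs, info = env.reset()
env.close()
```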
Aside from the simulated benchmark tasks, this package provides access to two classes of static datasets: 3D mesh datasets and touch datasets. Below is an overview of the datasets provided in this package:
- 3D Mesh Datasets:
- MNIST 3D: a dataset of 3D models generated from a high-resolution version of the MNIST dataset.
- Starstruck: a dataset of scenes in which the number of stars has to be counted (3 classes, 1 - 3 stars per scene).
- Touch Datasets:
- Synthetic Tactile MNIST: a dataset of synthetic tactile images generated from the MNIST 3D dataset with the Taxim simulator.
- Real Tactile MNIST: a dataset of real tactile images of 3D printed MNIST 3D digits collected with a Franka robot.
- Synthetic Tactile Starstruck: a dataset of synthetic tactile images generated from the Starstruck dataset with the Taxim simulator.
A detailed description of the datasets can be found in the Dataset documentation.
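For orientation, a loading sketch is shown below. The class name `MNIST3D` and its interface are hypothetical placeholders, not the package's documented API; consult the Dataset documentation for the actual loader names and signatures.

```python
# Hypothetical sketch -- MNIST3D and its interface are illustrative
# assumptions; see the Dataset documentation for the real API.
from tactile_mnist import MNIST3D  # hypothetical loader name

dataset = MNIST3D(split="train")
mesh, label = dataset[0]  # one 3D digit mesh and its class (0-9)
print(f"Digit {label}: {mesh}")
```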
If you see an error such as

```
OSError: [...]/torch_scatter/_version_cpu.so: undefined symbol: _ZN3c1017RegisterOperatorsD1Ev
```

then there are CUDA version incompatibilities between torch_scatter, PyTorch, and nvcc.
If you see an error such as

```
RuntimeError: Not compiled with CUDA support
```

then nvcc was not found when installing torch_scatter.
In both cases, follow the instructions in the official torch_scatter repository to install compatible PyTorch and torch_scatter versions.
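As a first diagnostic step, it can help to compare the CUDA version PyTorch was built against with the version reported by `nvcc --version`:

```python
# Print the CUDA version this PyTorch build was compiled against;
# compare it against the output of `nvcc --version` on your system.
import torch

print(torch.version.cuda)
```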
The project is licensed under the MIT license.
If you wish to contribute to this project, you are welcome to create a pull request. Please run the pre-commit hooks before submitting your pull request. To set up the pre-commit hooks:

- Install pre-commit.
- Install the Git hooks by running `pre-commit install`.

Alternatively, run `pre-commit run --all-files` manually.



