A team from Purdue University has introduced a new technique called UniT that allows robots to process tactile information more efficiently and transfer it to various tasks.


Scientists at Purdue University have developed a method that enables robots to process tactile information more effectively. The technique, dubbed UniT (Unified Tactile Representation), uses a specialized machine learning approach to create a versatile representation from simple tactile data.

The goal is to give robots a sense of touch they can use alongside vision, much as humans do when handling objects. What sets UniT apart is that its tactile representation can be trained on data from just one simple object; in the experiments, the researchers used a hex key or a small ball for this purpose.

Video: Xu, Uppuluri et al.


To capture the tactile data, the team used a GelSight sensor: an elastic gel with embedded markers that deform on contact, filmed by a camera. The resulting images show how an object presses into or slides across the gel, revealing its shape, its position, and the forces acting on it.
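
As a rough illustration of the kind of signal such a sensor delivers, the following sketch compares a tactile frame against a no-contact reference image to find where the gel deformed. The file names, threshold, and OpenCV-based pipeline are assumptions chosen for illustration, not the actual UniT preprocessing.

import cv2
import numpy as np

# Illustrative sketch: estimate gel deformation by comparing the current
# GelSight frame against a no-contact reference frame. File names and the
# threshold value are placeholders.
reference = cv2.imread("gelsight_no_contact.png", cv2.IMREAD_GRAYSCALE)
current = cv2.imread("gelsight_contact.png", cv2.IMREAD_GRAYSCALE)

# Pixel-wise difference highlights where the gel (and its markers) moved.
diff = cv2.absdiff(current, reference)

# A simple threshold gives a rough contact mask; real pipelines track the
# embedded markers to recover a dense displacement field.
_, contact_mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)

contact_area = np.count_nonzero(contact_mask)
print(f"approximate contact area: {contact_area} pixels")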

The researchers then use a VQVAE (vector-quantized variational autoencoder) to compress the information from the tactile images into a compact representation, which lets the robot learn efficiently and carry the learned skills over to various tasks.
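
To make the idea concrete, here is a minimal VQ-VAE sketch for tactile frames in PyTorch. The layer sizes, codebook size, and loss weighting are assumptions for illustration and not the configuration used in UniT; the official implementation is available in the researchers' repository.

import torch
import torch.nn as nn
import torch.nn.functional as F

class VectorQuantizer(nn.Module):
    """Maps each spatial feature vector to its nearest codebook entry."""
    def __init__(self, num_codes=512, code_dim=64, beta=0.25):
        super().__init__()
        self.codebook = nn.Embedding(num_codes, code_dim)
        self.codebook.weight.data.uniform_(-1.0 / num_codes, 1.0 / num_codes)
        self.beta = beta

    def forward(self, z):
        # z: (B, C, H, W) -> flatten spatial positions to (B*H*W, C)
        z_flat = z.permute(0, 2, 3, 1).reshape(-1, z.shape[1])
        dists = torch.cdist(z_flat, self.codebook.weight)
        codes = dists.argmin(dim=1)
        z_q = self.codebook(codes).view(z.shape[0], z.shape[2], z.shape[3], -1)
        z_q = z_q.permute(0, 3, 1, 2)
        # Codebook and commitment losses, then straight-through estimator.
        loss = F.mse_loss(z_q, z.detach()) + self.beta * F.mse_loss(z, z_q.detach())
        z_q = z + (z_q - z).detach()
        return z_q, loss

class TactileVQVAE(nn.Module):
    """Encodes a tactile image, quantizes it, and reconstructs it."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1),
        )
        self.quantizer = VectorQuantizer()
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),
        )

    def forward(self, x):
        z = self.encoder(x)
        z_q, vq_loss = self.quantizer(z)
        recon = self.decoder(z_q)
        return recon, F.mse_loss(recon, x) + vq_loss

# Training would minimize the combined reconstruction and quantization loss
# on tactile frames of a single object, such as the hex key used in the paper.
model = TactileVQVAE()
dummy_batch = torch.randn(4, 3, 64, 64)  # stand-in for GelSight images
recon, loss = model(dummy_batch)
print(loss.item())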

UniT significantly improves robot performance

Experiments showed that the representation learned with UniT transferred well to unknown objects. For example, the system could reconstruct the contact geometry and force distribution when touching various objects, even though it had only been trained on a single simple object.

UniT also lets the robot handle different tasks without retraining the tactile representation, such as estimating the position of a USB connector or grasping objects precisely. It delivers better results than previous methods that rely only on visual information or treat the sense of touch like just another camera.
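
In practice, this kind of transfer typically means freezing the learned encoder and training only a small task-specific head on top of it. The sketch below shows that pattern for pose regression from a tactile image; the head architecture, feature dimensions, and six-value pose output are assumptions for illustration, not UniT's actual task heads.

import torch
import torch.nn as nn

class PoseHead(nn.Module):
    """Small regression head on top of a frozen tactile encoder."""
    def __init__(self, encoder, feat_dim=64 * 16 * 16, pose_dim=6):
        super().__init__()
        self.encoder = encoder
        for p in self.encoder.parameters():
            p.requires_grad = False  # reuse the representation, don't retrain it
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(feat_dim, 256), nn.ReLU(),
            nn.Linear(256, pose_dim),  # e.g. translation + rotation parameters
        )

    def forward(self, tactile_image):
        with torch.no_grad():
            z = self.encoder(tactile_image)
        return self.head(z)

# Usage with the TactileVQVAE sketch above:
# pose_model = PoseHead(model.encoder)
# pred = pose_model(dummy_batch)  # (4, 6) pose estimates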

Video: Xu, Uppuluri et al.


The researchers demonstrated the effectiveness of UniT through several robotic tasks. In 3D pose estimation of a USB connector, the method significantly outperformed other approaches. UniT-based control also proved superior in more exotic manipulation tasks such as hanging chicken legs or grasping fragile chips.

Video: Xu, Uppuluri et al.


The researchers see potential for further developments. In the future, the method could be extended to soft objects or complemented with physical models.


More information and the code are available on GitHub.

Summary
  • Researchers at Purdue University have developed UniT, a technique for more efficiently processing tactile information for robots. UniT uses machine learning to create rich representations from simple tactile data.
  • A GelSight sensor with elastic gel and embedded markers is used for data acquisition. A camera records deformations upon contact. A VQVAE stores the information compactly, allowing efficient learning and transfer to different tasks.
  • In experiments, UniT outperformed previous methods in tasks such as 3D pose estimation of a USB connector and grasping fragile objects. The researchers see potential for further development, for example with soft objects or the integration of physical models.