The University of Bristol has recently made strides in the development of robotic hands, introducing a design equipped with tactile sensors embedded in its four fingertips. These sensors enable the robotic hand to rotate objects in any direction or orientation, showcasing a significant advancement in robotic dexterity. For more details, you can explore the research team’s findings on the University of Bristol’s news page.
Improving Robotic Dexterity
Researchers led by Professor Nathan Lepora focused on enhancing the capabilities of robotic hands using inexpensive tactile sensors. By embedding cellphone cameras in the fingertips, they were able to capture detailed tactile interactions between the hand and objects. This innovation could have major implications for automated handling in various industries, such as retail and recycling.
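To make the camera-based approach concrete, here is a minimal sketch of how such a fingertip is commonly read out: a small camera images an internal layer of pin-like markers, and tracking how those markers shift between a no-contact reference frame and the current frame gives a rough contact signal. This is an illustration of the general technique, not the Bristol team's code; the blob-detection parameters and function names are assumptions.

```python
import cv2
import numpy as np

def detect_markers(gray_frame):
    """Return an (N, 2) array of marker centroids found by simple blob detection."""
    params = cv2.SimpleBlobDetector_Params()
    params.filterByArea = True
    params.minArea = 5          # assumed marker size range, in pixels
    params.maxArea = 200
    detector = cv2.SimpleBlobDetector_create(params)
    keypoints = detector.detect(gray_frame)
    return np.array([kp.pt for kp in keypoints], dtype=np.float32)

def contact_signal(reference_frame, current_frame):
    """Estimate a crude contact magnitude from how far the markers have moved."""
    ref = detect_markers(cv2.cvtColor(reference_frame, cv2.COLOR_BGR2GRAY))
    cur = detect_markers(cv2.cvtColor(current_frame, cv2.COLOR_BGR2GRAY))
    if len(ref) == 0 or len(cur) == 0:
        return 0.0
    # Nearest-neighbour matching between frames (assumes small marker motion).
    dists = np.linalg.norm(ref[:, None, :] - cur[None, :, :], axis=-1)
    displacement = dists.min(axis=1)
    return float(displacement.mean())   # larger mean shift ~ firmer or shearing contact
```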
Past Efforts by OpenAI
In 2019, OpenAI explored robotic grasping before shifting its focus to generative AI and pausing its robotics research. However, OpenAI has recently announced the revival of its robotics division. While the Bristol team relies on cellphone cameras for tactile sensing, other research teams have manipulated objects using different approaches, such as proprioception and a sense of touch.
Bristol’s Innovative Approach
The Bristol team’s approach uses a 3D-printed mesh of pin-like papillae on the artificial fingertip, mimicking the internal structure of human skin. This design allows the robotic hand to perform complex tasks, such as rotating objects around any axis even while the hand itself is moving. The team has trained a single unified policy that handles different rotation axes across different hand orientations, a feat not previously achieved.
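As a rough illustration of what a "unified policy" could look like, the sketch below shows one network conditioned on tactile features, joint state, the desired rotation axis, and the hand's orientation (encoded as the gravity direction in the hand frame), producing joint targets. This is an assumption-laden illustration, not the paper's architecture; all dimensions and names are invented for the example.

```python
import torch
import torch.nn as nn

class UnifiedRotationPolicy(nn.Module):
    """One set of weights serves any rotation axis and any hand orientation."""
    def __init__(self, tactile_dim=64, proprio_dim=16, n_joints=16):
        super().__init__()
        # Inputs: tactile features + joint state + rotation axis (3) + gravity direction (3)
        in_dim = tactile_dim + proprio_dim + 3 + 3
        self.net = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, n_joints),   # joint position targets
        )

    def forward(self, tactile, proprio, rot_axis, gravity_dir):
        x = torch.cat([tactile, proprio, rot_axis, gravity_dir], dim=-1)
        return self.net(x)

policy = UnifiedRotationPolicy()
action = policy(
    torch.randn(1, 64),               # tactile features from the four fingertips
    torch.randn(1, 16),               # joint angles
    torch.tensor([[0.0, 0.0, 1.0]]),  # desired rotation axis
    torch.tensor([[0.0, 0.0, -1.0]]), # gravity in the hand frame (palm-up here)
)
```

Because the rotation axis and hand orientation are inputs rather than fixed during training, the same policy can be asked to rotate an object about a new axis with the hand in a new pose, which is the property the Bristol work emphasizes.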
Comparative studies show that other research teams have achieved object manipulation with downward-facing hands using a gravity curriculum or precision-grasp manipulation. In contrast, the Bristol team’s unified policy enables the hand to perform these tasks regardless of orientation. Their research highlights the potential for more advanced dexterity tasks, including assembling items like Lego blocks, paving the way for future developments in robotic manipulation.
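For context, a "gravity curriculum" typically means training the policy with weakened gravity first and ramping it toward full strength as training progresses, so the hand learns stable finger gaits before objects can easily drop. The schedule and simulator hook below are purely illustrative assumptions, not taken from any of the cited groups.

```python
def gravity_scale(step, warmup_steps=2_000_000, ramp_steps=8_000_000):
    """Return a multiplier in [0, 1] applied to gravity in the simulator."""
    if step < warmup_steps:
        return 0.0                  # no gravity while basic rotation is learned
    progress = (step - warmup_steps) / ramp_steps
    return min(1.0, progress)       # then linearly ramp up to full gravity

# e.g. sim.set_gravity(9.81 * gravity_scale(global_step)) inside the training loop,
# where `sim.set_gravity` stands in for whatever the simulator's own API provides.
```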
This research also has practical applications in humanoid robotics, where tactile sensors and advanced manipulation capabilities are crucial. While the Bristol team focuses on foundational research, companies like FingerVision are already commercializing similar technologies. FingerVision’s tactile gripper, demonstrated at CES 2024 in Las Vegas, is being used in food-handling applications, showcasing the real-world potential of these advancements.
The ongoing research at the University of Bristol represents a significant step forward in robotic dexterity. The team’s innovative use of tactile sensors in combination with AI training methods has yielded impressive results, enabling complex object manipulation tasks. As these technologies continue to evolve, they hold promise for a wide range of applications, from industrial automation to advanced humanoid robots.