MIT researchers have developed an interactive design pipeline that streamlines and simplifies the process of manufacturing a custom robotic hand with tactile sensors.
Typically, a robot expert spends months manually designing a custom manipulator, mainly through trial and error. Each iteration may require new parts to be designed and tested from scratch. In contrast, this new pipeline requires no manual assembly or special skills.
Similar to building with digital LEGOs, a designer uses the interface to construct a robotic manipulator from a set of modular components that are guaranteed to be manufacturable. The user can customize the palm and fingers of the robotic hand to tailor it to a specific task, and then easily incorporate tactile sensors into the final design.
When the design is complete, the software automatically generates the 3D-printing and knitting files needed to make the manipulator. The tactile sensors are incorporated through a knitted glove that fits snugly over the robotic hand. These sensors allow the manipulator to perform complex tasks, such as picking up delicate objects or using tools.
“One of the most exciting things about this pipeline is that it makes design accessible to a wide audience. Instead of working on a design for months or years and investing a lot of money in prototypes, you can have a working prototype in minutes,” says lead author Lara Zlokapa, who will receive her master’s degree in mechanical engineering this spring.
Joining Zlokapa on the paper are her advisors, Pulkit Agrawal, a professor in the Computer Science and Artificial Intelligence Laboratory (CSAIL), and Wojciech Matusik, a professor of electrical engineering and computer science. Other co-authors are CSAIL graduate students Yiyue Luo and Jie Xu; mechanical engineer Michael Foshey; and Kui Wu, a senior researcher at Tencent America. The research will be presented at the International Conference on Robotics and Automation.
Thinking about modularity
Before Zlokapa started building the pipeline, she paused to consider the concept of modularity. She wanted to offer enough components that users could combine them flexibly, but not so many that they would be overwhelmed by choices.
Thinking creatively about the components’ functions rather than their shapes, she arrived at about 15 parts that can be combined into trillions of unique manipulators.
The researchers then focused on developing an intuitive interface in which the user mixes and matches components in a 3D design space. A set of production rules, known as a graph grammar, controls how the user can combine the parts, based on how the individual components, such as a joint or a finger shaft, fit together.
“If we think of it as a LEGO kit where you can put different building blocks together, then the grammar might be something along the lines of ‘red bricks can only be placed on top of blue bricks’ and ‘blue bricks cannot be placed on top of green bricks.’ Using graph grammar, we can ensure that every single design is valid, meaning that it makes physical sense and can be manufactured,” she explains.
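The brick analogy can be illustrated with a short sketch. This is not the paper's actual grammar; the component names and attachment rules below are hypothetical, chosen only to show how a set of allowed parent-child pairings can reject physically invalid assemblies.

```python
# Hypothetical attachment rules: which component types may be
# attached beneath which. A design is valid only if every edge
# in its assembly graph obeys these rules.
ALLOWED_CHILDREN = {
    "palm": {"finger_base"},
    "finger_base": {"joint"},
    "joint": {"finger_shaft"},
    "finger_shaft": {"joint", "fingertip"},
}

def is_valid(design):
    """design: list of (parent_type, child_type) attachment edges."""
    return all(
        child in ALLOWED_CHILDREN.get(parent, set())
        for parent, child in design
    )

# A single finger chain built on a palm: every edge is permitted.
hand = [
    ("palm", "finger_base"),
    ("finger_base", "joint"),
    ("joint", "finger_shaft"),
    ("finger_shaft", "fingertip"),
]
print(is_valid(hand))                     # True
print(is_valid([("palm", "fingertip")]))  # False: fingertip can't attach to palm
```

Because every rule is checked locally on each connection, the interface can reject an invalid placement the moment the user attempts it, rather than after the design is finished.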
Once the user has created the structure of the manipulator, they can deform the components to tailor it to a specific task. Perhaps the manipulator needs fingers with slimmer tips to manipulate office scissors, or curved fingers to grab bottles.
In this deformation phase, the software surrounds each component with a digital cage. Users stretch or bend the components by pulling on the corners of the cage. The system automatically restricts these movements to ensure that the parts still connect properly and that the finished design remains manufacturable.
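The cage idea resembles classic free-form deformation: each point of a component is expressed in the cage's local coordinates, so when a corner is pulled, every interior point follows smoothly. A minimal 2D bilinear sketch of this principle (the real system works on 3D meshes, and the exact scheme is an assumption here):

```python
# 2D cage deformation sketch: a point is stored as (u, v) coordinates
# relative to a four-corner cage, then re-evaluated after corners move.

def bilinear(corners, u, v):
    """Interpolate a point from four cage corners (c00, c10, c01, c11)."""
    (x00, y00), (x10, y10), (x01, y01), (x11, y11) = corners
    x = (1-u)*(1-v)*x00 + u*(1-v)*x10 + (1-u)*v*x01 + u*v*x11
    y = (1-u)*(1-v)*y00 + u*(1-v)*y10 + (1-u)*v*y01 + u*v*y11
    return (x, y)

# Unit square cage; a point sitting at its center.
cage = [(0, 0), (1, 0), (0, 1), (1, 1)]
print(bilinear(cage, 0.5, 0.5))       # (0.5, 0.5)

# Pull the top-right corner outward: the interior point follows.
stretched = [(0, 0), (1, 0), (0, 1), (2, 2)]
print(bilinear(stretched, 0.5, 0.5))  # (0.75, 0.75)
```

The manufacturability constraints the article mentions would sit on top of this: the software limits how far each corner may move so that connectors and wall thicknesses stay within printable bounds.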
Fits like a glove
After customization, the user sets the locations of the tactile sensors. These sensors are integrated into a knitted glove that fits snugly around the 3D-printed robotic manipulator. The glove consists of two layers of fabric, one with horizontal piezoelectric fibers and one with vertical fibers. Piezoelectric material produces an electrical signal when it is compressed. Where the horizontal and vertical piezoelectric fibers intersect, they form tactile sensors that convert pressure stimuli into measurable electrical signals.
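The crossed fiber layers effectively form a grid of sensing points, one at each intersection. A toy sketch of reading such a grid (the signal values and threshold below are illustrative assumptions, not measured data from the paper):

```python
# Crossed-fiber sensor grid sketch: pressure at (row i, column j)
# produces a signal where horizontal fiber i crosses vertical fiber j.

def locate_contacts(readings, threshold=0.5):
    """Return (row, col) indices of intersections whose signal
    exceeds the threshold, i.e., likely contact points."""
    return [(i, j)
            for i, row in enumerate(readings)
            for j, v in enumerate(row)
            if v >= threshold]

# Illustrative readings for a 3x3 grid of fiber crossings.
grid = [
    [0.0, 0.1, 0.0],
    [0.0, 0.9, 0.2],  # strong press where fibers (1, 1) cross
    [0.0, 0.0, 0.0],
]
print(locate_contacts(grid))  # [(1, 1)]
```

One practical advantage of this layout is scaling: n horizontal and m vertical fibers yield n x m sensing points from only n + m signal lines.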
“We used gloves because they are easy to install, easy to replace and easy to take off if we need to repair anything inside them,” explains Zlokapa.
The gloves also allow the user to cover the entire hand with tactile sensors instead of embedding them in the palm or fingers, as other robotic manipulators do (if they have tactile sensors at all).
After completing the design interface, the researchers used it to create manipulators tailored for four complex tasks: picking up an egg, cutting paper with scissors, pouring water from a bottle, and screwing in a wing nut. For example, the wing-nut manipulator had an extended, staggered finger that prevented it from colliding with the nut as it turned. Only two iterations were needed to reach this successful design.
The egg-grabbing manipulator never crushed or dropped the egg during testing, and the paper-cutting manipulator could use a wider range of scissors than any other robotic hand the researchers could find in the literature.
However, when testing the manipulators, the researchers found that the sensors generated a lot of noise due to the uneven interweaving of the knitted fibers, which limited their accuracy. They are now working on more reliable sensors to improve the manipulator’s performance.
The researchers also want to investigate the use of additional automation. Since the rules of graph grammar are written in a way that a computer can understand, algorithms could search the design space to determine optimal configurations for a task-specific robot hand. With autonomous manufacturing, the entire prototyping process could be performed without human intervention, Zlokapa says.
“Now that we have a computer that can explore this design space, we can answer the question: Is the human hand the optimal form for everyday tasks? Maybe there is a better form? Or maybe we want more or fewer fingers, or fingers pointing in different directions? This research does not provide a complete answer to that question, but it is a step in that direction,” she says.
“The work presents an intriguing idea and an elegant system design,” says Wenzhen Yuan, an assistant professor at Carnegie Mellon University’s Robotics Institute who was not involved in this research. “It provides a new way of thinking about robot design in this new era, where robot adaptation and versatility are key. It bridges the gap between mechanical design, computer graphics, and computer-aided manufacturing. I see many applications of the system and great potential in the methodology.”
This work was supported in part by the Toyota Research Institute, the Defense Advanced Research Projects Agency, and an Amazon Robotics Research Award.