Researchers at MIT have developed RF-Grasp, a robot that can sense hidden objects with ‘superhuman perception’.
A research team at MIT has developed RF-Grasp, a robot that uses penetrative radio frequency to pinpoint items even when they’re hidden from view.
Radio waves can pass through walls and sense objects concealed behind them. RF-Grasp pairs this capability with more traditional computer vision techniques to locate and grasp such items.
According to Fadel Adib, associate professor and director of the Signal Kinetics Group at the MIT Media Lab, the goal of this development is to give robots “superhuman perception”.
It could potentially be used to streamline e-commerce fulfilment in warehouses or help machines pick out a screwdriver from a toolkit. Warehouse work is still largely carried out by humans because robots struggle to locate and grasp objects in crowded environments.
“Perception and picking are two roadblocks in the industry today,” said Prof Alberto Rodriguez from MIT’s Department of Mechanical Engineering. “Radio frequency is such a different sensing modality than vision. It would be a mistake not to explore what radio frequency can do.”
Such robots could even have applications in the home, according to Adib, by locating the right Allen key while you assemble Ikea furniture. “Or you could imagine the robot finding lost items. It’s like a super-Roomba that goes and retrieves my keys, wherever the heck I put them.”
How RF-Grasp works
RF-Grasp builds on radio frequency identification (RFID), which pairs a reader with a tag. The tag is a tiny computer chip attached to the object you want to track; in the case of pets, for example, the chip is implanted.
The reader emits a radio frequency signal that gets modulated by the tag and reflected back to the reader, providing information on the tagged object’s identity and location.
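The read cycle described above can be sketched in a few lines of Python. This is a toy simulation of generic RFID backscatter, not the MIT team's implementation: the tag ID, the 915MHz carrier and the phase-to-distance maths are all illustrative assumptions.

```python
import math

SPEED_OF_LIGHT = 3e8  # metres per second

def tag_response(tag_id, distance_m, freq_hz):
    """The tag modulates the reader's carrier with its identity; the
    reflected signal arrives with a phase set by the round trip."""
    wavelength = SPEED_OF_LIGHT / freq_hz
    round_trip = 2 * distance_m
    phase = (2 * math.pi * round_trip / wavelength) % (2 * math.pi)
    return {"id": tag_id, "phase": phase}

def estimate_distance(phase, freq_hz, cycle=0):
    """The reader inverts the measured phase to recover distance.
    A single frequency only gives distance modulo half a wavelength,
    so real readers hop across frequencies to resolve the ambiguity;
    here we simply pass the correct cycle count in."""
    wavelength = SPEED_OF_LIGHT / freq_hz
    return (phase + 2 * math.pi * cycle) * wavelength / (4 * math.pi)

# A hypothetical tagged item 2 cm away, read at 915 MHz (UHF RFID band).
reading = tag_response("allen-key-4mm", distance_m=0.02, freq_hz=915e6)
print(reading["id"], round(estimate_distance(reading["phase"], 915e6), 3))
```

The phase wrap-around in `estimate_distance` is why commercial readers sweep multiple frequencies: each carrier gives a distance estimate that is only unambiguous within half a wavelength.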
RF-Grasp uses both a camera and a radio frequency reader to find and grab tagged items. Its robotic arm ends in a grasping hand, with the camera mounted on the wrist. The system continuously collects radio frequency tracking data alongside a visual picture of its surroundings, then integrates the two data streams into its decision-making process.
Tara Boroushaki, a research assistant in the Signal Kinetics Group, is lead author of a paper on this research. “The robot has to decide, at each point in time, which of these streams is more important to think about,” she explained. “It’s not just eye-hand coordination, it’s radio frequency-eye-hand coordination. So, the problem gets very complicated.”
Adib added: “It starts by using radio frequency to focus the attention of vision. Then you use vision to navigate fine manoeuvres.” He compared it to hearing a siren from behind and looking to get a clearer picture of where it is coming from.
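Adib’s siren analogy maps onto a simple control loop: a coarse RF fix narrows the region the vision system has to search, and vision then refines the choice of grasp target within it. The sketch below is a hypothetical illustration of that hand-off, with made-up coordinates and confidence scores, not the team's actual pipeline.

```python
def rf_coarse_fix(radius_m=0.15):
    """RF localisation: only accurate to within tens of centimetres,
    but it works even when the tagged item is buried under clutter."""
    return {"x": 0.40, "y": 0.25, "radius": radius_m}

def vision_refine(region, detections):
    """Vision: precise, but it only searches inside the RF-suggested
    region rather than scanning the whole scene."""
    def inside(d):
        dist = ((d["x"] - region["x"]) ** 2 + (d["y"] - region["y"]) ** 2) ** 0.5
        return dist <= region["radius"]

    candidates = [d for d in detections if inside(d)]
    # Pick the highest-confidence detection within the RF region.
    return max(candidates, key=lambda d: d["conf"]) if candidates else None

scene = [
    {"x": 0.41, "y": 0.26, "conf": 0.90, "label": "target"},
    {"x": 0.90, "y": 0.70, "conf": 0.95, "label": "distractor"},  # outside region
]
grasp = vision_refine(rf_coarse_fix(), scene)
print(grasp["label"])  # the distractor is ignored despite its higher confidence
```

The point of the hand-off is in the last two lines: without the RF fix, a purely visual picker would favour the higher-confidence distractor; the coarse RF region rules it out before vision ever has to reason about it.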
During testing, RF-Grasp was able to pinpoint and grab target objects with around half as much total movement as its non-radio frequency equivalents. It was also uniquely able to declutter its surroundings, the team said, by removing packing materials and other obstacles to reach the tagged items.