ROBO SPACE
Robot overcomes uncertainty to retrieve buried objects
by Adam Zewe for MIT News
Boston MA (SPX) Jun 29, 2022

Fadel Adib, associate professor in the Department of Electrical Engineering and Computer Science and director of the Signal Kinetics group in the MIT Media Lab (far left) with (from left to right) Tara Boroushaki, Nazish Naeem, and Laura Dodds, research assistants in the Signal Kinetics group.

For humans, finding a lost wallet buried under a pile of items is pretty straightforward - we simply remove things from the pile until we find the wallet. But for a robot, this task involves complex reasoning about the pile and objects in it, which presents a steep challenge.

MIT researchers previously demonstrated a robotic arm that combines visual information and radio frequency (RF) signals to find hidden objects labeled with RFID tags (which reflect signals sent by an antenna). Building on that work, they have now developed a new system that can efficiently retrieve any object buried in a pile. As long as some items in the pile have RFID tags, the target item itself does not need to be tagged for the system to recover it.

The algorithms behind the system, known as FuseBot, reason about the probable location and orientation of objects under the pile. Then FuseBot finds the most efficient way to remove obstructing objects and extract the target item. This reasoning enabled FuseBot to find more hidden items than a state-of-the-art robotics system, in half the time.

This speed could be especially useful in an e-commerce warehouse. A robot tasked with processing returns could find items in an unsorted pile more efficiently with the FuseBot system, says senior author Fadel Adib, associate professor in the Department of Electrical Engineering and Computer Science and director of the Signal Kinetics group in the Media Lab.

"What this paper shows, for the first time, is that the mere presence of an RFID-tagged item in the environment makes it much easier for you to achieve other tasks in a more efficient manner. We were able to do this because we added multimodal reasoning to the system - FuseBot can reason about both vision and RF to understand a pile of items," adds Adib.

Joining Adib on the paper are research assistants Tara Boroushaki, who is the lead author; Laura Dodds; and Nazish Naeem. The research will be presented at the Robotics: Science and Systems conference.

Targeting tags
A recent market report indicates that more than 90 percent of U.S. retailers now use RFID tags, but the technology is not universal, leading to situations in which only some objects within piles are tagged.

This problem inspired the group's research.

With FuseBot, a robotic arm uses an attached video camera and RF antenna to retrieve an untagged target item from a mixed pile. The system scans the pile with its camera to create a 3D model of the environment. Simultaneously, it sends signals from its antenna to locate RFID tags. These radio waves can pass through most solid surfaces, so the robot can "see" deep into the pile. Since the target item is not tagged, FuseBot knows the item cannot be located at the exact same spot as an RFID tag.

Algorithms fuse this information to update the 3D model of the environment and highlight potential locations of the target item, whose size and shape the robot knows. The system then reasons about the objects in the pile and the RFID tag locations to decide which item to remove next, with the goal of uncovering the target in the fewest moves.
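As a rough illustration of that fusion step (not the authors' published code), the sketch below combines a camera-derived occupancy grid with localized RFID tag positions: voxels near a tag are ruled out for the untagged target, and what remains is normalized into a candidate map. The function name candidate_map, the exclusion_radius parameter, and the grid conventions are assumptions made for this example.

import numpy as np

def candidate_map(camera_occupancy, tag_positions,
                  voxel_size=0.01, exclusion_radius=0.05):
    """Return a per-voxel likelihood that the untagged target lies there.

    camera_occupancy: 3D grid in [0, 1] from the RGB-D scan of the pile
                      (1 = space occupied by some item).
    tag_positions:    voxel indices of localized RFID tags; the target is
                      untagged, so it cannot sit where a tag is.
    """
    likelihood = camera_occupancy.astype(float)

    # Zero out the neighborhood of every RFID tag: those voxels belong to
    # tagged items, so they cannot contain the untagged target.
    radius_vox = int(round(exclusion_radius / voxel_size))
    for (x, y, z) in tag_positions:
        xs = slice(max(x - radius_vox, 0), x + radius_vox + 1)
        ys = slice(max(y - radius_vox, 0), y + radius_vox + 1)
        zs = slice(max(z - radius_vox, 0), z + radius_vox + 1)
        likelihood[xs, ys, zs] = 0.0

    # Normalize so the grid can be read as a probability map over target locations.
    total = likelihood.sum()
    return likelihood / total if total > 0 else likelihood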

It was challenging to incorporate this reasoning into the system, says Boroushaki.

The robot is unsure how objects are oriented under the pile, or how a squishy item might be deformed by heavier items pressing on it. It overcomes this challenge with probabilistic reasoning, using what it knows about the size and shape of an object and its RFID tag location to model the 3D space that object is likely to occupy.
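A minimal way to picture that probabilistic reasoning, assuming a simple Monte Carlo stand-in rather than the model used in the paper: treat a tagged item as a box of known size, sample where on the box its tag might be attached, and accumulate the voxels the box could occupy given the measured tag location. The box approximation and the function name are illustrative only.

import numpy as np

def occupancy_from_tag(grid_shape, tag_voxel, item_dims_vox,
                       n_samples=2000, rng=None):
    """Estimate P(voxel occupied by this tagged item) given its tag location."""
    rng = rng or np.random.default_rng(0)
    prob = np.zeros(grid_shape, dtype=float)
    dx, dy, dz = item_dims_vox
    for _ in range(n_samples):
        # Sample where on the item the tag might sit, then place the item's
        # bounding box so that point coincides with the measured tag voxel.
        offset = rng.integers([0, 0, 0], [dx, dy, dz])
        origin = np.clip(np.array(tag_voxel) - offset, 0,
                         np.array(grid_shape) - 1)
        x0, y0, z0 = origin
        prob[x0:x0 + dx, y0:y0 + dy, z0:z0 + dz] += 1.0
    return prob / n_samples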

As it removes items, it also uses reasoning to decide which item would be "best" to remove next.

"If I give a human a pile of items to search, they will most likely remove the biggest item first to see what is underneath it. What the robot does is similar, but it also incorporates RFID information to make a more informed decision. It asks, 'How much more will it understand about this pile if it removes this item from the surface?'" Boroushaki says.

After it removes an object, the robot scans the pile again and uses new information to optimize its strategy.
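That remove-rescan-replan loop can be read as an information-gain policy. The sketch below is only a schematic stand-in for FuseBot's actual planner: it scores each removable surface item by how much it is expected to shrink the entropy of the robot's belief about the target's location, with the predicted post-removal belief maps treated as assumed inputs.

import numpy as np

def entropy(p):
    """Shannon entropy of a normalized voxel probability map."""
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def pick_next_removal(current_belief, predicted_beliefs):
    """Choose the surface item whose removal is expected to reduce
    uncertainty about the target's location the most.

    current_belief:    normalized voxel map P(target at voxel) before acting.
    predicted_beliefs: dict mapping each removable item's id to the belief
                       map predicted after removing it (hypothetical inputs).
    """
    h_now = entropy(current_belief)
    gains = {item: h_now - entropy(b) for item, b in predicted_beliefs.items()}
    return max(gains, key=gains.get)

In the full system, these predictions would be recomputed from the updated 3D model after every removal and rescan, matching the behavior described above.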

Retrieval results
This reasoning, as well as its use of RF signals, gave FuseBot an edge over a state-of-the-art system that used only vision. The team ran more than 180 experimental trials using real robotic arms and piles of household items such as office supplies, stuffed animals, and clothing, varying the size of each pile and the number of RFID-tagged items it contained.

FuseBot extracted the target item successfully 95 percent of the time, compared to 84 percent for the other robotic system. It accomplished this using 40 percent fewer moves, and was able to locate and retrieve targeted items more than twice as fast.

"We see a big improvement in the success rate by incorporating this RF information. It was also exciting to see that we were able to match the performance of our previous system, and exceed it in scenarios where the target item didn't have an RFID tag," Dodds says.

FuseBot could be applied in a variety of settings because the software that performs its complex reasoning can be implemented on any computer - it just needs to communicate with a robotic arm that has a camera and antenna, Boroushaki adds.

The researchers plan to incorporate more complex models into FuseBot so it performs better on deformable objects. Beyond that, they are interested in exploring different manipulations, such as a robotic arm that pushes items out of the way. Future iterations of the system could also be used with a mobile robot that searches multiple piles for lost objects.

This work was funded, in part, by the National Science Foundation, a Sloan Research Fellowship, NTT DATA, Toppan, Toppan Forms, and the MIT Media Lab.

Research Report: "FuseBot: RF-Visual Mechanical Search"


Related Links
Massachusetts Institute of Technology
All about the robots on Earth and beyond!

