
Slender Robotic Finger Senses Buried Items

  • May 28, 2021

MIT researchers developed a “Digger Finger” robot that digs through granular material, like sand and gravel, and senses the shapes of buried objects. Image courtesy of the researchers

Over the years, robots have gotten quite good at identifying objects — as long as they’re out in the open.

Discerning buried items in granular material like sand is a taller order. To do that, a robot would need fingers that were slender enough to penetrate the sand, mobile enough to wriggle free when sand grains jam, and sensitive enough to feel the detailed shape of the buried object.

MIT researchers have now designed a sharp-tipped robot finger equipped with tactile sensing to meet the challenge of identifying buried objects. In experiments, the aptly named Digger Finger was able to dig through granular media such as sand and rice, and it correctly sensed the shapes of submerged items it encountered. The researchers say the robot might one day perform various subterranean duties, such as finding buried cables or disarming buried bombs.

The research will be presented at the next International Symposium on Experimental Robotics. The study’s lead author is Radhen Patel, a postdoc in MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL). Co-authors include CSAIL PhD student Branden Romero, Harvard University PhD student Nancy Ouyang, and Edward Adelson, the John and Dorothy Wilson Professor of Vision Science in CSAIL and the Department of Brain and Cognitive Sciences.

Seeking to identify objects buried in granular material — sand, gravel, and other types of loosely packed particles — isn’t a brand new quest. Previously, researchers have used technologies that sense the subterranean from above, such as Ground Penetrating Radar or ultrasonic vibrations. But these techniques provide only a hazy view of submerged objects. They might struggle to differentiate rock from bone, for example.

A closeup photograph of the new robot and a diagram of its parts. Image courtesy of the researchers

“So, the idea is to make a finger that has a good sense of touch and can distinguish between the various things it’s feeling,” says Adelson. “That would be helpful if you’re trying to find and disable buried bombs, for example.” Making that idea a reality meant clearing a number of hurdles.

The team’s first challenge was a matter of form: The robotic finger had to be slender and sharp-tipped.

In prior work, the researchers had used a tactile sensor called GelSight. The sensor consisted of a clear gel covered with a reflective membrane that deformed when objects pressed against it. Behind the membrane were three colors of LED lights and a camera. The lights shone through the gel and onto the membrane, while the camera collected the membrane’s pattern of reflection. Computer vision algorithms then extracted the 3D shape of the contact area where the soft finger touched the object. The contraption provided an excellent sense of artificial touch, but it was inconveniently bulky.
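The camera-to-shape step works much like photometric stereo: each pixel’s color encodes how the deformed membrane reflects the differently colored lights, and that pattern can be inverted into surface orientation and then depth. Below is a minimal, illustrative Python/NumPy sketch of that idea, assuming a Lambertian membrane, three known light directions, and made-up constants; it is not the team’s actual GelSight code.

    import numpy as np

    # Illustrative sketch of GelSight-style shape recovery (not the authors' code).
    # Assumption: the membrane is roughly Lambertian and lit from three known
    # directions, so each pixel's (R, G, B) intensity is L @ n for surface normal n.

    # Directions of the three colored lights (unit vectors), chosen arbitrarily here.
    L = np.array([
        [0.50,  0.000, 0.866],   # "red" light
        [-0.25,  0.433, 0.866],  # "green" light
        [-0.25, -0.433, 0.866],  # "blue" light
    ])
    L_inv = np.linalg.inv(L)

    def normals_from_image(img):
        """img: H x W x 3 intensities -> H x W x 3 unit surface normals."""
        n = img @ L_inv.T                      # solve I = L @ n per pixel
        n /= np.linalg.norm(n, axis=-1, keepdims=True) + 1e-9
        return n

    def depth_from_normals(n):
        """Integrate surface slopes into a relative depth map (crude cumulative sum)."""
        p = -n[..., 0] / (n[..., 2] + 1e-9)    # dz/dx
        q = -n[..., 1] / (n[..., 2] + 1e-9)    # dz/dy
        # Average of two integration paths; a Poisson solver would be more robust.
        z = 0.5 * (np.cumsum(p, axis=1) + np.cumsum(q, axis=0))
        return z - z.min()

    # Synthetic test: render a hemispherical bump and recover its shape.
    h, w, r = 64, 64, 20
    yy, xx = np.mgrid[0:h, 0:w]
    rr2 = (xx - w // 2) ** 2 + (yy - h // 2) ** 2
    mask = rr2 < r ** 2
    z_true = np.zeros((h, w))
    z_true[mask] = np.sqrt(r ** 2 - rr2[mask])
    gy, gx = np.gradient(z_true)
    n_true = np.dstack([-gx, -gy, np.ones_like(z_true)])
    n_true /= np.linalg.norm(n_true, axis=-1, keepdims=True)
    img = n_true @ L.T                          # simulated three-channel image

    z_est = depth_from_normals(normals_from_image(img))
    print("recovered bump height (relative units):", round(float(z_est.max()), 2))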

For the Digger Finger, the researchers slimmed down their GelSight sensor in two main ways. First, they changed the shape to be a slender cylinder with a beveled tip. Next, they ditched two-thirds of the LED lights, using a combination of blue LEDs and colored fluorescent paint. “That saved a lot of complexity and space,” says Ouyang. “That’s how we were able to get it into such a compact form.” The final product featured a device whose tactile sensing membrane was about 2 square centimeters, similar to the tip of a finger.

With size sorted out, the researchers turned their attention to motion, mounting the finger on a robot arm and digging through fine-grained sand and coarse-grained rice. Granular media have a tendency to jam when numerous particles become locked in place, which makes them difficult to penetrate. So the team added vibration to the Digger Finger’s capabilities and put it through a battery of tests.

“We wanted to see how mechanical vibrations aid in digging deeper and getting through jams,” says Patel. “We ran the vibrating motor at different operating voltages, which changes the amplitude and frequency of the vibrations.” They found that rapid vibrations helped “fluidize” the media, clearing jams and allowing for deeper burrowing — though this fluidizing effect was harder to achieve in sand than in rice.
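The link between drive voltage and vibration behavior can be illustrated with a simple eccentric-rotating-mass (ERM) motor model: spin speed rises roughly linearly with voltage, and the force from the spinning eccentric mass grows with the square of that speed, so frequency and amplitude move together. The short Python sketch below uses hypothetical constants purely for illustration; it does not represent the Digger Finger’s actual motor or the team’s measurements.

    import numpy as np

    # Hypothetical ERM vibration-motor model (illustrative constants only).
    # For an eccentric rotating mass:
    #   spin speed   f ~ K_V * V                 (roughly linear in drive voltage)
    #   force amp.   F ~ m * r * (2*pi*f)**2     (centripetal force of the eccentric mass)
    # so raising the voltage raises frequency and amplitude together.

    K_V = 25.0      # Hz per volt (assumed motor constant)
    M_ECC = 0.5e-3  # kg, eccentric mass (assumed)
    R_ECC = 1.0e-3  # m, eccentricity (assumed)

    def vibration(voltage):
        """Return (frequency in Hz, peak force in newtons) for a given drive voltage."""
        f = K_V * voltage
        force = M_ECC * R_ECC * (2 * np.pi * f) ** 2
        return f, force

    for v in (1.0, 2.0, 3.0):
        f, force = vibration(v)
        print(f"{v:.1f} V -> {f:5.1f} Hz, ~{force:.2f} N peak force")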

They also tested various twisting motions in both the rice and sand. Sometimes, grains of each type of media would get stuck between the Digger Finger’s tactile membrane and the buried object it was trying to sense. When this happened with rice, the trapped grains were large enough to completely obscure the shape of the object, though the occlusion could usually be cleared with a little robotic wiggling. Trapped sand was harder to clear, though the grains’ small size meant the Digger Finger could still sense the general contours of the target object.

Patel says that operators will have to adjust the Digger Finger’s motion pattern for different settings “depending on the type of media and on the size and shape of the grains.” The team plans to keep exploring new motions to optimize the Digger Finger’s ability to navigate various media.

Adelson says the Digger Finger is part of a program extending the domains in which robotic touch can be used. Humans use their fingers amidst complex environments, whether fishing for a key in a pants pocket or feeling for a tumor during surgery. “As we get better at artificial touch, we want to be able to use it in situations when you’re surrounded by all kinds of distracting information,” says Adelson. “We want to be able to distinguish between the stuff that’s important and the stuff that’s not.”

Funding for this research was provided, in part, by the Toyota Research Institute through the Toyota-CSAIL Joint Research Center; the Office of Naval Research; and the Norwegian Research Council.

 

By Daniel Ackerman | MIT News Office
Source: MIT News

Related Topics
  • CSAIL
  • Digger Finger
  • GelSight
  • MIT
  • Toyota