Artificial Intelligence Produces Realistic Sounds That Fool Humans

From MIT News: A video-trained system from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) could help robots understand how objects interact with the world.

Researchers at CSAIL have demonstrated an algorithm that has effectively learned how to predict sound: when shown a silent video clip of an object being hit, the algorithm can produce a sound for the impact that is realistic enough to fool human viewers.

This “Turing Test for sound” represents much more than just a clever computer trick: researchers envision future versions of similar algorithms being used to automatically produce sound effects for movies and TV shows, as well as to help robots better understand objects’ properties... (full article) (full paper)
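The post does not describe how the CSAIL algorithm works internally. One common way to frame this kind of task is to predict a sound-feature vector from the silent video and then retrieve the closest-matching real recording from a database of impact sounds. The short NumPy sketch below illustrates only that retrieval idea; the predict_sound_features stub, the feature size, and the toy sound database are hypothetical placeholders for illustration, not components of the actual system.

import numpy as np

# Hypothetical feature size for a short impact sound (e.g., a flattened
# spectrogram-like representation). Chosen for illustration only.
FEATURE_DIM = 64

def predict_sound_features(video_frames):
    """Stand-in for a learned model that maps silent video frames of an
    impact to a predicted sound-feature vector. A real system would run a
    trained network here; this stub derives a deterministic pseudo-random
    vector from the frames so the example runs end to end."""
    per_channel = video_frames.mean(axis=(0, 1, 2))      # crude per-channel summary
    seed = int(per_channel.sum() * 1_000_000) % (2**32)
    return np.random.default_rng(seed).standard_normal(FEATURE_DIM)

def retrieve_closest_sound(predicted, sound_db):
    """Return the name of the stored sound whose feature vector is nearest
    (Euclidean distance) to the predicted features."""
    return min(sound_db, key=lambda name: np.linalg.norm(sound_db[name] - predicted))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy database of labeled impact sounds, each reduced to a feature vector.
    sound_db = {
        "wooden_thud": rng.standard_normal(FEATURE_DIM),
        "metal_clang": rng.standard_normal(FEATURE_DIM),
        "leaf_rustle": rng.standard_normal(FEATURE_DIM),
    }
    # Fake "silent video clip": 30 frames of 64x64 RGB pixels.
    video_frames = rng.random((30, 64, 64, 3))
    features = predict_sound_features(video_frames)
    print("Best-matching sound:", retrieve_closest_sound(features, sound_db))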

 

