Artificial Intelligence Produces Realistic Sounds That Fool Humans

From MIT News: A video-trained system from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) could help robots understand how objects interact with the world. CSAIL researchers have demonstrated an algorithm that has effectively learned how to predict sound: when shown a silent video clip of an object being struck, the algorithm produces a sound for the impact that is realistic enough to fool human viewers.

This “Turing Test for sound” represents much more than a clever computer trick: the researchers envision future versions of similar algorithms automatically producing sound effects for movies and TV shows, as well as helping robots better understand objects’ properties... (full article) (full paper)
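To make the idea concrete, below is a minimal, hypothetical sketch of how a video-to-sound predictor of this kind could be structured: a convolutional encoder summarizes each video frame, and a recurrent network regresses per-frame audio features that a separate synthesis step could turn into a waveform. The architecture, layer sizes, and feature choices here are illustrative assumptions for demonstration, not the CSAIL team's published design.

```python
# Illustrative sketch only: a video-to-sound regressor in the spirit of the
# work described above. Architecture and sizes are assumptions, not the
# published model.
import torch
import torch.nn as nn

class FrameEncoder(nn.Module):
    """Encode each video frame into a compact feature vector."""
    def __init__(self, feat_dim=256):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),           # global pooling -> (B*T, 64, 1, 1)
        )
        self.fc = nn.Linear(64, feat_dim)

    def forward(self, frames):                 # frames: (B, T, 3, H, W)
        b, t, c, h, w = frames.shape
        x = self.conv(frames.view(b * t, c, h, w)).flatten(1)
        return self.fc(x).view(b, t, -1)       # (B, T, feat_dim)

class SoundPredictor(nn.Module):
    """Map a sequence of frame features to per-frame audio features
    (e.g. band energies); a vocoder or exemplar lookup could then
    convert these into an audible waveform."""
    def __init__(self, feat_dim=256, hidden=256, audio_bands=42):
        super().__init__()
        self.encoder = FrameEncoder(feat_dim)
        self.rnn = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, audio_bands)

    def forward(self, frames):
        feats = self.encoder(frames)           # per-frame visual features
        out, _ = self.rnn(feats)               # temporal context across frames
        return self.head(out)                  # (B, T, audio_bands)

if __name__ == "__main__":
    model = SoundPredictor()
    video = torch.randn(2, 15, 3, 128, 128)    # 2 silent clips, 15 frames each
    audio_feats = model(video)
    print(audio_feats.shape)                   # torch.Size([2, 15, 42])
```

In such a setup, training would pair silent video frames with audio features extracted from the original soundtrack, so the network learns to associate the appearance of an impact with the sound it makes.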

 

