Artificial Intelligence Produces Realistic Sounds That Fool Humans

From MIT News: A video-trained system from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) could help robots understand how objects interact with the world. CSAIL researchers have demonstrated an algorithm that has effectively learned how to predict sound: when shown a silent video clip of an object being hit, it can produce a sound for the impact that is realistic enough to fool human viewers.

This “Turing Test for sound” represents much more than just a clever computer trick: researchers envision future versions of similar algorithms being used to automatically produce sound effects for movies and TV shows, as well as to help robots better understand objects’ properties... (full article) (full paper)
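The approach described in the paper learns a mapping from video frames to a sequence of sound features and then recovers an audible waveform from those features. Below is a minimal, hedged sketch of how such a video-to-sound-feature model might be structured in PyTorch: a small per-frame encoder feeding a recurrent regressor. The layer sizes, the 42-band sound representation, and the VideoToSound class name are illustrative assumptions, not the authors' actual architecture or code.

```python
# Hedged sketch (not the authors' code): per-frame visual features are fed to an
# LSTM that regresses a sound-feature sequence; a waveform would then be
# recovered from those features. All dimensions here are illustrative.
import torch
import torch.nn as nn

class VideoToSound(nn.Module):
    def __init__(self, feat_dim=512, hidden_dim=256, sound_bands=42):
        super().__init__()
        # Small per-frame visual encoder, standing in for the pretrained image
        # network a real system would use.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim), nn.ReLU(),
        )
        # Recurrent regressor over the frame sequence.
        self.rnn = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        # Map each hidden state to one frame of sound features
        # (e.g. subband envelope amplitudes).
        self.head = nn.Linear(hidden_dim, sound_bands)

    def forward(self, frames):  # frames: (batch, time, 3, H, W)
        b, t = frames.shape[:2]
        x = self.encoder(frames.reshape(b * t, *frames.shape[2:]))
        x = x.reshape(b, t, -1)
        h, _ = self.rnn(x)
        return self.head(h)  # (batch, time, sound_bands)

# Training would minimize a regression loss between predicted and ground-truth
# sound features; a waveform could then be synthesized, for example by
# retrieving the nearest real sound clip in feature space.
model = VideoToSound()
dummy = torch.randn(2, 16, 3, 112, 112)  # 2 clips, 16 frames each
print(model(dummy).shape)                # torch.Size([2, 16, 42])
```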

 

Featured Product

3MP HDR IP69K Camera for Robotics & Autonomous Vehicles

STURDeCAM31 from e-con Systems® is designed to make robotics and autonomous vehicles safer and more reliable. Powered by the Sony® ISX031 sensor and featuring a GMSL2 interface, this compact 3MP camera delivers 120 dB HDR and LFM (LED flicker mitigation) imaging with zero motion blur, even in the most challenging outdoor conditions. Engineered to automotive-grade standards, STURDeCAM31 is IP69K certified, making it resistant to dust, water, vibration, and extreme temperatures. With support for up to 8 synchronized cameras, it enables powerful surround-view and bird's-eye-view systems on NVIDIA® Jetson AGX Orin™.
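As a rough illustration of how frames from such a multi-camera rig might be consumed in software, the sketch below reads several V4L2 device nodes with OpenCV and tiles the frames for display. The device paths, camera count, and naive side-by-side view are assumptions for illustration only; the actual device nodes, pixel formats, and synchronization mechanism exposed by STURDeCAM31 on a Jetson are defined by e-con Systems' drivers and documentation.

```python
# Hedged sketch: reading frames from several V4L2 camera nodes with OpenCV.
# The /dev/videoN paths and camera count are assumptions, not the product's
# documented configuration.
import cv2

NUM_CAMERAS = 4  # the product supports up to 8 synchronized cameras

caps = [cv2.VideoCapture(f"/dev/video{i}", cv2.CAP_V4L2) for i in range(NUM_CAMERAS)]

try:
    while True:
        frames = []
        for cap in caps:
            ok, frame = cap.read()
            if not ok:
                break
            frames.append(frame)
        if len(frames) < NUM_CAMERAS:
            break
        # A real surround-view pipeline would undistort and stitch here;
        # this sketch simply shows the raw frames side by side.
        cv2.imshow("surround", cv2.hconcat(frames))
        if cv2.waitKey(1) == 27:  # Esc to quit
            break
finally:
    for cap in caps:
        cap.release()
    cv2.destroyAllWindows()
```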