From MIT News: Video-trained system from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) could help robots understand how objects interact with the world. Researchers at CSAIL have demonstrated an algorithm that has effectively learned to predict sound: when shown a silent video clip of an object being hit, the algorithm can produce a sound for the impact that is realistic enough to fool human viewers.
This “Turing Test for sound” represents much more than a clever computer trick: the researchers envision future versions of similar algorithms automatically producing sound effects for movies and TV shows, as well as helping robots better understand objects’ physical properties...
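The article does not detail the system's architecture, but one simple way to frame the underlying idea is example-based prediction: pair each training clip's video features with its recorded audio, then give a silent query clip the audio of its nearest neighbor in feature space. The sketch below is a toy illustration of that framing only, not the CSAIL system; the feature vectors stand in for the output of a real video feature extractor (e.g. a CNN over frames), which is out of scope here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "database": one feature vector per training clip (stand-ins for
# learned video features) paired with that clip's recorded audio.
train_video_feats = rng.normal(size=(5, 16))   # 5 clips, 16-D features
train_audio = [np.sin(np.linspace(0, 2 * np.pi * f, 100)) for f in range(1, 6)]

def predict_sound(query_feat, video_feats, audio_bank):
    """Return the audio of the training clip whose video features
    are closest (Euclidean distance) to the silent query clip."""
    dists = np.linalg.norm(video_feats - query_feat, axis=1)
    return audio_bank[int(np.argmin(dists))]

# A query clip whose features sit near training clip 2 is assigned
# clip 2's audio.
query = train_video_feats[2] + 0.01 * rng.normal(size=16)
sound = predict_sound(query, train_video_feats, train_audio)
assert np.allclose(sound, train_audio[2])
```

A learned system would replace both the hand-made features and the retrieval step with trained models, but the input/output contract (silent video in, plausible audio out) is the same.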
Underwater Robotics Competition Challenges Students to Tackle Ocean and Space Science and Exploration
AIAA and Drone World Expo Partner to Create New, Innovative Unmanned Aerial Systems Research Competition
Renesas Electronics Develops Two-Port On-Chip SRAM Specialized in Improving Video Processing Performance of Vehicles for the Autonomous-Driving Era