ROSCon 2012, the first ROS developers' conference, is a weekend event comprising tech talks and tutorials that will introduce you to new ROS tools and libraries. It takes place in St. Paul, Minnesota, May 19-20, right after the 2012 IEEE International Conference on Robotics and Automation.
The Xtion Pro Live by ASUS is similar to Microsoft's Kinect hardware. It includes a depth sensor, an RGB camera, and a set of microphones. Aimed at developers, the device is powered entirely through its USB 2.0 connection rather than requiring an additional power connection like the Kinect, and it's also slightly smaller. I Heart Robotics has received one and has posted additional information here.
It's currently available for purchase only directly from ASUS and is on backorder.
PETMAN is an anthropomorphic robot for testing chemical protection clothing. Unlike previous suit testers, which had to be supported mechanically and had a limited repertoire of motion, PETMAN will balance itself and move freely; walking, crawling and doing a variety of suit-stressing calisthenics during exposure to chemical warfare agents.
Objet Ltd. will be demonstrating its newest 3D printer at Wired magazine's inaugural conference in London. The Objet260 Connex can use 60 different materials and can simultaneously build 14 different materials into a single model part, with 16-micron print-layer accuracy.
DigInfo.tv, a Tokyo-based website, recently posted a video from a presentation by Japan's Ministry of Defense. The flying orb weighs 350 g, measures 42 cm in diameter, and is made of commercially available parts costing a total of around US$1,400. The video below is from the public unveiling:
Odos imaging's 1.3-megapixel 2+3D camera can capture accurate 3D images at 100 frames per second, allowing the system to capture very fast-moving objects without degradation, even in the brightest sunlight. Combining proprietary technology with conventional 2D image capture, an Odos imaging solution provides unambiguous 3D images at video rates from a single unit. Very short, intense pulses of invisible light illuminate the scene; the high intensity of each pulse minimizes the effect of ambient light and allows for outdoor operation. These pulses are reflected by objects in the scene and detected by the image sensor, and proprietary algorithms convert the detected pulses into a distance measurement. Simultaneously, a conventional 2D image of the scene is captured, so each pixel on the sensor provides both distance and intensity information.
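The distance measurement described above follows the standard pulsed time-of-flight principle: a light pulse travels to the object and back, so the range is half the round-trip time multiplied by the speed of light. A minimal sketch of that idea (not Odos imaging's proprietary algorithm):

```python
# Pulsed time-of-flight range calculation: distance is half the
# round-trip travel time of a light pulse times the speed of light.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_distance(round_trip_seconds: float) -> float:
    """Distance in meters to the object that reflected the pulse."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after 20 nanoseconds puts the object about 3 m away.
print(f"{tof_distance(20e-9):.3f} m")  # → 2.998 m
```

The nanosecond timescales involved are why these cameras need specialized sensors: at 1 meter of range resolution, the sensor must resolve round-trip differences of under 7 nanoseconds.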
IEEE Spectrum has an article explaining how Google's new autonomous vehicle project works. The article is based on a keynote that Sebastian Thrun and Chris Urmson recently gave at the IEEE International Conference on Intelligent Robots and Systems. The article can be found here.
The IEEE International Conference on Intelligent Robots and Systems took place a few weeks ago in San Francisco. Willow Garage put together a nice montage video of some of the robots on display. Enjoy.
Torsten Kröger of Stanford programmed a robot arm to play the block-stacking game Jenga in order to demonstrate the potential of multi-sensor integration in industrial manipulation. The robot achieved a record height of 28 stages: ten additional stages, consisting of 29 blocks, placed onto the top of the original tower.