Earlier this week Microsoft announced that they would officially be bringing their Kinect hardware to the Windows platform. The hardware is mostly the same, but new firmware allows the depth camera to see objects as close as 50 cm away without losing accuracy or precision. Microsoft also says Kinect for Windows is 20% faster than the last release, and that the accuracy of skeletal tracking and joint recognition has been substantially improved.
Microsoft has allowed the beta SDK to be used with the Xbox Kinect and will continue to give existing projects access to it, but all future projects will need to purchase the Kinect for Windows hardware in order to use upcoming SDK releases. Kinect for Windows and the SDK will cost $249 ($149 for an academic license).
The consumer electronics show CES is this week, so we will probably see a couple of new 3D printers announced. MakerBot has been teasing a new version of their Thing-O-Matic, and today 3D@Home announced their Cube printer. The Cube will cost $1,299 and prints standard .STL files in ABS plastic. 3D@Home also plans to offer a print-on-demand service for larger models.
Hyperspectral imaging, also called imaging spectroscopy, is a method of obtaining the spectral content of each pixel in a 2D image. The spectral data can be used to identify chemical compounds or materials. Until now, hyperspectral imaging devices have been very expensive, starting at around $25,000. Engineers at the Vienna University of Technology and the University of Arizona have shown that they can perform CTIS (computed tomography imaging spectrometry) using an unmodified consumer camera. The device they have developed can be used in a hyperspectral imaging mode that measures the spectrum of a whole image with up to 5-nm spectral resolution and 120 x 120-pixel spatial resolution, and it can be built for under $1,000.
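To make the idea concrete: a hyperspectral image is a data cube with two spatial axes plus a full spectrum at every pixel, and one common way to identify a material is to compare a pixel's spectrum against a library of known reference spectra. Here is a minimal sketch of that comparison (our own illustration, not code from the paper; the spectra and material names are made-up example data):

```python
import math

def spectral_angle(a, b):
    """Angle between two spectra treated as vectors; smaller = more similar."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

def classify_pixel(spectrum, references):
    """Return the reference material whose spectrum is closest in angle."""
    return min(references, key=lambda name: spectral_angle(spectrum, references[name]))

# Hypothetical 4-band reference spectra and one pixel from the cube:
references = {
    "vegetation": [0.10, 0.20, 0.60, 0.80],
    "water":      [0.30, 0.20, 0.10, 0.05],
}
pixel = [0.12, 0.22, 0.55, 0.75]
print(classify_pixel(pixel, references))  # -> vegetation
```

The spectral angle is a standard similarity measure in remote sensing because it is insensitive to overall brightness, comparing only the shape of the spectrum.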
Swiss architects Gramazio & Kohler and Raffaello D’Andrea are running a fully automated construction project at the FRAC Centre in Orléans, France, that uses flying robots to assemble a six-meter-high tower of 1,500 polystyrene foam bricks. The exhibit runs until February 19, 2012. The same team previously used a robot called "R-O-B" to build a looping wall in New York and the award-winning Structural Oscillations installation at the 2008 Venice Architecture Biennale.
Aldebaran Robotics just released a promo video for their next NAO robot. The new model includes 2 cameras, 4 microphones, a sonar rangefinder, 2 IR emitters and receivers, an inertial board, 9 tactile sensors, and 8 pressure sensors. NAOqi, their proprietary embedded software, provides functionality for tasks such as speech recognition and object recognition, along with access to all the sensors. Development can take place on Windows, Mac OS, or Linux, and the API can be called from many languages, including C++, Python, Urbi, and .Net.
Humans are good at recognizing full facial expressions, which present a rich source of affective information. However, psychological studies have shown that affect also manifests itself as micro-expressions: very rapid involuntary facial expressions, lasting between 1/25 and 1/3 of a second, which give a brief glimpse of feelings that people undergo but try not to express. Researchers at Oxford University and Oulu University are developing software that can recognize these micro-expressions. The initial experiments indicate that the approach can distinguish deceptive from truthful micro-expressions, but further experiments are needed to confirm this. The full paper is available here.
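Expressions this brief put a hard constraint on the camera: at the short end of that range, an ordinary 25–30 fps camera captures only a single frame of the expression. A quick back-of-the-envelope check (our own illustration; the frame rates are example values, not from the paper):

```python
# How many whole frames does a camera capture during a micro-expression?
def frames_captured(duration_s, fps):
    """Whole frames falling within an expression of the given duration."""
    return int(duration_s * fps)

print(frames_captured(1 / 25, 30))   # 40 ms expression at 30 fps: 1 frame
print(frames_captured(1 / 25, 100))  # same expression at 100 fps: 4 frames
```

This is why work on spotting micro-expressions typically relies on high-speed video rather than standard webcam footage.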