DIY Drones is an excellent community site for amateur unmanned vehicle development. Using Arduino as a foundation, they have created the ArduPilot Mega universal autopilot hardware. It combines sophisticated IMU-based autopilot electronics with free Arduino-based autopilot software that can turn any RC vehicle into a fully autonomous UAV. The software is open source and currently has three variations: ArduPlane, for any fixed-wing aircraft; ArduCopter, for any rotary-wing aircraft; and ArduRover, for any ground or water-based vehicle. The site has an active forum as well as a store where you can purchase the ArduPilot Mega controller, other Arduino hardware, or complete drone kits.
STMicroelectronics has unveiled a smart-suit prototype with sewn-in ST iNEMO® multi-sensor motion nodes that recognize complex movements of the wearer’s body and translate them to a digital model with outstanding precision and speed. The current-generation prototype demonstrates the performance of the miniaturized iNEMO nodes attached to each arm, forearm, thigh, and calf, with two on the back; additional nodes can be mounted on the hands, shoes, or head. In tests with realistic, complex human body motions, the suit’s spatial accuracy deviated by less than 0.5 degrees during movement, and the sensor data was processed and applied to the skeleton model in under 15 milliseconds.
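ST has not published its reconstruction algorithm, but the basic idea of turning per-segment orientation sensors into a skeleton pose can be sketched with simple forward kinematics: each node reports its segment's orientation, and joint positions are the running sum of each rotated segment vector. The function names and the planar example below are illustrative assumptions, not ST's implementation.

```python
import numpy as np

def rot_z(theta):
    """Rotation matrix about the z-axis (theta in radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def limb_joint_positions(segment_lengths, segment_rotations):
    """Chain per-segment orientations into world-frame joint positions.

    Each sensor node is assumed to report the absolute orientation of
    its segment; each segment extends along its local x-axis.
    """
    positions = [np.zeros(3)]
    for length, R in zip(segment_lengths, segment_rotations):
        # Rotate the segment vector into the world frame, then accumulate
        positions.append(positions[-1] + R @ np.array([length, 0.0, 0.0]))
    return positions

# Example: upper arm (30 cm) held level, forearm (25 cm) bent 90 degrees
arm = limb_joint_positions([0.30, 0.25], [rot_z(0.0), rot_z(np.pi / 2)])
elbow, wrist = arm[1], arm[2]
```

A full-body system repeats this chaining over the skeleton's kinematic tree and typically fuses accelerometer, gyroscope, and magnetometer data to estimate each node's orientation in the first place.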
At CES Texas Instruments was showing off its new open source platform MAVRK. MAVRK is an acronym for Modular and Versatile Reference Kit. The MAVRK platform contains a motherboard, several MAVRK modules, and firmware to communicate between the modules. MAVRK modules are reference designs built around TI silicon that connect to the motherboard with a common footprint. With several modules connected, a user can configure multiple combinations of RF, AD/DA, transceivers, signal conditioning, and driver circuits as a system-level design. In the video, one motherboard hosts the TCA8418 keypad scanner to sense user input. The MSP430F5438A on this motherboard decodes the keypad input for motor spin speed and the joystick direction for spin direction. It then beams this information to a second motherboard connected to robotic treads, with an MSP430F5137 processing the signals and an MSP430F2274 controlling the signals to the DRV8814 motor driver. The DRV8814 drives the two motors that turn the robot treads.
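The demo's control logic reduces to a small input-to-output mapping: a keypad digit sets the spin speed and the joystick direction sets the sign of each tread's rotation, which the firmware would translate into PWM duty and direction bits for the DRV8814's two H-bridges. The sketch below is a hypothetical mapping for illustration, not TI's actual firmware.

```python
def drive_command(keypad_digit, joystick):
    """Map user input to a differential-drive command (hypothetical).

    keypad_digit: 0-9, scaled to a PWM duty cycle for motor speed.
    joystick: one of 'forward', 'reverse', 'left', 'right'.
    Returns (duty, (left_dir, right_dir)) where each direction is
    +1 for forward rotation and -1 for reverse.
    """
    duty = keypad_digit / 9.0          # spin speed from the keypad scan
    directions = {
        'forward': (+1, +1),
        'reverse': (-1, -1),
        'left':    (-1, +1),           # skid-steer: counter-rotate treads
        'right':   (+1, -1),
    }
    return duty, directions[joystick]

# Full speed, pivot left: treads counter-rotate
duty, (left, right) = drive_command(9, 'left')
```

In the actual system this mapping is split across processors: the MSP430F5438A encodes the command, radios it to the second board, and the MSP430F2274 drives the DRV8814's input pins accordingly.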
Earlier this week Microsoft announced that it would officially be bringing its Kinect hardware to the Windows platform. The hardware is mostly the same, but new firmware allows the depth camera to see objects as close as 50 cm away without losing accuracy or precision. Microsoft also says Kinect for Windows is 20% faster than the last release, and that the accuracy of skeletal tracking and joint recognition has been substantially improved.
Microsoft has allowed the beta SDK to be used with the Xbox Kinect and will continue to give existing projects access to it, but all future projects will need to purchase the Kinect for Windows hardware to gain access to upcoming SDK releases. Kinect for Windows and the SDK will cost $249 ($149 for an academic license).
Hyperspectral imaging, also called imaging spectroscopy, is a method of obtaining the spectral content of each pixel in a 2D image. The spectral data can be used to identify chemical compounds or materials. Until now, hyperspectral imaging devices have been very expensive, starting at around $25,000. Engineers at the Vienna University of Technology and the University of Arizona have shown that they can perform computed-tomography imaging spectrometry (CTIS) using an unmodified consumer camera. The device they have developed can be used in a hyperspectral imaging mode that allows the spectral measurement of a whole image with up to 5-nm spectral resolution and 120 x 120-pixel spatial resolution, and it can be built for under $1,000.
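Once each pixel has a measured spectrum, material identification can be as simple as comparing it against known reference spectra. One common technique (not necessarily the one used by these researchers) is the spectral angle mapper, sketched below with made-up reference spectra for illustration.

```python
import numpy as np

def spectral_angle(a, b):
    """Angle between two spectra treated as vectors; smaller = more similar.

    Using the angle rather than Euclidean distance makes the comparison
    insensitive to overall illumination (vector magnitude).
    """
    cos_sim = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos_sim, -1.0, 1.0))

def classify_pixel(pixel_spectrum, references):
    """Assign the pixel to the reference with the smallest spectral angle."""
    return min(references, key=lambda name: spectral_angle(pixel_spectrum, references[name]))

# Illustrative (made-up) reference spectra sampled at five wavelengths
references = {
    'chlorophyll': np.array([0.10, 0.40, 0.20, 0.80, 0.90]),
    'water':       np.array([0.60, 0.50, 0.30, 0.10, 0.05]),
}
label = classify_pixel(np.array([0.12, 0.38, 0.22, 0.75, 0.85]), references)
```

In practice this comparison runs over every pixel of the 120 x 120 image, producing a material map of the scene.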
Swiss architects Gramazio & Kohler and Raffaello D’Andrea are running a fully automated construction project at the FRAC Centre in Orléans, France, that uses flying robots to assemble a six-meter-high tower constructed of 1,500 polystyrene foam bricks. The exhibit runs until February 19, 2012. The same team previously used a robot called "R-O-B" to build a looping wall in New York and the award-winning Structural Oscillations installation at the 2008 Venice Architecture Biennale.
Aldebaran Robotics just released a promo video for their next NAO robot. The new model includes 2 cameras, 4 microphones, a sonar rangefinder, 2 IR emitters and receivers, an inertial board, 9 tactile sensors, and 8 pressure sensors. NAOqi, their proprietary embedded software, provides functionality for tasks such as speech recognition and object recognition, and gives access to all the sensors. Code development can take place in Windows, Mac OS, or Linux, and NAOqi can be called from many languages, including C++, Python, Urbi, and .Net.
Humans are good at recognizing full facial expressions, which present a rich source of affective information. However, psychological studies have shown that affect also manifests itself as micro-expressions: very rapid (1/25 to 1/3 of a second) involuntary facial expressions that give a brief glimpse of feelings people experience but try not to express. Researchers at Oxford University and Oulu University are developing software that can recognize these micro-expressions. The initial experiments indicate that the approach can distinguish deceptive from truthful micro-expressions, but further experiments are needed to confirm it. The full paper is available here.
Professor George Whitesides, Robert Shepherd, and their colleagues from Harvard University have designed a prototype soft, agile robot capable of crawling and squeezing under obstacles that current rigid robots can't handle. The robot's motion is controlled by a series of chambers within its elastomer layer that can be inflated with compressed air, fed through tubes attached to the robot.
FroboMind is a conceptual architecture for field robots. The idea is to use the same generic architecture on all field robots, thereby maximizing efficiency, reliability, modularity, extensibility, and code reuse. FroboMind is open source and is implemented in the Robot Operating System (ROS). The project includes ASuBot, a research project focusing on weeding in organic orchards; Armadillo, a tracked tool carrier with precision navigation in row crops; Casmobot, a semi-autonomous slope mower; and Hortibot, a tool carrier capable of traversing a field of row crops.
A Boeing video shows its Unmanned Little Bird helicopter landing autonomously on a trailer moving along a runway at speeds up to 15 knots. The test was part of a program with France's Thales and DCNS to demonstrate technology for unmanned VTOL deck landings and take-offs on moving ships.
Here is a video of some of the highlights from The International Micro Air Vehicle Conference and Competition held in September. The competition part of the conference required the tiny autonomous flying robots to perform indoor and outdoor missions such as collecting objects from within a structure, popping balloons, and dropping objects in specific locations. A summary of all the aircraft that participated is available in this PDF.
Ideas in Action is a weekly PBS program hosted by Jim Glassman. A recent episode focused on the future of the American economy and the role that intelligent computers and robots will play in it. His two guests were Martin Ford, author of "The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future," and Dr. Robin Hanson, Associate Professor of Economics at George Mason University. The entire episode is available here.