The goal of the DARPA Robotics Challenge (DRC) is to generate groundbreaking research and development so that future robots can perform the most hazardous activities in disaster response operations, in tandem with their human counterparts, to reduce casualties, avoid further destruction, and save lives.
Disaster response robots require multiple layers of software to explore and interact with their environments, use tools, maintain balance, and communicate with human operators. In the Virtual Robotics Challenge (VRC), competing teams applied software of their own design to a simulated robot in an attempt to complete a series of tasks that are prerequisites for more complex activities.
Twenty-six teams from eight countries qualified to compete in the VRC, which ran from June 17-21, 2013. DARPA had allocated resources for the six top-scoring teams, but in an interesting twist, the good sportsmanship and generosity of other competitors will allow members of the top nine teams, listed below, to move forward:
- Team IHMC, Institute for Human and Machine Cognition, Pensacola, Fla. (52 points)
- WPI Robotics Engineering C Squad (WRECS), Worcester Polytechnic Institute, Worcester, Mass. (39 points)
- MIT, Massachusetts Institute of Technology, Cambridge, Mass. (34 points)
- Team TRACLabs, TRACLabs, Inc., Webster, Texas (30 points)
- JPL / UCSB / Caltech, Jet Propulsion Laboratory, Pasadena, Calif. (29 points)
- TORC, TORC / TU Darmstadt / Virginia Tech, Blacksburg, Va. (27 points)
- Team K, Japan (25 points)
- TROOPER, Lockheed Martin, Cherry Hill, N.J. (24 points)
- Case Western Reserve University, Cleveland, Ohio (23 points)
You can apply to take part in the Kinect for Windows developer kit program. This program, which begins in November 2013, will provide developers with tools and a pre-release sensor as soon as possible so they can start building new applications before general availability in 2014.
The program fee will be US$399 (or local equivalent) and offers the following benefits:
- Direct access to the Kinect for Windows engineering team via a private forum and exclusive webcasts
- Early SDK access (alpha, beta, and any updates along the way to release)
- Private access to all API and sample documentation
- A pre-release/alpha sensor
- A final, released sensor at launch
There are a limited number of spots in the program. Applications must be completed by July 31, 2013, 9:00 A.M. (Pacific Time). Apply here.
For a broad look at the new Kinect, check out Wired's first-look video.
If you really want something heavy with Monday's breakfast, below is a documentary about the Soviet soldiers, known as "biorobots," who sealed the Chernobyl reactor largely by hand.
- Atmel AVR Microcontroller: Compatible with Arduino so you can re-flash with your own firmware using the on-board bootloader
- ZigBee-Capable Radio: Communicate wirelessly with an 802.15.4-compliant radio, create mesh networks, control and monitor remotely
- Multi-Color (RGB) LED: Select from a full spectrum of colors
- 3-Axis Accelerometer: Detect free-falls, bumps, tilt angles
- Buzzer: Play notes or complete tunes, give audio responses to inputs
- RJ11 (6P4C) Expansion Connector: Use a standard phone cable to connect our Bluetooth/breakout boards or your own electronics to your Linkbot's power and I2C bus
- 3x Buttons: Easily control Linkbot modes and functions or write custom functions for button presses
- Micro-USB Connector: Connect to a computer or charger with a standard Micro-USB cable
- Rechargeable Lithium-Ion Battery: Runs your Linkbot for over 3 hours in most applications before needing a charge
- High Torque-to-Weight Ratio Motors: Light-but-strong motors produce up to 100 oz-in of torque
- Absolute Encoding: Precisely control and measure speeds and angles down to 0.5 degrees
- BaroboLink Software: Graphical interface lets you run programs, actuate motors and read sensors on your computer
- Polycarbonate Housing: Super-durable, drop-tested from second-story building (not recommended) so it can handle your demanding projects
- SnapConnector Mounting Surfaces: Quickly connect and remove wheels, connecting plates, grabbers, even multiple Linkbots; or connect your own accessories with standard screws
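The 0.5-degree figure in the Absolute Encoding bullet works out to 360 / 0.5 = 720 distinct counts per wheel revolution. As a rough illustration of that arithmetic (the helper functions below are hypothetical, not part of Barobo's actual software):

```python
# Illustrative only: converting between encoder counts and joint angles
# for an encoder with 0.5-degree resolution (720 counts per revolution).
# These helper names are made up for this sketch, not a Linkbot API.

COUNTS_PER_REV = 360 / 0.5  # 720 counts for a full revolution

def counts_to_degrees(counts):
    """Map a raw encoder count to an angle in degrees [0, 360)."""
    return (counts % COUNTS_PER_REV) * 0.5

def degrees_to_counts(degrees):
    """Map a target angle in degrees to the nearest encoder count."""
    return round((degrees % 360) / 0.5)

print(counts_to_degrees(180))   # 90.0 degrees
print(degrees_to_counts(90))    # 180 counts
```

The key point is simply that finer angular resolution means more counts per revolution, which is what lets the motors be commanded and measured down to half a degree.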
Arduino Yún is the combination of a classic Arduino Leonardo (based on the ATmega32U4 microcontroller) with an Atheros AR9331 Wi-Fi system-on-a-chip running Linino, a customized MIPS GNU/Linux based on OpenWRT, the most widely used Linux distribution for embedded devices.
Available at the end of June for $69.
- MCU – Atmel ATmega32U4 @ 16 MHz (the same chip used in the Leonardo board) with 2.5KB SRAM and 32KB flash
- SoC – Atheros AR9331 MIPS-based Wi-Fi SoC running Linino, Arduino's own Linux distribution based on OpenWRT; it's the same chipset used in the TP-Link WR703N router
- Storage – microSD card slot
- USB – micro USB connector + full USB host port
- Connectivity – Ethernet + Wi-Fi
- I/O – 20 digital input/output pins (of which 7 can be used as PWM outputs and 12 as analog inputs)
The Arduino Robot is the first official Arduino on wheels. The robot has two processors, one on each of its two boards. The Motor Board controls the motors, and the Control Board reads sensors and decides how to operate. Each of the boards is a full Arduino board programmable using the Arduino IDE.
Both Motor and Control boards are microcontroller boards based on the ATmega32u4. The Robot has many of its pins mapped to on-board sensors and actuators.
Programming the robot is similar to programming the Arduino Leonardo. Both processors have built-in USB communication, eliminating the need for a secondary USB-to-serial processor and allowing the Robot to appear to a connected computer as a virtual (CDC) serial / COM port.
As always with Arduino, every element of the platform – hardware, software and documentation – is freely available and open-source.
On sale at the Maker Faire in San Mateo (May 17-19) and available online starting in July.
GOOGLE I/O 2013
The Moscone Center, San Francisco
Makr Shakr is a new robotic bartending system that allows users to create personalized cocktail recipes in real time through a smartphone application and transform them into crowd-sourced drink combinations. Each cocktail is assembled by three robotic arms, whose movements - visualized on a large display positioned behind the bar - mimic the actions of a bartender, from shaking a martini to thinly slicing a lemon garnish. The system explores the new dynamics of social creation and consumption - 'design, make and enjoy' - all in just the time needed to prepare a cocktail.
Willow Garage is proud to announce the initial release of MoveIt!: new software that lets you build advanced applications integrating motion planning, kinematics, and collision checking with grasping, manipulation, navigation, perception, and control. MoveIt! is robot-agnostic software that can be quickly set up with your robot if a URDF representation of the robot is available. The MoveIt! Setup Assistant lets you configure MoveIt! for any robot, allowing you to visualize and interact with the robot model quickly.
MoveIt! can incorporate both actual sensor data and simulated models to build an environment representation. 3D sensor information can be automatically integrated in real time into the representation of the world that MoveIt! maintains. CAD models can also be imported into the same world representation if desired. Collision-free motion planning, execution, and monitoring are core capabilities that MoveIt! provides for any robot. MoveIt! updates its representation of the environment on the fly, enabling reactive motion planning and execution, which is essential for applications in human-robot collaborative environments.
MoveIt! interfaces with controllers through a standard ROS interface, allowing for easy interoperability, i.e., the ability to use the same higher-level software with a variety of robots without needing to change code. MoveIt! is architected to be flexible, using a plugin architecture that lets users integrate their own custom components while still providing out-of-the-box functionality through default implementations. Furthermore, the ROS communication and configuration layer of MoveIt! is separated from core computational components such as motion planning and collision checking; the latter are provided separately as C++ libraries.
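Since getting started with MoveIt! hinges on having a URDF model, here is what a minimal one can look like for a hypothetical single-joint arm. All names and numbers below are illustrative placeholders, not values from any particular robot:

```xml
<!-- Minimal illustrative URDF: one revolute joint between two links. -->
<robot name="example_arm">
  <link name="base_link"/>
  <link name="upper_arm"/>
  <joint name="shoulder" type="revolute">
    <parent link="base_link"/>
    <child link="upper_arm"/>
    <origin xyz="0 0 0.1" rpy="0 0 0"/>
    <axis xyz="0 0 1"/>
    <limit lower="-1.57" upper="1.57" effort="10" velocity="1.0"/>
  </joint>
</robot>
```

A real robot's URDF would also carry visual, collision, and inertial elements for each link, but even a skeleton like this is enough for the Setup Assistant to generate a working MoveIt! configuration.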
Roboteq, Inc. has launched a Kickstarter project named RIO (for Raspberry IO), aimed at creating an intelligent I/O card that stacks on top of the $35 Raspberry Pi Linux single-board computer.
Power for the Pi from Any DC Source
RIO includes a 3A DC/DC converter that may be connected to a 10V to 40V DC supply, and generates the 5V needed by the Pi and RIO cards.
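As a back-of-the-envelope sketch of what that converter does, here is the input current it would draw from a 12 V battery while supplying a Raspberry Pi. The Pi's consumption and the converter efficiency are assumed figures for illustration, not Roboteq specifications:

```python
# Rough power-budget estimate for a buck converter feeding a Raspberry Pi.
# The Pi's draw (700 mA at 5 V) and the 85% efficiency are assumptions
# for illustration, not figures from Roboteq's documentation.

V_OUT, I_OUT = 5.0, 0.7   # 5 V rail, ~700 mA for an early Raspberry Pi
V_IN = 12.0               # e.g. a 12 V battery (within the 10-40 V range)
EFFICIENCY = 0.85         # a typical figure for a small DC/DC buck converter

p_out = V_OUT * I_OUT     # power delivered to the Pi: 3.5 W
p_in = p_out / EFFICIENCY # power drawn from the battery
i_in = p_in / V_IN        # input current at 12 V

print(f"{p_out:.2f} W out, {p_in:.2f} W in, {i_in:.3f} A from the battery")
```

The practical takeaway is that the wide 10-40 V input range lets the same card run a Pi from car batteries, solar setups, or industrial supplies while drawing well under an amp.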
21 I/O Lines to Connect Just About Anything
RIO provides a total of 8 digital outputs rated up to 1A and 30V max, which may also be used as digital inputs.
The card includes 13 inputs, each of which can be configured as a digital input, 0-5V analog input with 12-bit resolution, or as a timer input. In the timer mode, the inputs can capture pulse width, frequency, quadrature encoder counts, or duty cycle. Most of the input pins can also be configured as PWM outputs for driving RC servos, or dimmable lights.
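To make the timer mode concrete, here is a small sketch of how pulse width, frequency, and duty cycle relate for a captured signal. The numbers are made-up examples of an RC-servo-style pulse, and the code is illustrative, not RIO firmware:

```python
# Illustrative: deriving frequency and duty cycle from a captured pulse.
# A timer input typically records the high time and the full period;
# the values below are made-up examples, not RIO measurements.

high_time = 0.0015   # seconds the signal was high (1.5 ms)
period = 0.020       # full cycle time (20 ms, a typical RC-servo frame)

frequency = 1.0 / period            # cycles per second
duty_cycle = high_time / period     # fraction of the period spent high

print(f"{frequency:.0f} Hz, duty cycle {duty_cycle:.1%}")
```

The same three quantities work in the other direction for the PWM output mode: choosing a period and a high time fixes the frequency and duty cycle the pin generates.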
Serial Connectivity and CAN Networking
Two serial ports are present on the card. One is fully RS232-compliant with a programmable baud rate up to 115200 bits/s for connection to motor controllers, scanners, a PC, or any other RS232 device. The second is RS485-compatible, enabling, among other things, DMX512 connectivity to light-show equipment. Optionally, a third serial port uses TTL levels for direct interfacing with non-buffered, non-inverted USARTs such as those found on most microcontrollers, like the Arduino.
A CAN bus interface is also present on the RIO card for connecting to CAN-compatible devices over a low-cost twisted-pair network at speeds up to 1 Mbit/s.
Full Kickstarter details here.
The ARM-H track of DARPA's Autonomous Robotic Manipulation (ARM) program focuses on the development of robust, low-cost, and dexterous robotic hand hardware. DARPA funded performers to design and build hand mechanisms that could replace the claw-like hands currently used on robots with hands incorporating three to four fingers and usable palms. The teams successfully produced hands that can be manufactured for as little as $3,000 per unit (in batches of 1,000 or more), down from the $50,000 cost of current technology. The new hands also incorporate sufficient dexterity to enable in-hand manipulation of objects when controlled by a skilled operator.
Demonstration of the first controlled flight of an insect-sized robot is the culmination of more than a decade's work, led by researchers at the Harvard School of Engineering and Applied Sciences (SEAS) and the Wyss Institute for Biologically Inspired Engineering at Harvard. Half the size of a paperclip, weighing less than a tenth of a gram, the robot was inspired by the biology of a fly, with submillimeter-scale anatomy and two wafer-thin wings that flap almost invisibly, 120 times per second.
IEEE Spectrum has a short article about how the Italian Institute of Technology and the Swiss Federal Institute of Technology are capturing motion data from horses walking, trotting, etc., and transferring it to the locomotion of their quadruped robots.