The rover NASA will send to Mars in 2020 should look for signs of past life, collect samples for possible future return to Earth, and demonstrate technology for future human exploration of the Red Planet, according to a report provided to the agency.
The 154-page document was prepared by the Mars 2020 Science Definition Team, which NASA appointed in January to outline scientific objectives for the mission. The team, composed of 19 scientists and engineers from universities and research organizations, proposed a mission concept that could accomplish several high-priority planetary science goals and be a major step in meeting President Obama's challenge to send humans to Mars in the 2030s.
"Crafting the science and exploration goals is a crucial milestone in preparing for our next major Mars mission," said John Grunsfeld, NASA's associate administrator for science in Washington. "The objectives determined by NASA with the input from this team will become the basis later this year for soliciting proposals to provide instruments to be part of the science payload on this exciting step in Mars exploration."
NASA will conduct an open competition for the payload and science instruments. They will be placed on a rover similar to Curiosity, which landed on Mars almost a year ago. Using Curiosity's design will help minimize mission costs and risks and deliver a rover that can accomplish the mission objectives.
The team's report details how the rover would use its instruments for visual, mineralogical and chemical analysis down to microscopic scale to understand the environment around its landing site and identify biosignatures, or features in the rocks and soil that could have been formed biologically.
"The Mars 2020 mission concept does not presume that life ever existed on Mars," said Jack Mustard, chairman of the Science Definition Team and a professor of geological sciences at Brown University in Providence, R.I.
"However, given the recent Curiosity findings, past Martian life seems possible, and we should begin the difficult endeavor of seeking the signs of life. No matter what we learn, we would make significant progress in understanding the circumstances of early life existing on Earth and the possibilities of extraterrestrial life."
"The Mars 2020 mission will provide a unique capability to address the major questions of habitability and life in the solar system," said Jim Green, director of NASA's Planetary Science Division in Washington. "This mission represents a major step towards creating high-value sampling and interrogation methods, as part of a broader strategy for sample returns by planetary missions."
From Stolidus Simulations:
Robot Vacuum Simulator 2013 is a groundbreaking simulator taking place in the incredible world of Robot Vacuum cleaners.
The simulator puts you in the shoes of a Robot Vacuum cleaner and sends you on a journey through an apartment cleaning up the dust of man.
- The most realistic robot vacuum simulator ever.
- Incredible single-player simulation
- Duel with your friends in 2 player mode
- A fully open world
- Fantastic music
- A main menu
The goal of the DARPA Robotics Challenge (DRC) is to generate groundbreaking research and development so that future robots can perform the most hazardous activities in future disaster response operations, in tandem with their human counterparts, in order to reduce casualties, avoid further destruction, and save lives.
Disaster response robots require multiple layers of software to explore and interact with their environments, use tools, maintain balance and communicate with human operators. In the Virtual Robotics Challenge (VRC), competing teams applied software of their own design to a simulated robot in an attempt to complete a series of tasks that are prerequisites for more complex activities.
Twenty-six teams from eight countries qualified to compete in the VRC, which ran from June 17-21, 2013. DARPA had allocated resources for the six teams that did best, but in an interesting twist, good sportsmanship and generosity will allow members of the top nine teams, listed below, to move forward:
- Team IHMC, Institute for Human and Machine Cognition, Pensacola, Fla. (52 points)
- WPI Robotics Engineering C Squad (WRECS), Worcester Polytechnic Institute, Worcester, Mass. (39 points)
- MIT, Massachusetts Institute of Technology, Cambridge, Mass. (34 points)
- Team TRACLabs, TRACLabs, Inc., Webster, Texas (30 points)
- JPL / UCSB / Caltech, Jet Propulsion Laboratory, Pasadena, Calif. (29 points)
- TORC, TORC / TU Darmstadt / Virginia Tech, Blacksburg, Va. (27 points)
- Team K, Japan (25 points)
- TROOPER, Lockheed Martin, Cherry Hill, N.J. (24 points)
- Case Western Reserve University, Cleveland, Ohio (23 points)
You can apply to take part in the Kinect for Windows developer kit program. This program, which begins in November 2013, will provide developers with tools and a pre-release sensor as soon as possible so they can start building new applications before general availability in 2014.
The program fee will be US$399 (or local equivalent) and offers the following benefits:
- Direct access to the Kinect for Windows engineering team via a private forum and exclusive webcasts
- Early SDK access (alpha, beta, and any updates along the way to release)
- Private access to all API and sample documentation
- A pre-release/alpha sensor
- A final, released sensor at launch
There are a limited number of spots in the program. Applications must be completed by July 31, 2013, 9:00 A.M. (Pacific Time). Apply here.
For broad information about the new Kinect, check out Wired's first look video.
If you really want something heavy for Monday's breakfast, below is a documentary about the Russian soldiers, known as "biorobots", who sealed the reactor manually for the most part.
- Atmel AVR Microcontroller: Compatible with Arduino so you can re-flash with your own firmware using the on-board bootloader
- ZigBee-Capable Radio: Communicate wirelessly with an 802.15.4-compliant radio, create mesh networks, control and monitor remotely
- Multi-Color (RGB) LED: Select from a full spectrum of colors
- 3-Axis Accelerometer: Detect free-falls, bumps, tilt angles
- Buzzer: Play notes or complete tunes, give audio responses to inputs
- RJ11 (6P4C) Expansion Connector: Use a standard phone cable to connect our Bluetooth/breakout boards or your own electronics to your Linkbot's power and I2C bus
- 3x Buttons: Easily control Linkbot modes and functions or write custom functions for button presses
- Micro-USB Connector: Connect to a computer or charger with a standard Micro-USB cable
- Rechargeable Lithium-Ion Battery: Run your Linkbot for over 3 hours in most applications between charges
- High Torque-to-Weight Motors: Light-but-strong motors produce up to 100 oz-in of torque
- Absolute Encoding: Precisely control and measure speeds and angles down to 0.5 degrees
- BaroboLink Software: Graphical interface lets you run programs, actuate motors and read sensors on your computer
- Polycarbonate Housing: Super-durable, drop-tested from second-story building (not recommended) so it can handle your demanding projects
- SnapConnector Mounting Surfaces: Quickly connect and remove wheels, connecting plates, grabbers, even multiple Linkbots; or connect your own accessories with standard screws
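To put the spec numbers above in context, here is a quick back-of-envelope sketch, assuming the 0.5-degree figure is the encoder's angular resolution and using the standard ounce-inch to newton-metre conversion (the function names are illustrative, not part of any Linkbot API):

```python
# Back-of-envelope conversions for the Linkbot spec numbers.
# Assumptions: 0.5 degrees is the encoder's angular resolution,
# and 1 oz-in = 0.00706155 N*m (standard conversion factor).

OZ_IN_TO_NM = 0.00706155  # 1 ounce-inch in newton-metres


def encoder_positions_per_rev(resolution_deg: float) -> int:
    """Distinct encoder positions in one full 360-degree revolution."""
    return round(360.0 / resolution_deg)


def oz_in_to_nm(torque_oz_in: float) -> float:
    """Convert a torque from ounce-inches to newton-metres."""
    return torque_oz_in * OZ_IN_TO_NM


print(encoder_positions_per_rev(0.5))  # 720 positions per revolution
print(round(oz_in_to_nm(100), 3))      # ~0.706 N*m peak torque
```

So a 0.5-degree absolute encoder distinguishes 720 positions per revolution, and the quoted 100 oz-in works out to roughly 0.7 N*m.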
Arduino Yún is the combination of a classic Arduino Leonardo (based on the ATmega32U4 microcontroller) with an Atheros AR9331, a Wi-Fi system-on-a-chip running Linino, a customized version of OpenWRT, the most widely used Linux distribution for embedded devices.
Available at the end of June for $69.
MCU – Atmel ATMega32u4 @ 16 MHz (same as the one used in Leonardo board) with 2.5KB SRAM and 32KB flash
SoC – Atheros AR9331 MIPS-based Wi-Fi SoC running Linino, Arduino’s own Linux distribution based on OpenWRT. It’s the same chipset as in TP-Link WR703N router.
Storage – microSD card slot
USB – micro USB connector + full USB host port
Connectivity – Ethernet + Wi-Fi
20 digital input/output pins (of which 7 can be used as PWM outputs and 12 as analog inputs)
The Arduino Robot is the first official Arduino on wheels. The robot has two processors, one on each of its two boards. The Motor Board controls the motors, and the Control Board reads sensors and decides how to operate. Each of the boards is a full Arduino board programmable using the Arduino IDE.
Both Motor and Control boards are microcontroller boards based on the ATmega32u4. The Robot has many of its pins mapped to on-board sensors and actuators.
Programming the robot is similar to the process with the Arduino Leonardo. Both processors have built-in USB communication, eliminating the need for a secondary processor. This allows the Robot to appear to a connected computer as a virtual (CDC) serial / COM port.
As always with Arduino, every element of the platform – hardware, software and documentation – is freely available and open-source.
On sale at the Maker Faire in San Mateo (May 17-19) and available online starting in July.
GOOGLE I/O 2013
The Moscone Center, San Francisco
Makr Shakr is a new robotic bartending system that allows users to create personalized cocktail recipes in real time through a smartphone application and transform them into crowd-sourced drink combinations. Each cocktail is assembled by three robotic arms, whose movements, visualized on a large display positioned behind the bar, mimic the actions of a bartender, from the shaking of a martini to the thin slicing of a lemon garnish. The system explores the new dynamics of social creation and consumption, 'design, make and enjoy', all in just the time needed to prepare a new cocktail.
Willow Garage is proud to announce the initial release of MoveIt!: new software targeted at allowing you to build advanced applications integrating motion planning, kinematics, and collision checking with grasping, manipulation, navigation, perception, and control. MoveIt! is robot-agnostic software that can be quickly set up with your robot if a URDF representation of the robot is available. The MoveIt! Setup Assistant lets you configure MoveIt! for any robot, allowing you to visualize and interact with the robot model quickly.
MoveIt! can incorporate both actual sensor data and simulated models to build an environment representation. 3D sensor information can be automatically integrated in real time into the representation of the world that MoveIt! maintains. CAD models can also be imported into the same world representation if desired. Collision-free motion planning, execution and monitoring are core capabilities that MoveIt! provides for any robot. MoveIt! updates its representation of the environment on the fly, enabling reactive motion planning and execution, which is essential for applications in human-robot collaborative environments.
MoveIt! interfaces with controllers through a standard ROS interface, allowing for ease of inter-operability, i.e. the ability to use the same higher-level software with a variety of robots without needing to change code. MoveIt! is architected to be flexible, using a plugin architecture to allow users to integrate their own custom components while still providing out-of-the-box functionality using default implementations. Furthermore, the ROS communication and configuration layer of MoveIt! is separated from core computational components such as motion planning or collision checking, the latter components being provided separately as C++ libraries.