The Agile Eye by Gosselin, Université Laval:
The Agile Eye is a 3-DOF 3-RRR spherical parallel manipulator developed for the rapid orientation of a camera. Its mechanical architecture leads to high velocities and accelerations.
The workspace of the Agile Eye is superior to that of the human eye. The miniature camera attached to the end-effector can be pointed in a cone of vision of 140° with ±30° in torsion. Moreover, due to its low inertia and its inherent stiffness, the mechanism can achieve angular velocities above 1,000°/sec and angular accelerations greater than 20,000°/sec², which is beyond the capabilities of the human eye... (cont'd)
Hip Joint of the Bipedal Autonomous Robot LISA by Institute of Automatic Control:
The hip joint consists of three active rotational degrees of freedom whose rotational axes intersect at a single point. In contrast to the hip joints of most other bipedal robots, LISA's hip joint is built as a spherical parallel manipulator. A comparable cardanic joint would lead to a heavier design, since by its very function the masses of some motors would have to be accelerated by other motors during motion.
Thanks to the parallel structure, all motors remain fixed to the trunk. Only a coordinated interaction of all motors produces a controlled motion of the thigh. This enables a design with a thigh of minimal and a trunk of maximal weight, which is an advantageous weight distribution for bipedal walking. Because of the parallel manipulator structure, forces applied to the thigh are distributed among all three motors, so the power of the motors adds up... (cont'd)
From Robot Launch:
Robot Launch 2014 is open to any robot startup pre/partial Series A. We're looking for startups with prototypes and business models. But we're also interested in any great robot startup idea.
What is a robot startup? Well, it could be a robot or an autonomous mobile manipulator. OR it could be an appliance or connected device. OR it could be a sensor or actuator or AI that makes robots better.
Prizes include money, mentoring, meetings, and free legal and startup services from our supporting organizations: Silicon Valley Robotics, Indiegogo, WilmerHale, Grishin Robotics, Bosch Venture Capital, Lemnos Labs, Luxr, Robolution Capital, Lux Capital, OATV, and Khosla Ventures, as well as a showcase at Solid and media coverage by Robohub.
- Round One entries open Feb 20
- Round One entries close March 30 at midnight (PST)
- Top 30 announced April 10
- Finalists announced April 30
- Final Showcase (tbc) May 20
You can enter here.
What is it?
Our current prototype is a 5” phone containing customized hardware and software designed to track the full 3D motion of the device while simultaneously creating a map of the environment. Its sensors allow the phone to make over a quarter million 3D measurements every second, updating its position and orientation in real time and combining that data into a single 3D model of the space around you.
It runs Android and includes development APIs that provide position, orientation, and depth data to standard Android applications written in Java or C/C++, as well as to the Unity Game Engine. These early prototypes, algorithms, and APIs are still in active development, so these experimental devices are intended only for the adventurous and are not a final shipping product.
How do I get one?
We’re looking for professional developers with dreams of creating more than a touch-screen app. These devices were built with the unique ability to sense 3D motion and geometry. We want partners who will push the technology forward and build great user experiences on top of this platform.
Currently, we have 200 prototype dev kits. We have allocated some of these devices for projects in the areas of indoor navigation/mapping, single/multiplayer games that use physical space, and new algorithms for processing sensor data. We have also set aside units for applications we haven’t thought of yet. Tell us what you would build. Be creative. Be specific. Be bold.
We expect to distribute all of our available units by March 14th, 2014... cont'd
From KUKA's Youtube page:
On March 11th 2014 ping pong champion Timo Boll will challenge KUKA's Agilus robot to a ping-pong showdown.
Watch the final on March 11th 2014 at www.kuka-timoboll.com to find out the winner.
Timo Boll, German table tennis star, is the new brand ambassador for KUKA Robotics in China. The collaboration celebrates the inherent speed, precision, and flexibility of KUKA's industrial robots in tandem with Boll's electrifying and tactical prowess in competition.
To celebrate the new KUKA Robotics factory in Shanghai, the two giants will battle to the end on March 11th 2014. The 20,000 sq. meter space will produce the KR QUANTEC series robot as well as the KRC4 universal controller for the Asian market. As a market leader in China, KUKA aims to further develop automation in the country while providing a modern and employee-friendly working environment.
Eugénie von Tunzelmann:
Ever since reading Richard Dawkins' book 'The Blind Watchmaker' I'd wanted to try my hand at some evolutionary programming. The idea is to model natural selection inside the computer by generating procedural creatures and allowing them to vary and improve over time without user intervention.
The code to build and rig the robots was written in Python, as was the code to run the rigid body simulation, using the Open Dynamics Engine to drive the sim. I wrote an importer for Side Effects' Houdini to read in my robot simulations so I could render them out as pictures.
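The Blind Watchmaker's own "weasel" demonstration is the classic minimal version of the cumulative-selection idea she describes. Here is a sketch of it in Python — an illustration of the principle, not von Tunzelmann's actual robot-evolution code (target phrase, mutation rate, and brood size are the textbook choices, not hers):

```python
import random

TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def fitness(candidate):
    # Number of characters that already match the target phrase.
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(candidate, rate=0.05):
    # Copy the string, flipping each character to a random one with small probability.
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in candidate)

def evolve(offspring=100, seed=1):
    random.seed(seed)
    parent = "".join(random.choice(ALPHABET) for _ in TARGET)
    generations = 0
    while parent != TARGET:
        # Breed mutated copies and keep the fittest: cumulative selection,
        # with no user intervention once the loop starts.
        parent = max((mutate(parent) for _ in range(offspring)), key=fitness)
        generations += 1
    return parent, generations
```

Evolving simulated robot bodies works the same way in outline: the "genome" encodes the rig, fitness comes from the rigid-body simulation, and selection runs without user intervention.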
From Studio diip:
“Fish on Wheels” has been developed so fish can steer their tank in a certain direction. Our pet fish have always been limited to their water-holding area known as “the fish tank”. In an attempt to liberate fish all over the world, the first self-driving car for fish has been developed. This car moves by detecting the fish’s position with computer vision. Up until now, driving vehicles has been limited to mankind (excluding a handful of autonomous vehicles driven by computers), but now your pet fish can also put the pedal to the metal.
A prototype version of “Fish on Wheels” has been constructed using a standard webcam, a battery-powered Beagleboard, and an Arduino-controlled robot vehicle. Using the contrast between the fish and the bottom of the tank, the fish’s position is determined and used to send commands to the Arduino, which moves the car in that direction.
From the project's Kickstarter ($104,217 pledged of $5,000 goal):
uArm is a 4-axis parallel-mechanism robot arm, inspired by the ABB PalletPack industrial robot arm IRB460. ($185 for complete black kit and a gripper)
The basic design is Arduino-controlled with 4 degrees of freedom. Three servos on the base control the main movement of the arm and the mini servo on the top moves and rotates the object. The end-effector of the arm is always kept parallel to the ground.
Right now we have already developed a Windows application that allows the uArm to be controlled with keyboard or mouse.
With some basic control skills, you can use virtually any input device to control it; for example, we have also used other remote controllers to control the arm. With our embedded inverse-kinematics algorithm, the uArm can be precisely controlled using coordinates.
We have also written an Arduino library specifically for controlling the uArm. So if you are familiar with Arduino, you can program it directly with Arduino IDE. By calling different functions, you can easily move uArm to your desired position without doing tons of hard math... cont'd
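The "tons of hard math" the library hides is the inverse-kinematics step: converting a target coordinate into joint angles. uArm's actual algorithm isn't published in this excerpt, but the idea can be sketched for a simple planar 2-link arm (a generic textbook solution, not uArm's code; link lengths and the elbow-down convention are assumptions):

```python
import math

def ik_2link(x, y, l1, l2):
    """Inverse kinematics for a planar 2-link arm.

    Returns (shoulder, elbow) joint angles in radians that place the
    end-effector at (x, y); raises ValueError if the point is out of reach.
    """
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle from the target distance.
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(c2)
    # Shoulder angle: direction to target minus the offset the bent elbow adds.
    shoulder = (math.atan2(y, x)
                - math.atan2(l2 * math.sin(elbow), l1 + l2 * math.cos(elbow)))
    return shoulder, elbow
```

A parallel-mechanism arm like uArm adds linkage constraints on top of this (which is also what keeps its end-effector parallel to the ground), but the coordinates-to-angles conversion is the same in spirit.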
Clearpath Robotics has posted parts one and two of their ongoing introduction to the Robot Operating System (ROS):
Part One: Intro
Since we practically live in the Robot Operating System (ROS), we thought it was time to share some tips on how to get started with ROS. We’ll answer questions like: Where do I begin? How do I get started? What terminology should I brush up on? Keep an eye out for this ongoing ROS 101 blog series that will provide you with a top-to-bottom view of ROS, focusing on introducing basic concepts simply, cleanly and at a reasonable pace... cont'd
Part Two: Setup And Example
In the previous ROS 101 post, we provided a quick introduction to ROS to answer questions like What is ROS? and How do I get started? Now that you understand the basics, here’s how they can apply to a practical example. Follow along to see how we actually ‘do’ all of these things…. cont'd
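The core abstraction the ROS 101 series introduces is nodes exchanging messages over named topics: a "talker" publishes, a "listener" subscribes, and neither needs to know about the other. That pattern can be modeled in a few lines of plain Python — a conceptual toy for readers following along, not the actual rospy API:

```python
from collections import defaultdict

class MiniBus:
    """Toy publish/subscribe bus, mimicking how ROS topics decouple nodes."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        # A "listener" node registers interest in a named topic.
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # A "talker" node sends a message; every subscriber's callback fires.
        for callback in self.subscribers[topic]:
            callback(message)

bus = MiniBus()
received = []
bus.subscribe("/chatter", received.append)   # listener node
bus.publish("/chatter", "hello world")       # talker node
```

In real ROS a master process handles the topic registry and messages are typed and serialized across processes, but the decoupled talker/listener shape is the same.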
faBrickation is a new approach to rapid prototyping of functional objects, such as the body of a head-mounted display. The key idea is to save 3D printing time by automatically substituting sub-volumes with standard building blocks — in our case Lego bricks. When making the body for a head-mounted display, for example, getting the optical path right is paramount. Users thus mark the lens mounts as “high-resolution” to indicate that these should later be 3D printed. faBrickator then 3D prints these parts. It also generates instructions that show users how to create everything else from Lego bricks.
Google is shelling out $400 million to buy a secretive artificial intelligence company called DeepMind.
Google confirmed the deal after Re/code inquired about it, but declined to specify a price.
Based in London, DeepMind was founded by games prodigy and neuroscientist Demis Hassabis, along with Shane Legg and Mustafa Suleyman... cont'd
DeepMind also recently published a paper entitled Playing Atari with Deep Reinforcement Learning.
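The paper's deep Q-network builds on the classic Q-learning update rule. A tabular sketch on a made-up corridor problem shows the rule itself; the paper replaces the table with a convolutional network trained on raw Atari pixels, but the update target is the same:

```python
import random

# Toy corridor MDP: states 0..4, action 0 moves left, action 1 moves right.
# Reaching state 4 pays reward 1 and ends the episode.
def step(state, action):
    nxt = max(0, state - 1) if action == 0 else min(4, state + 1)
    reward = 1.0 if nxt == 4 else 0.0
    return nxt, reward, nxt == 4

def train(episodes=200, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    random.seed(seed)
    q = [[0.0, 0.0] for _ in range(5)]   # Q-table: q[state][action]
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            # Epsilon-greedy exploration, as in the paper.
            if random.random() < eps:
                a = random.randrange(2)
            else:
                a = max(range(2), key=lambda i: q[s][i])
            s2, r, done = step(s, a)
            # Q-learning update: move Q(s,a) toward r + gamma * max_a' Q(s',a').
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

q = train()
```

After training, the greedy policy heads right toward the reward. The DQN's contribution is making this update stable when Q is a deep network rather than a table, via experience replay and frame-stacked pixel inputs.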
From the Prosthesis projects Indiegogo campaign:
Prosthesis: the world's first human-controlled racing robot. Formula 1, meet the future. Let the races begin.
We are trying to save the future for the humans. With the relentless and unchecked automation of everything we do, we are trying to remind people that technology was invented to improve our quality of life, and that doesn't always mean just doing everything for you. Sometimes that means doing something really, really challenging. Sometimes that means taking on something that many have dreamed of, but no one has dared try before. Like building and learning to pilot a two story tall, 3500kg walking machine that you use your whole body to control, without computers to help you.... cont'd at homepage and Indiegogo.
NASA engineers are developing climbing legs for the International Space Station's robotic crew member Robonaut 2 (R2), marking another milestone in space humanoid robotics.
The legless R2, currently attached to a support post, is undergoing experimental trials with astronauts aboard the orbiting laboratory. Since its arrival at the station in February 2011, R2 has performed a series of tasks to demonstrate its functionality in microgravity.
These new legs, funded by NASA's Human Exploration and Operations and Space Technology mission directorates, will provide R2 the mobility it needs to help with regular and repetitive tasks inside and outside the space station. The goal is to free up the crew for more critical work, including scientific research.
From the Rex Kickstarter:
Why do you want Rex?
There are two general classes of electronics used in robot hardware: microcontrollers (e.g. Arduino) and single-board computers. Microcontrollers are great for projects that only require a single program to be run, quickly and without overhead, like controlling LEDs and motors. Single-board computers are great for anything you'd need a cheap, small computer for, like networking applications and image processing.
Advanced autonomous robots require the strengths of both. A system developed around Rex, being made specifically for robots, brings it all together in one nice little package in a way that has never been done before.
- Texas Instruments DM3730
- 1GHz 32-bit ARM Cortex-A8 Processor core
- 800MHz DSP core
- 512MB LPDDR RAM
- USB Host port
- MicroSD slot
- Camera Module port
- 3.5mm Audio-in jack
- 3.5mm Audio-out jack
- 5V DC input for desktop development
Each Rex will come pre-installed with Alphalem OS, a custom FOSS Linux distribution. It includes a core set of built-in device drivers, ones that we've hand-picked as being the most useful for robots (like USB WiFi adapters and cameras). We'll publish the list in a wiki on our website.
Here are the other main features:
- An Arduino-style programming environment with support for multiple programming languages (C, C++, Python).
- A special task manager called the Master Control Program (MCP).
- An API for message passing in multi-process applications.
- A standard Linux filesystem which will allow you to install just about any Linux software that can be cross-compiled for ARM.
- Libraries for common processes such as I2C communication, face detection, and sensor reading.