From John Goatstream's Vimeo Videos: We present a muscle-based control method for simulated bipeds in which both the muscle routing and control parameters are optimized. This yields a generic locomotion control method that supports a variety of bipedal creatures. All actuation forces are the result of 3D simulated muscles, and a model of neural delay is included for all feedback paths. As a result, our controllers generate torque patterns that incorporate biomechanical constraints. The synthesized controllers find different gaits based on target speed, can cope with uneven terrain and external perturbations, and can steer to target directions... ( full paper ) ( follow up videos )
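The paper's point that every feedback path passes through a neural delay can be illustrated with a toy control loop. This is a minimal sketch under assumed first-order dynamics, not the paper's muscle model: the controller only sees a state observation that is several simulation ticks old, buffered in a FIFO queue.

```python
from collections import deque

def simulate(delay_steps, kp=0.8, target=1.0, steps=60, dt=0.1):
    """Toy 1D plant x' = u, driven by a P-controller that only
    sees the state `delay_steps` ticks in the past (neural delay)."""
    x = 0.0
    history = deque([0.0] * delay_steps, maxlen=delay_steps or None)
    for _ in range(steps):
        sensed = history[0] if delay_steps else x   # delayed observation
        u = kp * (target - sensed)                  # proportional feedback
        x += u * dt
        if delay_steps:
            history.append(x)                       # oldest sample falls out
    return x

# With no delay the controller settles smoothly at the target; delayed
# sensing makes the response oscillatory and, past a threshold, unstable.
no_delay, with_delay = simulate(0), simulate(5)
```

The buffer length is the knob: increasing `delay_steps` (or the gain `kp`) eventually pushes the loop past its stability margin, which is why delay-aware controller optimization matters.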
From DARPA : DARPA tasks four companies with designing new aircraft to revolutionize vertical takeoff and landing (VTOL) flight capabilities. For generations, new designs for vertical takeoff and landing aircraft have remained unable to increase top speed without sacrificing range, efficiency or the ability to do useful work. DARPA’s VTOL Experimental Plane (VTOL X-Plane) program seeks to overcome these challenges through innovative cross-pollination between the fixed-wing and rotary-wing worlds, to enable radical improvements in vertical and cruise flight capabilities. In an important step toward that goal, DARPA has awarded prime contracts for Phase 1 of VTOL X-Plane to four companies:
- Aurora Flight Sciences Corporation
- The Boeing Company
- Karem Aircraft, Inc.
- Sikorsky Aircraft Corporation
“We were looking for different approaches to solve this extremely challenging problem, and we got them,” said Ashish Bagai, DARPA program manager. “The proposals we’ve chosen aim to create new technologies and incorporate existing ones that VTOL designs so far have not succeeded in developing. We’re eager to see if the performers can integrate their ideas into designs that could potentially achieve the performance goals we’ve set.” VTOL X-Plane seeks to develop a technology demonstrator that could:
- Achieve a top sustained flight speed of 300 kt-400 kt
- Raise aircraft hover efficiency from 60 percent to at least 75 percent
- Present a more favorable cruise lift-to-drag ratio of at least 10, up from 5-6
- Carry a useful load of at least 40 percent of the vehicle’s projected gross weight of 10,000-12,000 pounds
All four winning companies proposed designs for unmanned vehicles, but the technologies that VTOL X-Plane intends to develop could apply equally well to manned aircraft. Another common element among the designs is that they all incorporate multipurpose technologies to varying degrees.
Multipurpose technologies decrease the number of systems in a vehicle and its overall mechanical complexity. Multipurpose technologies also use space and weight more efficiently to improve performance and enable new and improved capabilities. The next major milestone for VTOL X-Plane is scheduled for late 2015, when the four performers are required to submit preliminary designs. At that point, DARPA plans to review the designs to decide which to build as a technology demonstrator, with the goal of performing flight tests in the 2017-18 timeframe.
MIT News : Soft robots — which don't just have soft exteriors but are also powered by fluid flowing through flexible channels — have become a sufficiently popular research topic that they now have their own journal, Soft Robotics. In the first issue of that journal, out this month, MIT researchers report the first self-contained autonomous soft robot, a "fish" that can execute an escape maneuver, convulsing its body to change direction, in just 100 milliseconds, or as quickly as a real fish can.
SimpleCV library for Python: WHAT IS IT? SimpleCV is an open source framework for building computer vision applications. With it, you get access to several high-powered computer vision libraries such as OpenCV – without having to first learn about bit depths, file formats, color spaces, buffer management, eigenvalues, or matrix versus bitmap storage. This is computer vision made easy... ( cont'd )
The Agile Eye by Gosselin, Université Laval: The Agile Eye is a 3-DOF 3-RRR spherical parallel manipulator developed for the rapid orientation of a camera. Its mechanical architecture enables high velocities and accelerations, and its workspace is superior to that of the human eye: the miniature camera attached to the end-effector can be pointed within a cone of vision of 140° with ±30° in torsion. Moreover, due to its low inertia and inherent stiffness, the mechanism can achieve angular velocities above 1000°/sec and angular accelerations greater than 20,000°/sec², beyond the capabilities of the human eye... ( cont'd ) Hip Joint of the Bipedal Autonomous Robot LISA by the Institute of Automatic Control: The hip joint consists of three active rotational degrees of freedom whose rotational axes intersect at one point. In contrast to the hip joints of most other bipedal robots, LISA's hip joint is built as a spherical parallel manipulator. A comparable Cardan joint would be heavier, and by its construction the masses of some motors would have to be accelerated by other motors during motion. With the parallel manipulator, all motors stay fixed to the trunk, and only a coordinated interaction of all motors produces a controlled motion of the thigh. This enables a design with a thigh of minimal and a trunk of maximal weight, which is an advantageous weight distribution for bipedal walking. Because of the parallel structure, forces applied to the thigh are distributed among all three motors, so their power adds up... ( cont'd )
From Robot Launch : Robot Launch 2014 is open to any robot startup pre/partial Series A. We're looking for startups with prototypes and business models. But we're also interested in any great robot startup idea. What is a robot startup? Well, it could be a robot or an autonomous mobile manipulator. OR it could be an appliance or connected device. OR it could be a sensor or actuator or AI that makes robots better. Prizes include money, mentoring, meetings and free legal and startup services from our supporting organizations, Silicon Valley Robotics, Indiegogo, WilmerHale, Grishin Robotics, Bosch Venture Capital, Lemnos Labs, Luxr, Robolution Capital, Lux Capital, OATV, Khosla Ventures, a showcase at Solid and media coverage by Robohub.
- Round One entries open: Feb 20
- Round One entries close: March 30, midnight (PST)
- Top 30 announced: April 10
- Finalists announced: April 30
- Final Showcase (tbc): May 20
You can enter here.
From Johnny Lee and the ATAP-Project Tango Team : What is it? Our current prototype is a 5” phone containing customized hardware and software designed to track the full 3D motion of the device, while simultaneously creating a map of the environment. These sensors allow the phone to make over a quarter million 3D measurements every second, updating its position and orientation in real-time, combining that data into a single 3D model of the space around you. It runs Android and includes development APIs to provide position, orientation, and depth data to standard Android applications written in Java, C/C++, as well as the Unity Game Engine. These early prototypes, algorithms, and APIs are still in active development. So, these experimental devices are intended only for the adventurous and are not a final shipping product. How do I get one? We’re looking for professional developers with dreams of creating more than a touch-screen app. These devices were built with the unique ability to sense 3D motion and geometry. We want partners who will push the technology forward and build great user experiences on top of this platform. Currently, we have 200 prototype dev kits. We have allocated some of these devices for projects in the areas of indoor navigation/mapping, single/multiplayer games that use physical space, and new algorithms for processing sensor data. We have also set aside units for applications we haven’t thought of yet. Tell us what you would build. Be creative. Be specific. Be bold. We expect to distribute all of our available units by March 14th, 2014... cont'd
From KUKA's Youtube page: On March 11th 2014 ping pong champion Timo Boll will challenge KUKA's Agilus robot to a ping-pong showdown. Watch the final on March 11th 2014 at www.kuka-timoboll.com to find out the winner. Timo Boll, German table tennis star, is the new brand ambassador for KUKA Robotics in China. The collaboration celebrates the inherent speed, precision, and flexibility of KUKA's industrial robots in tandem with Boll's electrifying and tactical prowess in competition. To celebrate the new KUKA Robotics factory in Shanghai, the two giants will battle to the end on March 11th 2014. The 20,000 sq. meter space will produce the KR QUANTEC series robot as well as the KRC4 universal controller for the Asian market. As a market leader in China, KUKA aims to further develop automation in the country while providing a modern and employee-friendly working environment.
Eugénie von Tunzelmann: Ever since reading Richard Dawkins' book 'The Blind Watchmaker' I'd wanted to try my hand at some evolutionary programming. The idea is to model natural selection inside the computer by generating procedural creatures and allowing them to vary and improve over time without user intervention. The code to build and rig the robots was written in Python, as was the code to run the rigid body simulation, using the Open Dynamics Engine to drive the sim. I wrote an importer for Side Effects' Houdini to read in my robot simulations so I could render them out as pictures.
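The selection-and-variation loop she describes can be sketched in a few lines of Python. This is a hedged stand-in, not her actual code: the `fitness` function here is a toy score, where her system would instead run an Open Dynamics Engine rigid-body simulation of each creature.

```python
import random

def evolve(fitness, n_genes=4, pop_size=20, generations=40, seed=1):
    """Minimal mutation-only evolutionary loop: keep the fitter half of
    the population, refill it with mutated copies of the survivors."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(n_genes)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = [[g + rng.gauss(0, 0.1) for g in rng.choice(survivors)]
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return max(pop, key=fitness)

def fitness(genome):
    """Stand-in score preferring genomes near all-ones; a real run would
    score a physics simulation of the creature's behaviour instead."""
    return -sum((x - 1.0) ** 2 for x in genome)

best = evolve(fitness)
```

Because survivors are carried over unmutated, the best genome is never lost, so fitness improves monotonically over generations.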
From Studio diip : “Fish on Wheels” has been developed so fish can steer their tank in a chosen direction. Our pet fish have always been confined to their water-holding area known as “the fish tank”. In an attempt to liberate fish all over the world, the first self-driving car for fish has been developed. The car moves by detecting the fish's position with computer vision. Until now, driving vehicles has been limited to mankind (excluding a handful of autonomous vehicles driven by computers), but now your pet fish can also put the pedal to the metal. A prototype version of ”Fish on Wheels” has been constructed from a standard webcam, a battery-powered BeagleBoard and an Arduino-controlled robot vehicle. Using the contrast of the fish against the bottom of the tank, its position is determined and used to send commands to the Arduino, which drives the car in that direction.
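The contrast-tracking step can be sketched without any vision library. This is a minimal illustration of the idea, not Studio diip's code (which runs on a webcam feed): treat the frame as a grayscale grid, find the centroid of the dark pixels, and map its horizontal offset to a steering command.

```python
def fish_command(frame, threshold=100):
    """Find the dark fish against a light tank floor and map its
    horizontal position to a steering command for the cart."""
    # Collect coordinates of pixels darker than the bright background.
    dark = [(x, y) for y, row in enumerate(frame)
                   for x, v in enumerate(row) if v < threshold]
    if not dark:
        return "stop"                      # no fish detected
    cx = sum(x for x, _ in dark) / len(dark)   # centroid column
    width = len(frame[0])
    if cx < width * 0.4:
        return "left"
    if cx > width * 0.6:
        return "right"
    return "forward"

# 4x6 toy grayscale frame: bright floor (200) with a dark blob (30)
# on the right-hand side of the tank.
frame = [[200] * 6 for _ in range(4)]
frame[1][4] = frame[2][4] = 30
command = fish_command(frame)   # fish sits right of centre
```

The dead band between 40% and 60% of the frame width keeps the cart driving straight when the fish hovers near the middle, instead of jittering left and right.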
From the project's Kickstarter ($104,217 pledged of $5,000 goal): uArm is a 4-axis parallel-mechanism robot arm, inspired by the ABB PalletPack IRB460 industrial robot arm. ($185 for the complete black kit and a gripper.) The basic design is Arduino-controlled with 4 degrees of freedom: three servos on the base control the main movement of the arm, and the mini servo on top moves and rotates the object. The end-effector of the arm is always kept parallel to the ground. We have already developed a Windows application that allows the uArm to be controlled with keyboard or mouse. With some basic control skills you can use virtually any input device; for example, we have also used other remote controllers to control the arm. With our embedded inverse-kinematics algorithm, the uArm can be precisely controlled using coordinates. We have also written an Arduino library specifically for controlling the uArm, so if you are familiar with Arduino, you can program it directly in the Arduino IDE. By calling different functions, you can easily move the uArm to your desired position without doing tons of hard math... cont'd
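The "coordinates in, servo angles out" idea behind the uArm's inverse-kinematics algorithm can be shown with the textbook planar two-link case. This is a simplified sketch, not the uArm library or its parallel-mechanism geometry; link lengths are made-up values in mm.

```python
import math

def ik_2link(x, y, l1=150.0, l2=150.0):
    """Planar two-link inverse kinematics (one elbow solution):
    given a target (x, y) in mm, return joint angles in radians."""
    r2 = x * x + y * y
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)   # law of cosines
    if abs(c2) > 1:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

def fk_2link(theta1, theta2, l1=150.0, l2=150.0):
    """Forward kinematics, used here to sanity-check the IK solution."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

t1, t2 = ik_2link(200.0, 80.0)
```

Running the resulting angles back through the forward kinematics recovers the requested (200, 80) target, which is the "no hard math for the user" contract the library offers.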
Clearpath Robotics has posted parts one and two of their ongoing introduction to the Robot Operating System (ROS): Part One: Intro Since we practically live in ROS, we thought it was time to share some tips on how to get started with it. We'll answer questions like: Where do I begin? How do I get started? What terminology should I brush up on? Keep an eye out for this ongoing ROS 101 blog series, which will provide a top-to-bottom view of ROS, introducing basic concepts simply, cleanly and at a reasonable pace... cont'd Part Two: Setup And Example In the previous ROS 101 post, we provided a quick introduction to ROS to answer questions like What is ROS? and How do I get started? Now that you understand the basics, here's how they apply to a practical example. Follow along to see how we actually ‘do’ all of these things…. cont'd
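The core concept the ROS 101 series starts from, nodes exchanging messages over named topics, can be mimicked in plain Python. This toy bus is an illustration of the publish/subscribe model only, not `rospy` or any real ROS API; the topic name and message fields are made up to echo ROS conventions.

```python
class MiniTopicBus:
    """Toy stand-in for ROS's publish/subscribe model: nodes subscribe
    to named topics and receive every message published on them."""

    def __init__(self):
        self.subscribers = {}

    def subscribe(self, topic, callback):
        """Register a callback to run for each message on `topic`."""
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        """Deliver `message` to every subscriber of `topic`."""
        for callback in self.subscribers.get(topic, []):
            callback(message)

bus = MiniTopicBus()
log = []
# A "driver" node listens for velocity commands...
bus.subscribe("/cmd_vel", lambda msg: log.append(msg))
# ...and a "planner" node publishes one, without knowing who listens.
bus.publish("/cmd_vel", {"linear": 0.5, "angular": 0.0})
```

The decoupling is the point: publishers and subscribers only share a topic name, which is why ROS nodes can be swapped, recorded, or replayed independently.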
Hasso-Plattner-Institut : faBrickation is a new approach to rapid prototyping of functional objects, such as the body of a head-mounted display. The key idea is to save 3D printing time by automatically substituting sub-volumes with standard building blocks — in our case Lego bricks. When making the body for a head-mounted display, for example, getting the optical path right is paramount. Users thus mark the lens mounts as “high-resolution” to indicate that these should later be 3D printed. faBrickator then 3D prints these parts. It also generates instructions that show users how to create everything else from Lego bricks.
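The substitution idea, print only the cells marked high-resolution and cover the rest with bricks, can be sketched as a greedy pass over one voxel layer. This is a hypothetical simplification, not faBrickator's algorithm: it works in 2D cell coordinates and only knows 1x2 and 1x1 bricks.

```python
def brickify_layer(filled, high_res):
    """Greedy split of one voxel layer: cells marked high-res are
    3D printed; the rest are covered left-to-right with 1x2 bricks,
    falling back to 1x1 bricks where no neighbouring pair fits."""
    printable = sorted(set(filled) & set(high_res))
    brick_cells = set(filled) - set(high_res)
    bricks, used = [], set()
    for x, y in sorted(brick_cells):
        if (x, y) in used:
            continue
        if (x + 1, y) in brick_cells and (x + 1, y) not in used:
            bricks.append(((x, y), (x + 1, y)))   # place a 1x2 brick
            used.update({(x, y), (x + 1, y)})
        else:
            bricks.append(((x, y),))              # fall back to 1x1
            used.add((x, y))
    return bricks, printable

# One 1x4 layer whose right-most cell holds a lens mount (high-res).
layer = [(0, 0), (1, 0), (2, 0), (3, 0)]
bricks, printed = brickify_layer(layer, [(3, 0)])
```

Every brick placed is a sub-volume the printer never has to rasterize, which is where the time savings come from.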
From Recode: Google is shelling out $400 million to buy a secretive artificial intelligence company called DeepMind. Google confirmed the deal after Re/code inquired about it, but declined to specify a price. Based in London, DeepMind was founded by games prodigy and neuroscientist Demis Hassabis, along with Shane Legg and Mustafa Suleyman... cont'd See also DeepMind's recently published paper, Playing Atari with Deep Reinforcement Learning .
From the Prosthesis project's Indiegogo campaign : Prosthesis: the world's first human-controlled racing robot. Formula 1, meet the future. Let the races begin. We are trying to save the future for the humans. With the relentless and unchecked automation of everything we do, we are trying to remind people that technology was invented to improve our quality of life, and that doesn't always mean doing everything for you. Sometimes that means doing something really, really challenging. Sometimes that means taking on something that many have dreamed of, but no one has dared try before. Like building and learning to pilot a two-story-tall, 3,500 kg walking machine that you control with your whole body, without computers to help you... cont'd at the homepage and Indiegogo .
The Omron Adept Lynx Cart Transporter is an Autonomous Intelligent Vehicle (AIV) designed to attach to movable carts and transport them from a pickup location to a drop-off location. Applications include line-side inventory replenishment, moving flow racks, transporting Work in Progress (WIP) between process steps, and moving finished goods to the warehouse. The Lynx Cart Transporter leverages Natural Feature Navigation to autonomously find a path through the facility without the need for any facility modifications.