Diatom Design Studio is developing an open source drawing robot kit that it hopes to sell for just $70. The kit, called "Piccolo," will be powered by an Arduino board and supports movement along the X, Y, and Z axes. You can attach a pen, pencil, brush, or possibly even an X-Acto knife, and it will draw out any sketch you upload to it. The team plans to include Arduino and Processing libraries for developing dynamic drawings driven by sensor data. In the video below, the prototype draws procedural tree sketches that vary with the bot's proximity to a light source. The kit isn't available yet, but you can sign up for the mailing list on Diatom's website.
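Piccolo's Arduino and Processing libraries haven't been released yet, so as an illustration only, here is a minimal Python sketch of the sensor-driven drawing idea: a recursive procedural tree whose bushiness depends on a simulated light reading. The 0-1023 range mimics an Arduino 10-bit ADC; all names and constants here are invented for the example.

```python
import math

def tree_segments(x, y, angle, length, depth, segments):
    """Recursively generate line segments for a simple procedural tree.

    Each branch spawns two shorter child branches; `depth` controls how
    many generations get drawn."""
    if depth == 0 or length < 1.0:
        return
    x2 = x + length * math.cos(angle)
    y2 = y + length * math.sin(angle)
    segments.append(((x, y), (x2, y2)))
    for spread in (-0.5, 0.5):  # fixed branching angle, radians
        tree_segments(x2, y2, angle + spread, length * 0.7, depth - 1, segments)

def draw_tree_for_light(light_level):
    """Map a 10-bit light reading (0-1023, as from an Arduino ADC) to
    recursion depth, so a brighter reading yields a bushier tree."""
    depth = 2 + light_level * 6 // 1024  # depth ranges from 2 to 7
    segments = []
    tree_segments(0.0, 0.0, math.pi / 2, 40.0, depth, segments)
    return segments

# A bright reading produces far more pen strokes than a dim one.
dim = draw_tree_for_light(100)      # depth 2 -> 3 segments
bright = draw_tree_for_light(1000)  # depth 7 -> 127 segments
print(len(dim), len(bright))
```

On the real kit, the segment endpoints would be streamed to the pen axes instead of printed, but the mapping from sensor value to drawing parameters would look much the same.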
Just days after announcing the DARPA Robotics Challenge, the DARPA website is reporting that Boston Dynamics has been selected as a "sole source" contractor to develop and build the humanoid robots for the contest. Boston Dynamics will build eight identical humanoids based on PETMAN. IEEE Spectrum has the full story here.
The Pentagon's research and development agency has announced a contest to develop ground robots capable of executing complex tasks in dangerous, degraded, human-engineered environments. The program will focus on robots that can use available human tools, from hand tools to vehicles. The DARPA Robotics Challenge will consist of three key events: a Virtual Disaster Challenge and two Disaster Response Challenges. Participation in the Virtual Disaster Challenge is required only for teams working exclusively on control software development. The agency has not yet announced how much it intends to spend on the program or the size of the prize, but previous contests awarded two million dollars for the top prize and one million dollars for the runner-up. The full contest details and registration can be found here.
The Massachusetts Institute of Technology (MIT) is leading an ambitious new project to reinvent how robots are designed and produced. Funded by a $10 million grant from the National Science Foundation (NSF), the project will aim to develop a desktop technology that would make it possible for the average person to design, customize and print a specialized robot in a matter of hours. The project envisions a future desktop technology that prints actual programmable hybrid electro-mechanical devices from simple descriptions on-demand, anywhere, and with performance one would expect from a team of professional engineers, using advanced materials. The project aims to transform manufacturing as dramatically as the personal computer democratized information technology and transformed how we communicate.
Sand Flea is an 11-pound robot that drives like an RC car on flat terrain but can jump 30 feet into the air to overcome obstacles. That is high enough to jump over a compound wall, onto the roof of a house, up a set of stairs, or into a second-story window. The robot uses gyro stabilization to stay level during flight, providing a clear view from the onboard camera and ensuring a smooth landing. Sand Flea can jump about 25 times on one charge.
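The published jump figures imply some simple ballistics. A back-of-the-envelope calculation (ours, not Boston Dynamics'), ignoring air drag, gives the launch speed and energy the robot's piston launcher must deliver for a 30 ft jump:

```python
import math

G = 9.81          # gravitational acceleration, m/s^2
FT_TO_M = 0.3048  # feet to meters
LB_TO_KG = 0.4536 # pounds to kilograms

def launch_speed(height_ft):
    """Minimum vertical launch speed (m/s) to reach a given apex height,
    ignoring air drag: v = sqrt(2 * g * h)."""
    h = height_ft * FT_TO_M
    return math.sqrt(2 * G * h)

def launch_energy(mass_lb, height_ft):
    """Kinetic energy (joules) the launcher must impart for that jump."""
    m = mass_lb * LB_TO_KG
    v = launch_speed(height_ft)
    return 0.5 * m * v ** 2

v = launch_speed(30)       # ~13.4 m/s for a 30 ft apex
e = launch_energy(11, 30)  # ~450 J for the 11 lb robot
print(round(v, 1), round(e))
```

Roughly 450 joules per jump, 25 times per charge, hints at why the launcher rather than the drivetrain dominates the robot's energy budget.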
Robotic Industries Association (RIA), the industry's largest trade group representing over 265 companies, has announced the launch of its Certified Robot Integrator Program. "The new RIA Certified Robot Integrator program began from a simple question: What can the RIA do to help the industry develop more successful robot applications?" said RIA President Jeff Burnstein. Focus groups were held with leading system integrators and cooperating end users. Users told the RIA that a certification program would help them establish a baseline for evaluating robot integrators; system integrators said it would be a great way to benchmark themselves against industry best practices. After more than two years of touring the country to get input from integrators, users, robot suppliers, and other interested parties, the program was officially launched in January 2012. "I think there is great excitement about it throughout the industry," Burnstein explained. EXAM CRITERIA: The on-site exam and audit has three basic parts: a hands-on section; an expert response section (participant industry tenure and biography); and an on-site audit of business infrastructure against the completed "Self Score Card." Supporting evidence will be gathered before any certification date is scheduled. RIA's full press release can be read here. They have also set up a landing page for the Certified Robot Integrator program here.
The all-cash deal for closely held Kiva will close in the second quarter, Seattle-based Amazon said today in a statement. Kiva's orange robots, which can slide under shelves and bins of products, are used by Quidsi Inc., the company behind Soap.com and Diapers.com that Amazon acquired for about $545 million last year. Kiva, whose headquarters will remain in North Reading, Massachusetts, will help Amazon make shipping more efficient, the company said. "Amazon has long used automation in its fulfillment centers, and Kiva's technology is another way to improve productivity by bringing the products directly to employees to pick, pack and stow," Dave Clark, vice president of global customer fulfillment at Amazon, said in the statement. Bloomberg has the full financial details here.
'Making Things See' from O'Reilly Media / Make shows you how to build Kinect projects with inexpensive off-the-shelf components, including the open source Processing programming language and the Arduino microcontroller. Topics covered in the book include:
- Creating Kinect applications on Mac OS X, Windows, or Linux
- Tracking people with pose detection and skeletonization, and using blob tracking to detect objects
- Analyzing and manipulating point clouds
- Making models for design and fabrication using 3D scanning technology
- Using MakerBot, RepRap, or Shapeways to print 3D objects
- Motion tracking for animation and games
- Building a simple robot arm that can imitate your arm movements
- How skilled artists have used Kinect to build fascinating projects
The book is available now on Amazon.
General Motors and NASA are jointly developing a robotic glove that auto workers and astronauts can wear to help do their respective jobs better while potentially reducing the risk of repetitive stress injuries. The Human Grasp Assist device, known internally in both organizations as the K-glove or Robo-Glove, resulted from NASA and GM's Robonaut 2 – or R2 – project, which launched the first humanoid robot into space in 2011. R2 is a permanent resident of the International Space Station. When engineers, researchers and scientists from GM and NASA began collaborating on R2 in 2007, one of the design requirements was for the robot to operate tools designed for humans, alongside astronauts in outer space and factory workers on Earth. The team achieved an unprecedented level of hand dexterity on R2 by using leading-edge sensors, actuators and tendons comparable to the nerves, muscles and tendons in a human hand. Research shows that continuously gripping a tool can cause fatigue in hand muscles within a few minutes, but initial testing of the Robo-Glove indicates the wearer can hold a grip longer and more comfortably. For example, an astronaut working in a pressurized suit outside the space station or an assembly operator in a factory might need to use 15 to 20 pounds of force to hold a tool during an operation but with the robotic glove they might need to apply only five to 10 pounds of force. Inspired by the finger actuation system of R2, actuators are embedded into the upper portion of the glove to provide grasping support to human fingers. The pressure sensors, similar to the sensors that give R2 its sense of touch, are incorporated into the fingertips of the glove to detect when the user is grasping a tool. When the user grasps the tool, the synthetic tendons automatically retract, pulling the fingers into a gripping position and holding them there until the sensor is released.
The Hackengineer web site has complete plans for building a portable 3D camera. The system uses a Texas Instruments DLP pico projector, Leopard Imaging's Leopardboard 365 VGA camera board, a small 2x telephoto lens, and a BeagleBoard. It relies on a technique known as structured light: a set of temporally encoded patterns is projected sequentially onto the scene. Seen from a different viewpoint, the patterns appear geometrically distorted by the surface shape of the object, and that distortion is used to reconstruct the depth data.
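Gray-coded binary stripes are a common choice of temporally encoded pattern (the article does not say which encoding Hackengineer uses). This self-contained sketch shows how such patterns are generated and how the on/off sequence a camera pixel observes decodes back to a projector column index, which is the correspondence that triangulation then turns into depth:

```python
def gray_code_patterns(width, n_bits):
    """Generate n_bits stripe patterns for a projector `width` pixels
    wide. Each pattern is a list of 0/1 values, one per column, taken
    from one bit of the column's Gray code (MSB first)."""
    patterns = []
    for bit in range(n_bits - 1, -1, -1):
        row = []
        for col in range(width):
            gray = col ^ (col >> 1)  # binary to Gray code
            row.append((gray >> bit) & 1)
        patterns.append(row)
    return patterns

def decode_column(bits):
    """Recover a projector column index from the sequence of on/off
    values one camera pixel observed across the pattern sequence."""
    gray = 0
    for b in bits:                   # reassemble the Gray code, MSB first
        gray = (gray << 1) | b
    binary = gray                    # Gray code back to binary
    mask = gray >> 1
    while mask:
        binary ^= mask
        mask >>= 1
    return binary

patterns = gray_code_patterns(16, 4)
# Simulate what one camera pixel sees if it images projector column 11.
observed = [p[11] for p in patterns]
print(decode_column(observed))  # recovers 11
```

Gray coding is preferred over plain binary because adjacent columns differ in only one bit, so a pixel straddling a stripe boundary is off by at most one column rather than decoding to a wildly wrong position.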
Cornell's Creative Machines Lab has constructed a robot testbed capable of reconfiguring simple truss structures. The robot can add and remove truss elements as it goes. The goal of the project is to eventually have similar robots assemble structures in difficult situations such as disaster recovery or space exploration.
Achu Wilson is building a personal robot called Chippu. Using Julian, a special version of the Julius Speech Recognition Library, he was able to recognize and execute voice commands. He details the process of getting the library working with ROS in his blog post here.
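The post doesn't list Chippu's actual vocabulary, so the command table below is purely hypothetical. It sketches the dispatch step that sits downstream of any recognizer; in a ROS setup this lookup would live in a subscriber callback on the recognizer's output topic, publishing velocity commands in response:

```python
# Hypothetical command table -- the vocabulary Chippu's grammar
# actually accepts isn't documented in the post.
COMMANDS = {
    "move forward": ("drive", 0.2),   # (action, speed in m/s)
    "move back":    ("drive", -0.2),
    "turn left":    ("turn", 0.5),    # (action, rate in rad/s)
    "turn right":   ("turn", -0.5),
    "stop":         ("stop", 0.0),
}

def dispatch(recognized_text):
    """Map a recognized utterance to a robot action tuple.

    Normalizes case and whitespace, then looks the phrase up in the
    command table; unknown utterances are ignored rather than acted on."""
    key = recognized_text.strip().lower()
    return COMMANDS.get(key, ("ignore", 0.0))

print(dispatch("Turn Left"))  # ('turn', 0.5)
print(dispatch("dance"))      # ('ignore', 0.0)
```

A grammar-based recognizer like Julian constrains its output to a fixed phrase set, which is why a flat dictionary lookup like this is usually enough on the robot side.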
President Obama has signed the FAA Modernization and Reform Act of 2012. The bill will allow the FAA to rebuild its air traffic control system around next-generation technology, including switching from radar to a GPS-based air traffic control system. The law will open up the skies to unmanned drones by September 2015. According to AUVSI (Association for Unmanned Vehicle Systems International), major UAS provisions in the FAA bill include:
- Setting a Sept. 30, 2015 deadline for full integration of UAS into the national airspace
- Requiring a comprehensive integration plan within nine months
- Requiring the FAA to create a five-year UAS roadmap (to be updated annually)
- Requiring that small UAS (under 55 pounds) be allowed to fly within 27 months
- Requiring six UAS test sites within six months (similar to the language in the already-passed defense bill)
- Requiring that small UAS (under 55 pounds) be allowed to fly in the U.S. Arctic, 24 hours a day, beyond line of sight, at an altitude of at least 2,000 feet, within one year
- Requiring expedited access for public users, such as law enforcement, firefighters, and emergency responders
- Allowing first responders to fly very small UAS (4.4 pounds or less) within 90 days if they meet certain requirements
- Requiring the FAA to study UAS human factors and causes of accidents
Project Romeo is being developed by Aldebaran Robotics, the same group working on the NAO. Project Romeo is a four-foot-tall humanoid designed to assist elderly and disabled individuals in their daily activities. The robot will be able to walk through a home, fetch food from the kitchen, take out the garbage, and act as a loyal companion that helps entertain its owners and keep tabs on their health. The project started in 2009, but the company hadn't released much information about it until now. Below is the first video of Project Romeo, sitting in a chair, talking and moving its arms and hands:
Ramses Martinez, Carina Fish, Xin Chen and George Whitesides have published a paper describing a soft pneumatic actuator constructed by combining paper with a silicone elastomer. On pneumatic inflation, these actuators move anisotropically, based on the motions accessible by their composite structures. They are inexpensive, simple to fabricate, light in weight, and easy to actuate. This class of structure is versatile: the same principles of design lead to actuators that respond to pressurization with a wide range of motions (bending, extension, contraction, twisting, and others). Paper, when used to introduce anisotropy into elastomers, can be readily folded into 3D structures following the principles of origami; these folded structures increase the stiffness and anisotropy of the elastomeric actuators, while being light in weight. These soft actuators can manipulate objects with moderate performance; for example, they can lift loads up to 120 times their weight. They can also be combined with other components, for example, electrical components, to increase their functionality.
Our fully autonomous intelligent vehicles will help you transform the way you move materials and route your workflows: increase throughput, eliminate material flow errors, improve traceability, maximize flexibility, and allow your employees to focus on higher-level tasks. Unlike traditional AGVs, our mobile robots navigate using the natural features of your facility and do not require expensive facility modifications or guidance infrastructure. Our AIVs can adapt to changes in their environment and work freely and safely alongside your staff. They are intelligent enough to quickly learn their environment and then automatically find the optimal path to where they need to go. They also automatically adjust to dynamic environments and can work together in fleets of up to 100 robots.