Future surgical robots might allow a surgeon to program the operation in advance and simply supervise the procedure as the robot performs most of the tasks. The possibilities for improvement and advancement are limited only by imagination and cost.
This document explains how companies can use advanced and emerging technologies to deliver superior results: prepare for the unexpected, understand your process, look at all the options, go back to the fundamentals, reengineer, gain stakeholder acceptance through a pilot program, and finally deliver. The savings are there for the taking.
The small nanobots that are being deployed to fight cancer are nothing like what we imagine. Instead of being made of metal, plastic, and circuitry, cancer nanobots are created using "DNA origami," or "folding" DNA chains to form a barrel-shaped container for a payload of cancer antibodies.
The all-cash deal for closely held Kiva will close in the second quarter, Seattle-based Amazon said today in a statement. Kiva’s orange robots, which can slide under shelves and bins of products, are used by Quidsi Inc., the company behind Soap.com and Diapers.com that Amazon acquired for about $545 million last year. Kiva, whose headquarters will remain in North Reading, Massachusetts, will help Amazon make shipping more efficient, the company said. “Amazon has long used automation in its fulfillment centers, and Kiva’s technology is another way to improve productivity by bringing the products directly to employees to pick, pack and stow,” Dave Clark, vice president of global customer fulfillment at Amazon, said in the statement. Bloomberg has the full financial details here.
'Making Things See' from O'Reilly Media / Make shows you how to build Kinect projects with inexpensive off-the-shelf components, including the open source Processing programming language and the Arduino microcontroller. Topics covered in the book include:

- Create Kinect applications on Mac OS X, Windows, or Linux
- Track people with pose detection and skeletonization, and use blob tracking to detect objects
- Analyze and manipulate point clouds
- Make models for design and fabrication, using 3D scanning technology
- Use MakerBot, RepRap, or Shapeways to print 3D objects
- Delve into motion tracking for animation and games
- Build a simple robot arm that can imitate your arm movements
- Discover how skilled artists have used Kinect to build fascinating projects

The book is available now on Amazon.
General Motors and NASA are jointly developing a robotic glove that auto workers and astronauts can wear to help do their respective jobs better while potentially reducing the risk of repetitive stress injuries. The Human Grasp Assist device, known internally in both organizations as the K-glove or Robo-Glove, resulted from NASA and GM's Robonaut 2 – or R2 – project, which launched the first humanoid robot into space in 2011. R2 is a permanent resident of the International Space Station.

When engineers, researchers and scientists from GM and NASA began collaborating on R2 in 2007, one of the design requirements was for the robot to operate tools designed for humans, alongside astronauts in outer space and factory workers on Earth. The team achieved an unprecedented level of hand dexterity on R2 by using leading-edge sensors, actuators and tendons comparable to the nerves, muscles and tendons in a human hand.

Research shows that continuously gripping a tool can cause fatigue in hand muscles within a few minutes, but initial testing of the Robo-Glove indicates the wearer can hold a grip longer and more comfortably. For example, an astronaut working in a pressurized suit outside the space station or an assembly operator in a factory might need to use 15 to 20 pounds of force to hold a tool during an operation, but with the robotic glove they might need to apply only five to 10 pounds of force.

Inspired by the finger actuation system of R2, actuators are embedded into the upper portion of the glove to provide grasping support to human fingers. The pressure sensors, similar to the sensors that give R2 its sense of touch, are incorporated into the fingertips of the glove to detect when the user is grasping a tool. When the user grasps the tool, the synthetic tendons automatically retract, pulling the fingers into a gripping position and holding them there until the sensor is released.
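The grasp-and-hold behavior described above can be sketched as a simple control rule: retract the tendons when the fingertip sensors detect a grasp, and keep them retracted until the sensors release. This is an illustrative model only; the function name, threshold values, and sensor units are assumptions, not GM/NASA code.

```python
# Illustrative sketch of the Robo-Glove grasp logic (all names and
# thresholds are hypothetical): fingertip pressure sensors trigger
# tendon retraction, and the grip holds until the sensors release.

GRASP_THRESHOLD = 2.0    # pressure at which a grasp is assumed (illustrative)
RELEASE_THRESHOLD = 0.5  # pressure below which a held grip lets go (illustrative)

def tendon_command(fingertip_pressures, currently_gripping):
    """Return True if the synthetic tendons should be retracted (gripping)."""
    if any(p >= GRASP_THRESHOLD for p in fingertip_pressures):
        return True  # grasp detected: pull fingers into a gripping position
    if currently_gripping and max(fingertip_pressures) > RELEASE_THRESHOLD:
        return True  # hold the grip until the sensors are released
    return False     # no grasp and sensors released: relax the tendons
```

Separating the grasp and release thresholds (hysteresis) keeps the glove from chattering between gripping and releasing when the pressure hovers near a single cutoff.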
Most of the medical advances that we have seen have been with pharmaceuticals, as drug companies compete to introduce new more effective drugs because the patents on many blockbuster drugs are about to expire. But the coolest advances have to do with medical equipment. The age of high-tech medicine is here with even greater advances in development.
A robot has multiple axes, so wireless switches sense position on those different axes. A Limitless™ wireless solution includes wireless switches and I/O devices that are paired and communicate with a PLC or controller interface.
Clamping applications often rely on sensors to detect whether the jaws or grippers are in the proper position (open or closed). Though other technologies can be used in place of sensors to determine the open/closed condition, sensors can increase reliability and provide data that only a detection device mounted very near the application can capture.
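One common arrangement (an illustrative sketch, not any vendor's implementation) uses two position sensors, one at each end of travel. Reading both lets a controller distinguish four states, including a fault condition a single sensor would miss:

```python
# Hypothetical gripper state logic with two position sensors:
# one detects the fully-open position, the other the fully-closed position.

def gripper_state(open_sensor: bool, closed_sensor: bool) -> str:
    """Map two end-of-travel sensor readings to a gripper state."""
    if open_sensor and closed_sensor:
        return "fault"   # both active at once: wiring or mounting problem
    if open_sensor:
        return "open"
    if closed_sensor:
        return "closed"
    return "moving"      # neither active: jaws are between positions
```

The "fault" and "moving" states are exactly the extra information the article alludes to: a sensorless approach (e.g. timing the actuator) cannot report them.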
The Warehouse Group took to the robotic industrial truck right away when they saw how it helped with their workload. On the manufacturing side, an unexpected secondary benefit was that clutter and unsightly pallets were removed from the work areas so the trucks could maneuver, providing a cleaner, safer work environment.
The Hackengineer web site has complete plans for building a portable 3D camera. The system uses a Texas Instruments DLP pico projector, Leopard Imaging's Leopardboard 365 VGA camera board, a small 2x telephoto lens, and a BeagleBoard. The system uses a technique known as structured light: a set of temporally encoded patterns is sequentially projected onto the scene. When a pattern is seen from a different viewpoint, it appears geometrically distorted by the surface shape of the object, and this distortion is used to reconstruct the depth data.
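The temporal encoding can be sketched in a few lines. Each projected pattern contributes one bit per camera pixel; stacking the observed bits identifies which projector column illuminated that pixel, and depth then follows from standard triangulation. This is a minimal sketch of the general technique, not the Hackengineer build's actual pipeline, and the baseline/focal-length figures below are made up for illustration:

```python
# Minimal sketch of temporal binary coding for structured light.

def decode_column(bits):
    """Recover a projector column index from observed pattern bits (MSB first)."""
    index = 0
    for b in bits:
        index = (index << 1) | (1 if b else 0)
    return index

def depth_from_disparity(baseline_mm, focal_px, disparity_px):
    """Triangulation: depth = baseline * focal length / disparity."""
    if disparity_px <= 0:
        raise ValueError("pixel not illuminated or not matched")
    return baseline_mm * focal_px / disparity_px

# With 3 patterns, 8 projector columns can be distinguished per pixel;
# n patterns distinguish 2**n columns, which is why a handful of frames
# suffices for a full VGA depth map.
```

In practice Gray codes are usually preferred over plain binary so that a one-bit decoding error shifts the column index by only one, but the decoding idea is the same.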
Cornell's Creative Machines Lab constructed a robot testbed capable of reconfiguring simple truss structures. The robot can add and remove truss elements as it goes. The goal of the project is to eventually have similar robots that could be used to assemble structures in difficult situations such as disaster recovery or space exploration.
Achu Wilson is building a personal robot called Chippu. Using Julian, a special version of the Julius Speech Recognition Library, he was able to recognize and execute voice commands. He details the process of getting the library working with ROS in his blog post here.
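The recognize-then-execute step typically boils down to mapping recognized phrases to robot commands inside a subscriber callback. The sketch below shows that mapping as a plain function (the phrases, command names, and velocity values are illustrative assumptions, not Achu Wilson's actual code; in a ROS node this function would be called from the callback on the recognizer's output topic):

```python
# Hypothetical phrase-to-command table for a voice-controlled robot.
# In a ROS setup, handle_utterance() would be invoked from the
# subscriber callback on the speech recognizer's result topic.

COMMANDS = {
    "move forward": ("cmd_vel", {"linear_x": 0.2, "angular_z": 0.0}),
    "turn left":    ("cmd_vel", {"linear_x": 0.0, "angular_z": 0.5}),
    "stop":         ("cmd_vel", {"linear_x": 0.0, "angular_z": 0.0}),
}

def handle_utterance(text):
    """Return the (topic, payload) to publish for a phrase, or None if unknown."""
    return COMMANDS.get(text.strip().lower())
```

Normalizing the recognized text before lookup matters in practice, since grammar-based recognizers like Julian may return phrases with varying case or surrounding whitespace.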
President Obama has signed the FAA Modernization and Reform Act of 2012. The bill will allow the FAA to rebuild its air traffic control system around next-generation technology, including switching from radar to a GPS-based air traffic control system. The law will open up the skies to unmanned drones by September 2015. According to AUVSI (Association for Unmanned Vehicle Systems International), major UAS provisions in the FAA bill include:

- Setting a 30 September 2015 deadline for full integration of UAS into the national airspace
- Requiring a comprehensive integration plan within nine months
- Requiring the FAA to create a five-year UAS roadmap (which should be updated annually)
- Requiring small UAS (under 55 pounds) to be allowed to fly within 27 months
- Requiring six UAS test sites within six months (similar to the language in the already-passed defense bill)
- Requiring small UAS (under 55 pounds) be allowed to fly in the U.S. Arctic, 24 hours a day, beyond line of sight, at an altitude of at least 2,000 feet, within one year
- Requiring expedited access for public users, such as law enforcement, firefighters, and emergency responders
- Allowing first responders to fly very small UAS (4.4 pounds or less) within 90 days if they meet certain requirements
- Requiring the FAA to study UAS human factors and causes of accidents
Project Romeo is being developed by Aldebaran Robotics, the same group working on the NAO. Project Romeo is a 4-foot-tall humanoid designed to assist elderly and disabled individuals in their daily activities. The robot will be able to walk through a home, fetching food from the kitchen, taking out the garbage, and acting as a loyal companion that helps entertain its owners and keeps tabs on their health. The project started in 2009, but the company hadn't released much information about it until now. Below is the first video of Project Romeo, sitting in a chair, talking and moving its arms and hands:
The ST Robotics Workspace Sentry robot and area safety system are based on a small module that sends an infrared beam across the workspace. If the user puts a hand (or any other object) into the workspace, the robot performs a programmable emergency deceleration and stops. Each module has three beams at different angles, and the distance a beam reaches is adjustable. Two or more modules can be daisy-chained to watch a wider area. "A robot that is tuned to stop on impact may not be safe. Robots where the trip torque can be set at low thresholds are too slow for any practical industrial application. The best system is where the work area has proximity detectors so the robot stops before impact, and that is the approach ST Robotics has taken," states President and CEO of ST Robotics David Sands.
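The stop-before-impact behavior can be modeled in a few lines: any broken beam on any daisy-chained module trips the sentry, and the controller then ramps the speed down over successive control steps. This is an illustrative model under assumed names and values, not ST Robotics firmware:

```python
# Hypothetical model of the sentry logic: each module reports three
# IR beam states (True = beam intact); any broken beam trips the system.

def sentry_tripped(modules):
    """modules: list of 3-tuples of beam states; True if any beam is broken."""
    return any(not beam for beams in modules for beam in beams)

def decelerate(speed, decel_per_step):
    """One step of programmable emergency deceleration, never going below zero."""
    return max(0.0, speed - decel_per_step)
```

The deceleration rate is the "programmable" part: it can be tuned so the arm stops within the distance the beams buy before any contact occurs.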