A research project at Harvard aims to demonstrate an autonomous multi-robot system capable of constructing 3D structures. The hardware comprises a mobile robot and specialized passive blocks; the robot can manipulate blocks to build desired structures, and can maneuver on these structures as well as in unstructured environments. To illustrate the robot's ability to perform complex tasks combining these functions, the researchers demonstrate it autonomously building a ten-block structure significantly larger than itself. The full paper can be found here.
From the video it's hard to tell what technology is behind this robot, but here's the gist of the press release (via Google Translate): the "Nodding Kabochan SMILE SUPPLEMENT ROBOT," a communication robot for the elderly, launches in mid-November. Kabochan is a nodding communication robot for the elderly in the shape of a little boy. Using five built-in sensors, it reacts by nodding and makes a variety of small talk appropriate to the season. By applying robot technology, it aims to bring safety and smiles to the elderly, to those receiving care, to caregivers, and to those working in the care business, helping to improve their quality of life.
Parallel-link robot deployments will continue to see strong growth as more and more operations across all industries embrace automation as a way to cut costs and stay competitive. When light payloads and very fast cycle times are required, whether in assembly, picking, dispensing, or any number of other applications, parallel-link robots will continue to meet the challenge.
Robotic machining has also introduced new possibilities for texturing stone surfaces, significantly enhancing their aesthetics with an innovative and pleasing artistic appearance. These textures would be difficult, if not impossible, to achieve by any other production process.
TX90 robots - Reliable staff for tough conditions in the food industry
ROSCon 2012, the first ROS developers' conference, is a weekend event comprising tech talks and tutorials that will introduce you to new ROS tools and libraries. The event takes place in St. Paul, Minnesota, May 19-20, right after the 2012 IEEE International Conference on Robotics and Automation.
The Xtion Pro Live by ASUS is similar to Microsoft's Kinect hardware. The device includes a depth sensor, an RGB camera, and a set of microphones. It is aimed at developers and is powered through its USB 2 connection rather than requiring an additional power connection like the Kinect. It's also slightly smaller than the Kinect. I Heart Robotics has received one and has additional information here. It is currently available for purchase only directly from ASUS and is on backorder.
PETMAN is an anthropomorphic robot for testing chemical protection clothing. Unlike previous suit testers, which had to be supported mechanically and had a limited repertoire of motion, PETMAN will balance itself and move freely; walking, crawling and doing a variety of suit-stressing calisthenics during exposure to chemical warfare agents.
Objet Ltd. will be demonstrating its newest 3D printer at Wired Magazine's inaugural conference in London. The Objet260 Connex can use 60 different materials and can simultaneously build 14 different materials into a single model part with 16-micron print layer accuracy.
DigInfo.tv, a Tokyo-based website, recently posted a video from a presentation by Japan's Ministry of Defense. The flying orb weighs 350 g, is 42 cm in diameter, and is made of commercially available parts costing a total of around US$1,400. The video below is from the public unveiling:
Fully automatic packaging line for parenterals in the pharmaceutical industry.
Odos imaging's 1.3 megapixel 2+3D camera can capture accurate 3D images at 100 frames per second, allowing the system to capture very fast-moving objects without degradation, even in the brightest sunlight. Combining proprietary technology with conventional 2D image capture, an Odos imaging solution provides unambiguous 3D images at video rates from a single unit. Very short, intense pulses of invisible light are used to illuminate the scene. The high intensity of the pulse minimizes the effect of ambient light and allows for outdoor operation. These pulses are reflected by objects within the scene and are detected by the image sensor. Proprietary algorithms convert the detected pulses into a distance measurement. Simultaneously, a conventional 2D image of the scene is captured. Each pixel on the sensor provides both distance and intensity information.
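The underlying pulsed time-of-flight principle can be illustrated with a minimal sketch: a light pulse travels to the object and back, so the measured round-trip delay maps to distance via d = c·t/2. This is only the textbook relationship, not Odos imaging's proprietary per-pixel algorithm.

```python
# Sketch of the pulsed time-of-flight principle: distance is half the
# round-trip travel distance of a reflected light pulse.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_delay(round_trip_seconds: float) -> float:
    """Convert a measured pulse round-trip delay to object distance in metres."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A 20 ns round trip corresponds to an object roughly 3 m away.
print(round(distance_from_delay(20e-9), 3))  # ~2.998
```

The tiny delays involved (nanoseconds per metre) are why such cameras need fast sensors and intense pulses to stand out against ambient light.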
Willingness to invest in the future yields big dividends, with a 76% return on investment in the first year.
IEEE Spectrum has an article explaining how Google's new autonomous vehicles project works. The article is based on a recent keynote presentation that Sebastian Thrun and Chris Urmson gave at the IEEE International Conference on Intelligent Robots and Systems. The article can be found here.
The IEEE International Conference on Intelligent Robots and Systems took place a few weeks ago in San Francisco. Willow Garage put together a nice montage video of some of the robots on display. Enjoy.
Join Grant Imahara in meeting with KUKA to learn how human-robot collaboration and robot learning are transforming the workplace. Is Industry 4.0 a future where robots and humans all hold hands? Tune in to see.