Odos imaging's 1.3-megapixel 2+3D camera can capture accurate 3D images at 100 frames per second, allowing the system to capture very fast-moving objects without degradation, even in the brightest sunlight. Combining proprietary technology with conventional 2D image capture, an Odos imaging solution provides unambiguous 3D images at video rates from a single unit. Very short, intense pulses of invisible light illuminate the scene; the high intensity of each pulse minimizes the effect of ambient light and allows outdoor operation. The pulses are reflected by objects in the scene and detected by the image sensor, and proprietary algorithms convert the detected pulses into a distance measurement. A conventional 2D image of the scene is captured simultaneously, so each pixel on the sensor provides both distance and intensity information.
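The core idea behind pulsed time-of-flight ranging is simple: the round-trip travel time of each light pulse is converted into a distance. Odos imaging's actual algorithms are proprietary, but the underlying conversion can be sketched as:

```python
# Sketch: converting a measured pulse round-trip time into a distance,
# the basic principle behind pulsed time-of-flight imaging.
# (Illustrative only; the camera's real processing is proprietary.)

C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance to the reflecting object: light covers the path twice."""
    return C * round_trip_s / 2.0

# A pulse returning after roughly 6.67 nanoseconds corresponds to about 1 m:
d = tof_distance(6.671e-9)
```

The nanosecond timescales involved are why very short, precisely timed pulses are needed for centimeter-level accuracy.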
IEEE Spectrum has an article explaining how Google's new autonomous vehicles project works. The article is based on a recent keynote that Sebastian Thrun and Chris Urmson gave at the IEEE International Conference on Intelligent Robots and Systems. The article can be found here.
The IEEE International Conference on Intelligent Robots and Systems took place a few weeks ago in San Francisco. Willow Garage put together a nice montage video of some of the robots on display. Enjoy.
Torsten Kröger of Stanford programmed a robot arm to play the block-stacking game Jenga in order to demonstrate the potential of multi-sensor integration in industrial manipulation. The robot's record height was 28 stages: ten additional stages, consisting of 29 blocks, placed on top of the original 18-stage tower.
Introducing Switchblade, from the Coordinated Robotics Lab at the University of California, San Diego. The treads provide traction over a variety of terrain, but Switchblade has another trick up its sleeve: each tread assembly can pivot relative to the central chassis. This lets the robot shift its center of mass to climb over obstacles and, using internal sensors, balance on the ends of its treads to stand upright. Video from the onboard camera is streamed to a remote computer for teleoperation. The control system is robust to external disturbances, and the robot returns to its original position if knocked out of the way.
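Balancing upright on the tread tips is an inverted-pendulum problem. Switchblade's actual controller isn't described in detail, but the kind of feedback loop involved can be sketched as a simple PD controller on the measured tilt angle (gains and dynamics below are made up for illustration):

```python
# Minimal sketch of tilt-angle feedback for an inverted-pendulum balancer.
# Gains, mass geometry, and timestep are illustrative assumptions, not
# Switchblade's real control parameters.

def pd_torque(angle: float, rate: float, kp: float = 40.0, kd: float = 8.0) -> float:
    """Corrective command opposing the current tilt and tilt rate."""
    return -(kp * angle + kd * rate)

def simulate(angle: float = 0.2, rate: float = 0.0,
             dt: float = 0.01, steps: int = 300) -> float:
    """Euler-integrate a linearized inverted pendulum under PD control."""
    g_over_l = 9.81 / 0.5  # gravity over pendulum length (illustrative)
    for _ in range(steps):
        accel = g_over_l * angle + pd_torque(angle, rate)
        rate += accel * dt
        angle += rate * dt
    return angle  # tilt decays toward zero under these gains
```

The derivative term is what gives the robot its resistance to being knocked: a sudden tilt rate produces an immediate opposing command before the angle itself grows large.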
Dr. Cory Kidd's Autom is a robotic personal weight-loss coach. A person records their daily diet and exercise routine on the robot's touch screen, and Autom gives them vocal encouragement and feedback. The promotional video is below, and more information is available here.
Microsoft Robotics Developer Studio 4 Beta is a freely available .NET-based programming environment for building robotics applications. It can be used by professional and non-professional developers as well as hobbyists. Microsoft also released a Reference Platform Design specification. Based on the reference platform, Parallax.com is manufacturing a unit called Eddie, which ships in October and is available for pre-order now here.
One of the biggest challenges in prosthetic hand development is designing a method that would let prosthetic hands transmit haptic information — the sense of touch — to patients. Machine Design magazine has an article about Kinea Design's new approach that provides wearers with more sensory information, including contact pressure, friction, texture, and temperature. The full article can be read here.
Armin Hornung has made major improvements to the OctoMap 3D mapping library. Scan insertions are now twice as fast as before, enabling real-time map updates, and iterators now allow flexible, efficient tree traversals. The new ROS interface provides conversions from the most common ROS datatypes, and octomap_server has been updated for incremental 3D mapping. Armin also worked on a dynamically updatable collision map for tabletop manipulation: the collider package uses OctoMap to provide map updates from laser and dense-stereo sensors at a rate of about 10 Hz. The complete summary is available here.
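At the heart of OctoMap-style mapping is a probabilistic occupancy update: each voxel accumulates log-odds evidence as sensor rays hit or pass through it. Here is a conceptual sketch of that update rule using a plain dictionary of voxels (not OctoMap's actual octree data structure or its C++/ROS API; the log-odds constants are illustrative):

```python
# Conceptual sketch of log-odds occupancy updates, the core idea behind
# OctoMap-style mapping. A flat dict stands in for the real octree, and
# the update constants are illustrative assumptions.

from collections import defaultdict

LOG_ODDS_HIT = 0.85    # evidence added when a sensor ray ends in a voxel
LOG_ODDS_MISS = -0.4   # evidence removed when a ray passes through it
RESOLUTION = 0.05      # voxel edge length in metres

def voxel_key(x, y, z):
    """Quantize a 3D point to its containing voxel."""
    return (int(x // RESOLUTION), int(y // RESOLUTION), int(z // RESOLUTION))

class OccupancyMap:
    def __init__(self):
        self.log_odds = defaultdict(float)  # 0.0 == unknown

    def integrate_hit(self, point):
        self.log_odds[voxel_key(*point)] += LOG_ODDS_HIT

    def integrate_miss(self, point):
        self.log_odds[voxel_key(*point)] += LOG_ODDS_MISS

    def is_occupied(self, point, threshold=0.0):
        return self.log_odds[voxel_key(*point)] > threshold
```

Because misses decrement the same counter that hits increment, the map stays dynamically updatable: a voxel once occupied by a grasped object is cleared again after enough rays pass through it, which is exactly what a collision map for tabletop manipulation needs.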
"3D Scan 2.0" is a project at TU Bergakademie Freiberg that uses the Microsoft Kinect and a set of AR markers as a 3D scanner. Guided by the AR markers for positioning, you move the Kinect camera around the object, collecting point clouds that are then assembled into a solid mesh using Poisson surface reconstruction. Further information, along with the source code, is available at the project homepage.
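The registration step this enables is straightforward: once the AR markers give a pose estimate for each Kinect viewpoint, every captured cloud can be transformed into one common world frame before reconstruction. A minimal sketch with a hypothetical pose (the project's actual pipeline and pose format are on its homepage):

```python
# Sketch: transforming a point cloud from the camera frame into a shared
# world frame using a 4x4 marker-derived pose. The pose values here are
# invented for illustration.

import numpy as np

def to_world(points: np.ndarray, camera_pose: np.ndarray) -> np.ndarray:
    """Apply a 4x4 camera-to-world pose to an (N, 3) point cloud."""
    n = points.shape[0]
    homo = np.hstack([points, np.ones((n, 1))])   # homogeneous coords, (N, 4)
    return (camera_pose @ homo.T).T[:, :3]

# Hypothetical pose: camera translated 1 m along x, no rotation.
pose = np.eye(4)
pose[0, 3] = 1.0
cloud = np.array([[0.0, 0.0, 2.0]])   # one point, 2 m ahead of the camera
world = to_world(cloud, pose)         # -> [[1.0, 0.0, 2.0]]
```

With all clouds expressed in the same frame, Poisson surface reconstruction can fit a single watertight mesh through the merged points.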
Hizook has an article featuring examples of robots that use simple vibration motors to achieve steerable motion. The site is also looking into producing and selling a tiny (18 mm long) IR-controlled steerable vibrobot originally designed by Naghi Sotoudeh. The article can be found here; be sure to leave them a comment if you would be interested in purchasing a Hizook robot.
In Cornell's Personal Robotics Laboratory, a team led by Ashutosh Saxena, assistant professor of computer science, is teaching robots to manipulate objects and find their way around new environments. The researchers trained a robot by giving it 24 office scenes and 28 home scenes in which most objects had been labeled. The computer examines features such as color, texture, and what is nearby, and decides what characteristics all objects with the same label have in common. In a new environment, it compares each segment of its scan with the objects in its memory and chooses the label with the best fit.
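That "best fit" step can be sketched as a nearest-neighbor comparison over feature vectors. The features below are toy numeric stand-ins for the color, texture, and context cues described above, and the vectors and labels are invented for illustration; the lab's actual model is more sophisticated:

```python
# Toy sketch of best-fit labeling: a new scene segment takes the label of
# the closest labeled training segment in feature space. Feature vectors
# and labels are invented for illustration.

import math

def classify(segment, labeled_segments):
    """Return the label of the training segment with the smallest distance."""
    features, label = min(labeled_segments,
                          key=lambda item: math.dist(segment, item[0]))
    return label

# Hypothetical training data: (feature vector, label) pairs.
training = [
    ((0.9, 0.1, 0.2), "monitor"),
    ((0.1, 0.8, 0.7), "mug"),
]
label = classify((0.85, 0.15, 0.25), training)   # -> "monitor"
```

Including "what is nearby" as a feature is what lets context help: a keyboard-like segment next to a monitor-like one fits the office training scenes far better than either segment would alone.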
Handroid is a new robotic hand made by the Japanese company ITK. It uses a system of tendon-like wires whose differential contraction moves each digit with precision. ITK plans to commercialize the Handroid in about two years, at about $6,500 per unit.
James Gosling -- the so-called "father of Java" -- left Google on Tuesday to join a company that is looking to scatter thousands of robots around the Earth's oceans. Gosling will become chief software architect for Sunnyvale startup Liquid Robotics, a 4-year-old company that places 7-foot-long robots resembling surfboards in the ocean to collect and transmit data for a variety of uses. Called Wave Gliders, the devices are powered by wave energy, with the constant up-and-down motion providing energy that pulls the robots through the ocean.
ISO recently announced the ISO 10218-1 standard for robots and the ISO 10218-2 standard for robot systems and integration. For more on what has changed from the older standard and what has been added, read more here.
Robotmaster V6 adds a unique integration of user control, speed, and flexibility to Robotmaster's automation and optimization tools. The new, intuitive V6 interface gives robot programmers a coherent and dynamic tool that saves substantial programming time and money.