Using 3D data for autonomous robots.

Adaptive Automation

Contributed by | IDS Imaging Development Systems

Until recently, robots were "blind" command receivers that followed predefined, fixed paths. With 3D data, robots can adapt flexibly to the situation at hand and react to their surroundings. A promise is becoming reality: the robot is turning into an autonomous co-worker. The benefits are fast retooling, a high variance of workpieces, simple teach-in, and simplified part feeding, all at a consistently high degree of automation.

Every step in the process is planned, every eventuality ruled out. Thanks to automation, large quantities can be produced extremely efficiently, and a high degree of specialization improves efficiency further. However, this specialized but expensive equipment falls short on flexibility and rapid retooling: producing a small batch of alternative parts is simply not cost-effective, because every step in the process would need to be adapted. Small batches are therefore often produced laboriously by hand. That may be flexible and save investment, but it is a slow and inconsistent process.

 

Robots adapt according to the situation

The development of 3D cameras and 3D-capable software has opened the door for the industry to entirely new machine vision applications. With 3D vision, tasks can be solved that were not possible in 2D.

One robot removes unsorted, overlapping T-pieces for tubing directly from a small transport box, safely and reliably. Another depalletizes large aluminum parts directly onto a conveyor belt: the delicate movements of its robust gripper find a firm hold at the first attempt, without the slightest collision with the workpiece, even though the parts on the worn or dirty pallets often sit skewed or leaning because of excess casting flash. Robotics has had to step up its game considerably for this kind of bin picking and correctly positioned part transfer.

The user can easily change the order of the process steps using MIKADO ARC

 

The Freiburg-based systems integrator isys vision has developed a solution for this called "MIKADO Adaptive Robot Control" (ARC for short). It is a configurable robot controller with its own collision-free path planning. Its own inverse kinematics calculates the joint angles of the robot arm for gripping positions and traverse paths. 3D information such as the workpiece's shape, position, and orientation, or a virtual image of the surroundings, serves as the reference for these complex calculations. A large number of robots available on the market can be controlled with MIKADO ARC, making time-consuming programming unnecessary. Parts can be changed over quickly, so even small batches can be produced with this robot-assisted material handling.
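MIKADO ARC's kinematics and path planner are proprietary, so the following is only a minimal sketch of the underlying idea: closed-form inverse kinematics for a simple two-link planar arm, written in Python with hypothetical link lengths and a hypothetical target point. A gripping position derived from the 3D data goes in, joint angles come out.

import math

def two_link_ik(x, y, l1, l2):
    """Closed-form inverse kinematics for a planar two-link arm.

    Given a target gripping position (x, y) and link lengths l1, l2,
    return the joint angles (theta1, theta2) in radians, or None if
    the target is out of reach.
    """
    r2 = x * x + y * y
    # Law of cosines gives the elbow angle.
    c2 = (r2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if c2 < -1.0 or c2 > 1.0:
        return None  # target outside the workspace
    theta2 = math.acos(c2)  # "elbow-down" solution
    # Shoulder angle: direction to the target minus the offset caused by link 2.
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

# Hypothetical example: grip point 0.5 m in front of and 0.3 m above the base,
# with two 0.4 m links.
angles = two_link_ik(0.5, 0.3, 0.4, 0.4)
if angles:
    print("joint angles [deg]:", [round(math.degrees(a), 1) for a in angles])

A six-axis industrial robot involves six joint angles and is typically solved numerically or with manufacturer-specific closed-form solutions, and a real controller additionally checks every candidate pose against a collision model of the surroundings, but the principle is the same.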

 

3D cameras capture the situation

The quality of the source data is crucial for optimal control of the robot. The integrator chooses a suitable 3D camera technology for each project and application. This decision depends not only on the general suitability of a method, but also on cost, precision, speed, and the reliability of the data acquisition.
 
Weighing the pros and cons of classical methods such as time-of-flight (ToF), stereo vision, or laser triangulation only serves as an initial selection. Many of the 3D cameras used today are hybrid systems that combine characteristics of several methods to cover a broader range of uses and improve the results.

With a variable baseline and a 100 W texture projector, stereo vision cameras of the Ensenso X series achieve working distances of up to 5 meters and can capture objects with volumes of several cubic meters.

 
isys vision uses Ensenso 3D stereo vision cameras for bin picking and material handling. These cameras consist of two area scan cameras that work according to the stereo vision principle in conjunction with a powerful pattern projector, yielding robust 3D data even for workpieces with difficult surfaces. The compact cameras of the N series are particularly suitable for close range and are mostly mounted directly on the robot head as a mobile eye. The new X series system from IDS Imaging Development Systems GmbH, with its flexible baseline and choice of cameras, captures large volumes from greater distances and is ideally suited for unsorted material handling from large wire-mesh pallets. Thanks to its 100 W LED, the projector generates the finest textures on the workpiece surface even at working distances of 5 meters, so the system does not depend on the ambient light and allows short exposure times. 3D resolutions of a few millimeters are possible with just 1-2 image pairs. With short exposure times, a small number of images, and extremely fast stereo matching algorithms, the 3D data is ready for further processing after just 500 ms, enabling very short cycle times in material handling.
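The Ensenso software performs the stereo matching internally; the sketch below only illustrates the general principle of turning a rectified image pair into a disparity map and metric depth, here with OpenCV's semi-global matcher. The focal length, baseline, and file names are hypothetical placeholders, not Ensenso parameters.

import cv2
import numpy as np

# Hypothetical rectified left/right images of one stereo pair.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Semi-global block matching: the dense correspondence search that
# turns an image pair into a disparity map.
matcher = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=128,   # search range; must be divisible by 16
    blockSize=5,
    P1=8 * 5 * 5,
    P2=32 * 5 * 5,
)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0

# Convert disparity to metric depth: Z = f * B / d
# (f = focal length in pixels, B = stereo baseline in meters; both hypothetical).
f_px, baseline_m = 2400.0, 0.25
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = f_px * baseline_m / disparity[valid]
print("median scene depth [m]:", np.median(depth_m[valid]))

The projected texture mentioned above matters precisely here: the correspondence search needs local image structure, and on bare metal or dark plastic the projector supplies it.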
 
The additional benefit of using two area scan cameras is obvious. Besides the 3D data obtained by stereo vision, reference features of the scene can also be captured in the raw images of the area scan cameras and used to continuously readjust the vision system. The process results remain constant and robust, and recurring checks or time-consuming recalibration of the stereo vision system are no longer necessary.
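The article does not detail how this readjustment is implemented. One plausible sketch, assuming a fixed reference feature (for example a fiducial on the bin edge) is visible in both raw images, is to triangulate it regularly and compare the result with the position stored at commissioning time; a growing deviation indicates calibration drift. All names and thresholds below are hypothetical.

import cv2
import numpy as np

def triangulate_reference(P_left, P_right, pt_left, pt_right):
    """Triangulate one reference feature from its pixel coordinates
    in the raw left/right images.

    P_left, P_right: 3x4 projection matrices from the stereo calibration.
    pt_left, pt_right: (x, y) pixel coordinates of the same feature.
    """
    pl = np.array(pt_left, dtype=np.float64).reshape(2, 1)
    pr = np.array(pt_right, dtype=np.float64).reshape(2, 1)
    X_h = cv2.triangulatePoints(P_left, P_right, pl, pr)  # homogeneous 4x1
    return (X_h[:3] / X_h[3]).ravel()  # metric 3D point

# Hypothetical monitoring step: compare today's reconstruction of the fixed
# marker against the reference position stored at commissioning time.
# X_now = triangulate_reference(P_left, P_right, pt_l, pt_r)
# drift_mm = np.linalg.norm(X_now - X_reference) * 1000.0
# if drift_mm > 1.0:
#     print("calibration drift detected:", drift_mm, "mm")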
 
With capabilities such as bin picking and correctly positioned part feeding, robotics in combination with MIKADO ARC and Ensenso 3D cameras closes the gap to adaptive automation. Even small-batch production can be automated easily and cost-effectively.
 
 

IDS Imaging Development Systems Inc.

IDS is a leading manufacturer of industrial cameras "Made in Germany" with USB or GigE interfaces. Equipped with state-of-the-art CMOS sensors, the extensive camera portfolio ranges from low-cost project cameras to small, powerful models with PoE functionality and robust cameras with housings that meet the requirements of protection classes IP65/67. For quick, easy, and precise 3D machine vision tasks, IDS offers the Ensenso series. With the novel vision app-based sensors and cameras of IDS NXT, the company opens up a new dimension in image processing. Whether in an industrial or non-industrial setting, IDS cameras and sensors help companies worldwide optimize processes, ensure quality, drive research, conserve raw materials, and serve people. They provide reliability, efficiency, and flexibility for your application.


