A research group at the University of Klagenfurt has designed a real-time capable drone based on object-relative navigation using artificial intelligence. Also on board: a USB3 Vision industrial camera from the uEye LE family from IDS Imaging Development Systems GmbH.

Inspection of critical infrastructure using intelligent drones

Case study from IDS Imaging Development Systems GmbH

The inspection of critical infrastructure such as energy plants, bridges or industrial complexes is essential to ensure its safety, reliability and long-term functionality. Traditional inspection methods often require personnel to work in hard-to-reach or hazardous areas. Autonomous mobile robots offer great potential for making inspections more efficient, safer and more accurate. Uncrewed aerial vehicles (UAVs) such as drones in particular have established themselves as promising platforms, as they can be deployed flexibly and can reach even hard-to-access areas from the air. One of the biggest challenges is navigating the drone precisely relative to the objects under inspection in order to reliably capture high-resolution images or other sensor data.
As part of the research project, which was funded by the Austrian Federal Ministry for Climate Action, Environment, Energy, Mobility, Innovation and Technology (BMK), the drone must autonomously recognize what is a power pole and what is an insulator on that pole. It is to fly around the insulator at a distance of three meters and take pictures. "Precise localization is important so that the camera recordings can be compared across multiple inspection flights," explains Thomas Georg Jantos, PhD student and member of the Control of Networked Systems research group at the University of Klagenfurt. The prerequisite is that object-relative navigation can extract so-called semantic information about the objects in question from the raw sensor data captured by the camera. Semantic information makes raw data, in this case the camera images, "understandable": it makes it possible not only to capture the environment, but also to correctly identify and localize relevant objects.
Specifically, this means that an image pixel is not understood merely as an independent color value (e.g. an RGB value), but as part of an object, such as an insulator. In contrast to classic GNSS (Global Navigation Satellite System) navigation, this approach provides not just a position in space, but a precise relative position and orientation with respect to the object under inspection (e.g. "the drone is 1.5 m to the left of the upper insulator").
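The idea of treating pixels as parts of objects can be illustrated with a toy semantic segmentation result. The sketch below is purely illustrative and not the project's AI model: it assumes a hypothetical per-pixel label mask and shows how an object's pixel region yields a centroid that can anchor relative navigation.

```python
import numpy as np

# Class IDs for a hypothetical semantic segmentation output
BACKGROUND, POLE, INSULATOR = 0, 1, 2

def object_pixel_info(mask, class_id):
    """Return pixel count and centroid (row, col) of one class in a label mask."""
    ys, xs = np.nonzero(mask == class_id)
    if len(xs) == 0:
        return 0, None
    return len(xs), (ys.mean(), xs.mean())

# Toy 6x8 label mask: a pole column with an insulator blob attached to it
mask = np.zeros((6, 8), dtype=int)
mask[:, 4] = POLE
mask[1:3, 3:6] = INSULATOR

count, centroid = object_pixel_info(mask, INSULATOR)
# The centroid's offset from the image center is one cue the navigation
# stack could use to keep the camera pointed at the insulator.
```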
The key requirement is that image processing and data interpretation run with minimal latency, so that the drone can adapt its navigation and interaction to the specific conditions and requirements of the inspection task in real time.


Thomas Jantos with the inspection drone - Photo: aau/Müller


Semantic information through intelligent image processing

Object recognition, object classification and object pose estimation are performed using artificial intelligence in image processing. "In contrast to GNSS-based inspection approaches using drones, our AI with its semantic information enables the infrastructure to be inspected from certain reproducible viewpoints," explains Thomas Jantos. "In addition, the chosen approach does not suffer from the usual GNSS problems such as multipath effects and shadowing caused by large infrastructures or valleys, which can lead to signal degradation and therefore safety risks."
 

A USB3 uEye LE serves as the quadcopter's navigation camera


How much AI fits into a small quadcopter?

The hardware setup consists of a TWINs Science Copter platform equipped with a Pixhawk PX4 autopilot, an NVIDIA Jetson Orin AGX 64GB DevKit as the on-board computer and a USB3 Vision industrial camera from IDS. "The challenge is to get the artificial intelligence onto these small aircraft. The computers on the drone are still too slow compared to those used to train the AI. Despite the first successful tests, this remains a subject of ongoing research," says Thomas Jantos, describing the problem of further optimizing the high-performance AI model for use on the on-board computer.
The camera, on the other hand, delivered ideal basic data straight away, as tests in the university's own drone hall showed. Selecting a suitable camera model was not just a matter of meeting requirements for speed, size, protection class and, not least, price. "The camera's capabilities are essential for the inspection system's innovative AI-based navigation algorithm," says Thomas Jantos. He opted for the U3-3276LE C-HQ model, a space-saving and cost-effective project camera from the uEye LE family. Its Sony Pregius IMX265 sensor, regarded as one of the best CMOS image sensors in the 3 MP class, delivers a resolution of 3.19 megapixels (2064 x 1544 px) at frame rates of up to 58.0 fps. Decisive for the sensor's performance is its 1/1.8" global shutter, which, unlike a rolling shutter, produces no 'distorted' images at short exposure times. "To ensure a safe and robust inspection flight, high image quality and frame rates are essential," explains Thomas Jantos. As a navigation camera, the uEye LE provides the embedded AI with the image data the on-board computer needs to calculate the relative position and orientation with respect to the object under inspection. Based on this information, the drone can correct its pose in real time.
The IDS camera is connected to the on-board computer via a USB3 interface. "With the help of the IDS peak SDK, we can integrate the camera and its functionalities very easily into the ROS (Robot Operating System) and therefore into our drone," explains Thomas Jantos. In addition, IDS peak enables efficient raw image processing and simple adjustment of parameters such as auto exposure, auto white balancing, auto gain and image downsampling.
To ensure a high level of autonomy, control, mission management, safety monitoring and data recording, the researchers use the source-available CNS Flight Stack on the on-board computer. The CNS Flight Stack includes software modules for navigation, sensor fusion and control algorithms and allows the autonomous execution of reproducible and adjustable missions. "The modularity of the CNS Flight Stack and the ROS interfaces enable us to seamlessly integrate our sensors and the AI-based 'state estimator' for position detection into the entire stack and thus realize autonomous UAV flights. The functionality of our approach is being investigated and developed using the example of an inspection flight around a power pole in the drone hall at the University of Klagenfurt," explains Thomas Jantos.
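A mission of the kind described, an autonomous orbit around an insulator at the required 3 m distance, can be sketched as a simple waypoint generator. This is an illustrative sketch, not part of the CNS Flight Stack; the insulator position, altitude handling and waypoint count are assumed.

```python
import math

def orbit_waypoints(center, radius=3.0, n=12):
    """Generate n waypoints on a circle of `radius` around center (x, y, z).

    Each waypoint carries a yaw angle pointing back at the center, so the
    camera keeps facing the inspection object throughout the orbit.
    """
    cx, cy, cz = center
    waypoints = []
    for i in range(n):
        theta = 2.0 * math.pi * i / n
        x = cx + radius * math.cos(theta)
        y = cy + radius * math.sin(theta)
        yaw = math.atan2(cy - y, cx - x)  # face the insulator
        waypoints.append((x, y, cz, yaw))
    return waypoints

# Example: orbit a hypothetical insulator at (10, 5, 8) at a 3 m distance
wps = orbit_waypoints((10.0, 5.0, 8.0), radius=3.0, n=12)
```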


Visualisation of the flight path of an inspection flight around an electricity pole model with three insulators in the research laboratory at the University of Klagenfurt


Precise, autonomous alignment through sensor fusion

The high-frequency control signals for the drone are generated by the IMU (Inertial Measurement Unit). Sensor fusion with camera data, LIDAR or GNSS (Global Navigation Satellite System) enables real-time navigation and stabilization of the drone, for example for position corrections or precise alignment with inspection objects. For the Klagenfurt drone, the IMU of the PX4 serves as the dynamic model in an EKF (Extended Kalman Filter). The EKF predicts where the drone should be now based on the last known position, velocity and attitude. New measurements (e.g. from the IMU, GNSS or camera) are then captured at up to 200 Hz and incorporated into the state estimation process.
The camera captures raw images at 50 fps and an image size of 1280 x 960 px. "This is the maximum frame rate that we can achieve with our AI model on the drone's on-board computer," explains Thomas Jantos. When the camera is started, an automatic white balance and gain adjustment are carried out once, while automatic exposure control remains switched off. The EKF compares each prediction with the latest measurement and corrects the estimate accordingly. This keeps the drone stable and allows it to maintain its position autonomously with high precision.
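The predict-and-correct cycle described above can be sketched with a minimal one-dimensional Kalman filter: high-rate prediction from a motion model, lower-rate position corrections from the camera. This linear sketch is illustrative only; the real system is an EKF fusing full 6-DoF pose, and all rates and noise values here are assumptions.

```python
import numpy as np

dt = 1.0 / 200.0                       # 200 Hz prediction rate
F = np.array([[1.0, dt], [0.0, 1.0]])  # state transition (position, velocity)
H = np.array([[1.0, 0.0]])             # camera measures position only
Q = np.eye(2) * 1e-4                   # process noise covariance
R = np.array([[1e-2]])                 # measurement noise covariance

x = np.array([[0.0], [0.0]])           # initial state estimate
P = np.eye(2)                          # initial state covariance

true_pos = 1.0                         # drone hovering at 1 m (constant)
for step in range(400):
    # Predict: propagate state and covariance with the motion model
    x = F @ x
    P = F @ P @ F.T + Q
    if step % 4 == 0:                  # camera correction at 50 Hz
        z = np.array([[true_pos]])     # noiseless measurement, for clarity
        y = z - H @ x                  # innovation (measurement residual)
        S = H @ P @ H.T + R            # innovation covariance
        K = P @ H.T @ np.linalg.inv(S) # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P

# x[0, 0] converges toward the measured position of 1.0
```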
 

An electricity pole with insulators in the drone hall at the University of Klagenfurt is used for test flights


Outlook

"With regard to research in the field of mobile robots, industrial cameras are necessary for a variety of applications and algorithms. It is important that these cameras are robust, compact, lightweight, fast and have a high resolution. On-device pre-processing (e.g. binning) is also very important, as it saves valuable computing time and resources on the mobile robot," emphasizes Thomas Jantos.
With these features, IDS cameras are helping this promising research approach set a new standard in the autonomous inspection of critical infrastructure, significantly increasing safety, efficiency and data quality.


Client

The Control of Networked Systems (CNS) research group is part of the Institute for Intelligent System Technologies. It is involved in teaching in the English-language Bachelor's and Master's programs "Robotics and AI" and "Information and Communications Engineering (ICE)" at the University of Klagenfurt. The group’s research focuses on control engineering, state estimation, path and motion planning, modeling of dynamic systems, numerical simulations and the automation of mobile robots in a swarm.
 

Camera


uEye LE - the cost-effective, space-saving project camera
Model used:
USB3 Vision Industrial camera U3-3276LE Rev.1.2
Camera family: uEye LE
 

About IDS Imaging Development Systems GmbH
IDS Imaging Development Systems GmbH is a leading manufacturer of industrial cameras and a pioneer in industrial image processing. The owner-managed company develops modular concepts for powerful and versatile USB, GigE and 3D cameras as well as models with artificial intelligence (AI). Its almost unlimited range of applications spans numerous industrial and non-industrial sectors, from equipment and plant engineering to mechanical engineering. The AI image processing platform IDS NXT is extremely versatile and opens up new areas of application where classic rule-based image processing reaches its limits. With visionpier, IDS operates an online marketplace that brings together suppliers of ready-made image processing solutions and interested end customers in a targeted manner.
Since its foundation in 1997 as a two-man company, IDS has developed into an independent, ISO and environmentally certified family business with around 350 employees. The headquarters in Obersulm, Germany, is both a development and production site. With branches and representative offices in the USA, Japan, South Korea, the UK, France and the Netherlands, the technology company is also globally represented.
 
The content & opinions in this article are the author’s and do not necessarily represent the views of RoboticsTomorrow
IDS Imaging Development Systems Inc.


World-class image processing and industrial cameras "Made in Germany". Machine vision systems from IDS are powerful and easy to use. IDS is a leading provider of area scan cameras with USB and GigE interfaces, 3D industrial cameras and industrial cameras with artificial intelligence. Industrial monitoring cameras with streaming and event recording complete the portfolio. One of IDS's key strengths is customized solutions. An experienced project team of hardware and software developers makes almost anything technically possible to meet individual specifications - from custom design and PCB electronics to specific connector configurations. Whether in an industrial or non-industrial setting: IDS cameras and sensors assist companies worldwide in optimizing processes, ensuring quality, driving research, conserving raw materials, and serving people. They provide reliability, efficiency and flexibility for your application.


