OSU researcher part of DARPA grant for autonomous drone swarms

Steve Lundeberg | OSU

An Oregon State University computer science professor is part of a team that will receive up to $7.1 million to develop a drone swarm infrastructure to help the U.S. military in urban combat.

The contract is part of the Defense Advanced Research Projects Agency’s OFFSET program, short for Offensive Swarm-Enabled Tactics. The program’s goal, according to DARPA’s website, is “to empower … troops with technology to control scores of unmanned air and ground vehicles at a time.”

Julie A. Adams of OSU’s College of Engineering is on one of two teams of “swarm systems integrators” whose job is to develop the system infrastructure and integrate the work of the “sprint” teams that will focus on swarm tactics, swarm autonomy, human-swarm teaming, physical experimentation and virtual environments.

Adams’ team is led by Raytheon BBN, a key research and development arm of major defense contractor Raytheon Company. The team also includes Smart Information Flow Technologies, a research and development firm. Northrop Grumman, an aerospace and defense technology company, heads the other team of integrators.

Adams, the associate director for deployed systems and policy at the college’s Collaborative Robotics and Intelligent Systems Institute, is the only university-based principal investigator on either team of integrators.

Researchers envision swarms of more than 250 autonomous vehicles – multi-rotor aerial drones and ground rovers – to gather information and assist troops in “concrete canyon” surroundings where line-of-sight, satellite-based communication is impaired by buildings.

The information the swarms collect can help keep both U.S. troops and civilians in the battle areas safer.

“I specifically will work on swarm interaction grammar – how we take things like flanking or establishing a perimeter and create a system of translations that will allow someone to use those tactics,” Adams said. “We want to be able to identify algorithms to go with the tactics and tie those things together, and also identify how operators interact with the use of a particular tactic.”
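
To make the idea of an interaction grammar concrete, here is a minimal, hypothetical sketch in Python of how a named tactic such as “establish perimeter” might be bound to a parameterized algorithm and invoked by an operator. The names, structure and algorithm are invented for illustration and are not the OFFSET team’s actual software.

```python
# Hypothetical sketch of a swarm "interaction grammar": operator-facing tactic
# names bound to parameterized algorithms. Not the OFFSET team's software.
import math
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

Position = Tuple[float, float]  # simple 2-D position for illustration

@dataclass
class Tactic:
    name: str                                   # e.g. "establish_perimeter", "flank"
    algorithm: Callable[..., List[Position]]    # turns a request into per-agent goals
    required_params: List[str]                  # what an operator must supply

def perimeter_algorithm(center: Position, radius: float, n_agents: int) -> List[Position]:
    """Assign evenly spaced goal points on a circle around a target."""
    return [(center[0] + radius * math.cos(2 * math.pi * i / n_agents),
             center[1] + radius * math.sin(2 * math.pi * i / n_agents))
            for i in range(n_agents)]

# The "grammar": the tactic vocabulary the operator can invoke.
GRAMMAR: Dict[str, Tactic] = {
    "establish_perimeter": Tactic("establish_perimeter", perimeter_algorithm,
                                  ["center", "radius"]),
}

def execute(tactic_name: str, n_agents: int, **params) -> List[Position]:
    """Translate an operator's tactic request into per-agent goal positions."""
    tactic = GRAMMAR[tactic_name]
    missing = [p for p in tactic.required_params if p not in params]
    if missing:
        raise ValueError(f"tactic '{tactic_name}' needs parameters: {missing}")
    return tactic.algorithm(n_agents=n_agents, **params)

# Example: an operator asks eight drones to ring a point of interest.
goals = execute("establish_perimeter", n_agents=8, center=(0.0, 0.0), radius=25.0)
```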

“Our focus is on the individuals who will be deployed with the swarms, and our intent is to develop enhanced interactive capabilities: speech, gestures, a head tilt, tactile interaction. If a person is receiving information from a swarm, he or she might have a belt that vibrates. We want to make the interaction immersive and more understandable for humans and enable them to interact with the swarm.”
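
As a rough illustration of that kind of multimodal interaction, the hypothetical sketch below routes speech or gesture input to the same tactic request and encodes swarm status as a vibration pattern for a belt-style tactile display. All names and mappings are invented for illustration, not the project’s interface.

```python
# Hypothetical sketch of multimodal operator input: speech and gesture both
# resolve to the same tactic name, and swarm status comes back as a vibration
# pattern for a belt-style tactile display. Invented names, not the project's.
from typing import Callable, Dict, Optional

def recognize_speech(utterance: str) -> Optional[str]:
    """Stand-in for a real speech recognizer: phrase -> tactic name."""
    phrases = {"form a perimeter": "establish_perimeter", "flank left": "flank"}
    return phrases.get(utterance.lower())

def recognize_gesture(gesture: str) -> Optional[str]:
    """Stand-in for a real gesture recognizer: gesture label -> tactic name."""
    gestures = {"circle_motion": "establish_perimeter", "sweep_left": "flank"}
    return gestures.get(gesture)

RECOGNIZERS: Dict[str, Callable[[str], Optional[str]]] = {
    "speech": recognize_speech,
    "gesture": recognize_gesture,
}

def dispatch(modality: str, payload: str) -> str:
    """Map whichever modality the operator used onto a single tactic request."""
    tactic = RECOGNIZERS[modality](payload)
    if tactic is None:
        raise ValueError(f"unrecognized {modality} command: {payload!r}")
    return tactic

def haptic_feedback(swarm_status: str) -> str:
    """Encode swarm status as a vibration pattern for a belt worn by the operator."""
    patterns = {"perimeter_formed": "two short pulses",
                "contact_detected": "continuous pulse on the side facing the contact"}
    return patterns.get(swarm_status, "single pulse")

print(dispatch("speech", "Form a perimeter"))   # -> establish_perimeter
print(haptic_feedback("contact_detected"))
```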

Adams noted that China last summer launched a record swarm of 119 fixed-wing unmanned aerial vehicles.

“Right now we don’t have the infrastructure available for testing the capabilities of large swarms,” Adams said. “Advances have been made with indoor systems, including accurately tracking individual swarm members and by using simulations. Those are good first steps but they don’t match what will happen in the real world. Those approaches allow for testing and validation of some system aspects but they don’t allow for full system validation.”

The integrators’ objective is for operators to interact with the swarm as a whole, or subgroups of the swarm, and not individual agents – like a football coach orchestrating his entire offense as it runs a play.

“What the agents do individually is simple; what they do as a whole is much more interesting,” said Adams, likening a drone swarm to a school of fish acting in concert in response to a predator. “We’ve got these ‘primitives’” – basic actions a swarm can execute – “and we’ll map these primitives to algorithms for the individual agents in the swarm, and determine how humans can interact with the swarm based on all of these things. We want to advance and accelerate enabling swarm technologies that focus on swarm autonomy and how humans can interact and team with the swarm.” 
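
A minimal, hypothetical sketch of that idea: one swarm-level primitive, “disperse,” realized by a simple local rule that every agent runs independently, so the group-level behavior emerges without the operator addressing any individual agent. The rule and parameters are illustrative only, not the OFFSET algorithms.

```python
# Hypothetical sketch of a swarm "primitive": every agent runs the same simple
# local rule (move away from neighbors that are too close), and the swarm-level
# behavior (spreading out over an area) emerges on its own. Illustrative only.
import math
import random
from typing import List, Tuple

Vec = Tuple[float, float]

def disperse_step(me: Vec, neighbors: List[Vec],
                  min_spacing: float = 5.0, gain: float = 0.1) -> Vec:
    """One agent's local rule: push away from any neighbor closer than min_spacing."""
    dx, dy = 0.0, 0.0
    for nx, ny in neighbors:
        dist = math.hypot(me[0] - nx, me[1] - ny)
        if 0 < dist < min_spacing:
            dx += (me[0] - nx) / dist
            dy += (me[1] - ny) / dist
    return (me[0] + gain * dx, me[1] + gain * dy)

# Twenty agents start clustered together; the same rule runs for each of them.
agents: List[Vec] = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(20)]
for _ in range(200):
    agents = [disperse_step(a, [b for b in agents if b is not a]) for a in agents]
# After enough steps the swarm has spread out. The operator invoked "disperse";
# no one commanded any individual drone where to go.
```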

 

About the Collaborative Robotics and Intelligent Systems Institute (CoRIS)
The OSU College of Engineering established CoRIS in 2017 to advance the theory and design of robotics and artificial intelligence. The institute is committed to exploring the impact of robotics and AI on individuals and society through its three principal impact areas: academics, research, and deployed systems and policy. It is made up of 25 core faculty researchers and 180 graduate students, plus another 40 collaborators across the university who apply robotics and AI in their work.

 

The content & opinions in this article are the author’s and do not necessarily represent the views of RoboticsTomorrow

Comments (0)

This post does not have any comments. Be the first to leave a comment below.


Post A Comment

You must be logged in before you can post a comment. Login now.

Featured Product

3D Vision: Ensenso B now also available as a mono version!

3D Vision: Ensenso B now also available as a mono version!

This compact 3D camera series combines a very short working distance, a large field of view and a high depth of field - perfect for bin picking applications. With its ability to capture multiple objects over a large area, it can help robots empty containers more efficiently. Now available from IDS Imaging Development Systems. In the color version of the Ensenso B, the stereo system is equipped with two RGB image sensors. This saves additional sensors and reduces installation space and hardware costs. Now, you can also choose your model to be equipped with two 5 MP mono sensors, achieving impressively high spatial precision. With enhanced sharpness and accuracy, you can tackle applications where absolute precision is essential. The great strength of the Ensenso B lies in the very precise detection of objects at close range. It offers a wide field of view and an impressively high depth of field. This means that the area in which an object is in focus is unusually large. At a distance of 30 centimetres between the camera and the object, the Z-accuracy is approx. 0.1 millimetres. The maximum working distance is 2 meters. This 3D camera series complies with protection class IP65/67 and is ideal for use in industrial environments.