CODE Takes Next Steps toward More Sophisticated, Resilient, and Collaborative Unmanned Air Systems

Eight companies participating in Phase 2 research that aims to leverage promising open software architectures and intuitive user interfaces

DARPA's Collaborative Operations in Denied Environment (CODE) program seeks to help the U.S. military's unmanned aircraft systems (UASs) conduct dynamic, long-distance engagements of highly mobile ground and maritime targets in denied or contested electromagnetic airspace, all while reducing required communication bandwidth and cognitive burden on human supervisors. In an important step toward that goal, DARPA recently awarded Phase 2 system integration contracts for CODE to Lockheed Martin Corporation (Orlando, Fla.) and the Raytheon Company (Tucson, Ariz.). Further, the following six companies—all of which had Phase 1 contracts with DARPA to develop supporting technologies for CODE—will collaborate in various ways with the two prime contractors:


Daniel H. Wagner Associates (Hampton, Va.)
Scientific Systems Company, Inc. (Woburn, Mass.)
Smart Information Flow Technologies, LLC (Minneapolis, Minn.)
Soar Technology, Inc. (Ann Arbor, Mich.)
SRI International (Menlo Park, Calif.)
Vencore Labs dba Applied Communication Sciences (Basking Ridge, N.J.)
CODE's main objective is to develop and demonstrate the value of collaborative autonomy, in which UASs could perform sophisticated tasks both individually and in teams under the supervision of a single human mission commander. CODE-equipped UASs would perform their mission by sharing data, negotiating assignments, and synchronizing actions and communications among team members and with the commander. CODE's modular open software architecture on board the UASs would enable multiple CODE-equipped unmanned aircraft to navigate to their destinations and find, track, identify, and engage targets under established rules of engagement. The UASs could also recruit other CODE-equipped UASs from nearby friendly forces to augment their own capabilities and adapt to dynamic situations such as attrition of friendly forces or the emergence of unanticipated threats.
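
To make the "negotiating assignments" idea concrete, the sketch below shows one simple way a team could divide work among itself. It is purely illustrative and not drawn from the CODE software: the Vehicle, Task, and bid names are hypothetical, and the allocation is a basic greedy auction in which each vehicle bids its estimated cost for a task and the lowest bidder takes it.

```python
import math
from dataclasses import dataclass


@dataclass
class Task:
    """A notional target or search point to be serviced (illustrative only)."""
    name: str
    x: float
    y: float


@dataclass
class Vehicle:
    """A notional CODE-equipped UAS; fields are illustrative only."""
    callsign: str
    x: float
    y: float
    assigned: list


def bid(vehicle: Vehicle, task: Task) -> float:
    """A vehicle's bid: straight-line distance plus a penalty for its current workload."""
    return math.hypot(task.x - vehicle.x, task.y - vehicle.y) + 10.0 * len(vehicle.assigned)


def allocate(vehicles: list[Vehicle], tasks: list[Task]) -> None:
    """Greedy auction: each task goes to the vehicle that bids the lowest cost."""
    for task in tasks:
        winner = min(vehicles, key=lambda v: bid(v, task))
        winner.assigned.append(task.name)


if __name__ == "__main__":
    team = [Vehicle("UAS-1", 0, 0, []), Vehicle("UAS-2", 50, 0, []), Vehicle("UAS-3", 0, 50, [])]
    targets = [Task("T1", 5, 5), Task("T2", 45, 10), Task("T3", 10, 60), Task("T4", 40, 40)]
    allocate(team, targets)
    for v in team:
        print(v.callsign, "->", v.assigned)
```

A fielded system would weigh fuel, sensor fit, threat exposure, and rules of engagement in its bids, and would have to run the negotiation over intermittent, low-bandwidth links; the sketch ignores all of that and shows only the basic allocation pattern.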

"During Phase 1, we successfully demonstrated, in simulation, the potential value of collaborative autonomy among UASs at the tactical edge, and worked with our performers to draft transition plans for possible future operational systems," said Jean-Charles Ledé, DARPA program manager. "Between the two teams, we have selected about 20 autonomous behaviors that would greatly increase the mission capabilities of our legacy UASs and enable them to perform complex missions in denied or contested environments in which communications, navigation, and other critical elements of the targeting chain are compromised. We have also made excellent progress in the human-system interface and open-architecture framework."

CODE's prototype human-system interface (HSI) is designed to allow a single person to visualize, supervise, and command a team of unmanned systems in an intuitive manner. Mission commanders can know their team's status and tactical situation, see pre-planned and alternative courses of action, and alter the UASs' activities in real time.

For example, the mission commander could pick certain individual UASs from a team, circle them on the command station display, say "This is Group 1," circle another part of the map, and say "Group 1 search this area." The software then creates a sub-team with the circled UASs, divides up the search task among those assets, and redistributes the original tasks assigned to Group 1 assets to the remaining UASs. This capability significantly simplifies the command and control of large groups of UASs. Other parts of the HSI research focused on how to display the new plan, including its potential impact on other mission objectives; depending on pre-set mission rules, the system either directly executes the plan or waits for the commander's approval to act.

A video showing promising early research into the interface is available below:

YouTube video #2 (UI demonstration): https://youtu.be/o8AFuiO6ZSs
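
As a rough illustration of the sub-teaming behavior described above (not the actual CODE human-system interface), the Python sketch below assumes a hypothetical form_group helper that operates on a plain mapping of vehicle callsigns to task lists: the circled vehicles hand their existing tasks to the rest of the team and split the new search cells among themselves.

```python
from typing import Dict, List


def form_group(team_tasks: Dict[str, List[str]], circled: List[str],
               search_cells: List[str]) -> Dict[str, List[str]]:
    """Notional sketch of the 'Group 1, search this area' workflow."""
    remaining = [v for v in team_tasks if v not in circled]

    # Hand the circled vehicles' existing tasks to the rest of the team, round-robin.
    displaced = [t for v in circled for t in team_tasks[v]]
    if remaining:
        for i, task in enumerate(displaced):
            team_tasks[remaining[i % len(remaining)]].append(task)

    # Clear the new sub-team's task lists and divide the search cells among them.
    for v in circled:
        team_tasks[v] = []
    for i, cell in enumerate(search_cells):
        team_tasks[circled[i % len(circled)]].append(cell)

    return team_tasks


if __name__ == "__main__":
    tasks = {"UAS-1": ["track T1"], "UAS-2": ["relay comms"], "UAS-3": [], "UAS-4": ["track T2"]}
    print(form_group(tasks, circled=["UAS-1", "UAS-4"], search_cells=["cell A", "cell B", "cell C"]))
```

In the actual program the re-planning would also display the impact on other mission objectives and, per the pre-set mission rules, either execute automatically or await the commander's approval; the sketch only shows the bookkeeping behind the grouping gesture.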

The HSI and autonomy algorithms are being developed in open architectures based on emerging standards: the Future Airborne Capability Environment (FACE) and Unmanned Control Segment (UCS) standards used by the U.S. Army and U.S. Navy, and the Open Mission Systems (OMS) and Common Mission Command and Control (CMCC) standards that the U.S. Air Force uses.

During Phase 2, DARPA plans to implement an initial subset of the behaviors within each of the two open architectures and use those architectures to conduct live flight tests with one or two live UASs augmented with several virtual aircraft. If those tests are successful, DARPA could move to Phase 3, in which one team would test the capabilities using up to six live vehicles cooperating among themselves and with additional simulated vehicles. A single person would command the UAS team to perform a complex mission involving target search, identification, and engagement against an active, unpredictable adversary.

CODE seeks to deliver a software system that would be resilient to bandwidth limitations and communications disruptions, yet compatible with existing standards and capable of affordable retrofit into existing platforms. If successfully demonstrated, these scalable, cost-effective capabilities would greatly enhance the survivability, flexibility, and effectiveness of existing air platforms, as well as reduce the development times and costs of future systems.
