The first delivery of the Department of Defense’s Combined Joint All-Domain Command and Control, or CJADC2, initiative could be realized this fall during the twelfth round of the Pentagon’s Global Information Dominance Experiment, according to GIDE Director Col. Matt Strohmeyer.
“Our goal is to deliver 1.0 of CJADC2 this fall during GIDE 12,” said Strohmeyer during a panel discussion moderated by Mattermost CEO and Co-founder Ian Tien at the Potomac Officers Club’s 2024 Air Defense Summit.
To reach that goal, the Chief Digital and Artificial Intelligence Office, or CDAO, is working toward two main strategic objectives: global integration and joint kill chains. Under the first objective, Strohmeyer said CDAO aims to give warfighters a more comprehensive picture of the battlespace.
Strohmeyer described global integration as “the ability for the U.S. combatant commands, joint staff and our coalition partners to be able to digitally collaborate on a crisis response decision so that rather than having a regional focus for how we would respond, or having a single domain — like maybe a logistics view of the response or just having an intel view of the response — having a digitally holistic view of the response, and then allowing us to be able to collaborate not just in one region, but globally on what we could do to be able to rapidly respond to a crisis or rapidly create multiple dilemmas for potential adversaries.”
During a recent GIDE, Strohmeyer said, his team ran a double-blind test in which generative AI accurately summarized what was going on in an operational logistics environment, a capability that could prove useful for warfighters. GIDE 8, conducted in December 2023, notably focused on the global integration strategic objective.
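The article does not say what tooling GIDE used for that test. Purely as a rough sketch of the pattern, the example below shows how structured logistics records might be flattened into a prompt for a generative summarizer. The LogisticsEvent schema and the summarize_with_llm stub are illustrative assumptions, not anything CDAO has described.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class LogisticsEvent:
    """A single record from an operational logistics feed (illustrative schema)."""
    timestamp: str
    unit: str
    asset: str
    status: str      # e.g. "delayed", "in transit", "delivered"
    location: str


def build_summary_prompt(events: List[LogisticsEvent]) -> str:
    """Flatten structured records into a prompt a generative model can summarize."""
    lines = [
        f"{e.timestamp} | {e.unit} | {e.asset} | {e.status} | {e.location}"
        for e in events
    ]
    return (
        "Summarize the current logistics picture in three sentences, "
        "highlighting delays and shortfalls:\n" + "\n".join(lines)
    )


def summarize_with_llm(prompt: str) -> str:
    """Placeholder for whatever generative model an organization actually fields."""
    raise NotImplementedError("wire this to an approved, accredited model endpoint")


if __name__ == "__main__":
    events = [
        LogisticsEvent("2024-07-01T06:00Z", "21st TSC", "fuel convoy", "delayed", "Route GOLD"),
        LogisticsEvent("2024-07-01T07:30Z", "618th AOC", "C-17 airlift", "in transit", "Ramstein"),
    ]
    print(build_summary_prompt(events))
```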
CDAO’s second strategic objective, joint kill chains, involves creating the capability to digitally — and to some extent, automatically — close offensive and defensive kill chains.
“We found that the first step is really just getting the data right, getting the workflow right and then applying some algorithms to it. Eventually there’s areas where we think we might be able to apply AI — smartly, with humans cognizant over the decisions that are being made — to allow us to be able to close those kill chains better and faster,” Strohmeyer said.
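Strohmeyer did not describe CDAO’s actual implementation. The following is a minimal sketch of the pattern he outlines, assuming a simplified step sequence and a human approval gate before any engagement decision; the step names, the KillChainStep class and the approve callback are all hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional


@dataclass
class KillChainStep:
    name: str
    action: Callable[[dict], dict]   # transforms the working track/target data
    requires_human: bool = False     # gate automation behind an operator decision


def run_chain(steps: List[KillChainStep], data: dict,
              approve: Callable[[str, dict], bool]) -> Optional[dict]:
    """Advance the chain step by step; pause for an operator wherever required."""
    for step in steps:
        if step.requires_human and not approve(step.name, data):
            print(f"Chain halted at '{step.name}': operator did not approve.")
            return None
        data = step.action(data)
        print(f"Completed step: {step.name}")
    return data


if __name__ == "__main__":
    # Illustrative stages only; real kill chains involve far more than this.
    steps = [
        KillChainStep("detect", lambda d: {**d, "detected": True}),
        KillChainStep("track", lambda d: {**d, "track_quality": 0.9}),
        KillChainStep("target", lambda d: {**d, "solution": "ready"}),
        KillChainStep("engage", lambda d: {**d, "engaged": True}, requires_human=True),
    ]
    # A trivial console prompt stands in for a real decision-support interface.
    result = run_chain(
        steps,
        {"track_id": "T-1001"},
        approve=lambda name, d: input(f"Approve '{name}'? [y/N] ").lower() == "y",
    )
    print(result)
```

The point of the sketch is the shape of the workflow: the data and routine steps move automatically, while any step marked as requiring a human stops and waits, which matches Strohmeyer’s emphasis on keeping humans cognizant of the decisions being made.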
Other AI Use Cases in the DOD
The Air Force is using AI to move human operators off telephone lines and onto ChatOps, saving time and resources while freeing service members to focus on more important tasks and decisions.
Lt. Col. Timothy Heaton of the U.S. Air Force Air Mobility Command’s 618th Air Operations Center said AMC is iteratively deploying AI and discovering use cases that could be helpful for decision-makers.
“We would execute 60-100 sorties every single day of the year at AMC, and during a crisis, that can go up exponentially. With that comes lots of chats — a sea of chats,” said Heaton. “What we’re running to use in an early rollout of the AI is, what are the trending topics? What should the commanders be interested in right now? Then if we have all those trending topics segmented out, what are some summaries of those so that way we can start to cue the operator to look over here and not there.”
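Heaton did not detail how AMC’s early rollout works under the hood. Purely to illustrate the pattern he describes, surfacing trending topics, segmenting the chats and then cueing operators with short summaries, the sketch below uses simple keyword counting and bucketing from the Python standard library; the stopword list, topic logic and sample chats are invented for the example.

```python
import re
from collections import Counter, defaultdict
from typing import Dict, List

STOPWORDS = {"the", "a", "an", "to", "of", "and", "is", "for", "on", "at", "in"}


def trending_topics(messages: List[str], top_n: int = 3) -> List[str]:
    """Rank frequently used keywords as a rough proxy for trending topics."""
    counts = Counter(
        word
        for msg in messages
        for word in re.findall(r"[a-z0-9-]+", msg.lower())
        if word not in STOPWORDS and len(word) > 2
    )
    return [word for word, _ in counts.most_common(top_n)]


def group_by_topic(messages: List[str], topics: List[str]) -> Dict[str, List[str]]:
    """Bucket each chat message under the first trending topic it mentions."""
    buckets: Dict[str, List[str]] = defaultdict(list)
    for msg in messages:
        for topic in topics:
            if topic in msg.lower():
                buckets[topic].append(msg)
                break
    return buckets


if __name__ == "__main__":
    chats = [
        "Tanker out of Ramstein delayed 2 hours, crew rest issue",
        "Ramstein ramp congestion, need updated parking plan",
        "Weather hold at Travis, forecast improving by 1800Z",
        "Travis weather still marginal for the evening launch",
    ]
    topics = trending_topics(chats)
    for topic, msgs in group_by_topic(chats, topics).items():
        # A crude 'summary': the first message in each bucket cues the operator.
        print(f"{topic}: {len(msgs)} messages, e.g. {msgs[0]}")
```

A fielded system would likely replace the keyword counting with a proper topic model or a generative summarizer, but the division of labor is the same: rank what is trending, group the chatter, and point the operator at the buckets that matter.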
AI Hurdles and How to Overcome Them
AI is still nascent in the federal landscape, particularly newer developments such as generative AI, and government leaders are working to overcome several challenges in order to deploy the technology effectively.
“Trust is one of the biggest barriers for AI adoption,” said Leslie Shing, technical staff member in the AI technology and systems group at MIT Lincoln Laboratory. “I think mostly because a lot of this is black box, not really understanding what the underlying mechanisms are, what the inputs and outputs are, what are the main goals in this particular technology.”
There are a few ways to mitigate the trust issue, Shing explained, and the first step is education and training. Government stakeholders need to be informed about the inputs, outputs, transformations and risks associated with AI in order for the technology to really take hold and be trusted in the federal landscape.
Another aspect that needs more consideration is the impact of AI on underlying operations and workflows.
“You can’t just throw the technology over the fence and expect integration immediately,” Shing said. “You have to kind of take baby steps in the integration.”
Open architectures are also critical in making sure AI — and especially the software associated with AI — can be integrated effectively into an organization.
Amar Tappouni, a principal/director with Booz Allen Hamilton’s aerospace account, said, “Open architecture allows replacement of software versus continuous modernization over a long lifecycle. Obviously software is going to have a tail of maintenance to it, but being very specific about the architectural needs and being able to pull out specific capability and replace in the future, I would say should be part of the strategy.”
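Tappouni’s point is about architecture rather than any particular codebase. As a generic sketch, the example below shows the kind of interface-based seam that lets one capability implementation be pulled out and replaced without touching the application that calls it; the TrackFusionCapability interface and both fuser classes are made up for illustration.

```python
from abc import ABC, abstractmethod


class TrackFusionCapability(ABC):
    """Stable contract the rest of the system programs against."""

    @abstractmethod
    def fuse(self, tracks: list[dict]) -> dict: ...


class LegacyFuser(TrackFusionCapability):
    def fuse(self, tracks: list[dict]) -> dict:
        # Original implementation, kept behind the shared interface.
        return {"source": "legacy", "count": len(tracks)}


class ModernFuser(TrackFusionCapability):
    def fuse(self, tracks: list[dict]) -> dict:
        # Drop-in replacement: new implementation, unchanged contract.
        return {"source": "modern", "count": len(tracks), "confidence": 0.95}


def mission_app(fuser: TrackFusionCapability, tracks: list[dict]) -> dict:
    """The application depends only on the interface, never on a vendor class."""
    return fuser.fuse(tracks)


if __name__ == "__main__":
    tracks = [{"id": 1}, {"id": 2}]
    print(mission_app(LegacyFuser(), tracks))   # today's capability
    print(mission_app(ModernFuser(), tracks))   # swapped in later, no app changes
```

Because the application codes to the interface rather than to a specific implementation, the "specific capability" Tappouni describes can be pulled out and replaced over the lifecycle without a wholesale modernization effort.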
Want to hear more about AI in the government? Don’t miss the Potomac Officers Club’s 2024 Navy Summit on Aug. 15! The event features two panel discussions on AI: “Naval Autonomy: Way Ahead,” and “Harnessing AI for Mission Impact.” Join the event to learn more.