Potential AI Requirements in the Fiscal 2024 Draft House National Defense Authorization Act
Published: July 26, 2023
Federal Market Analysis | USAF | Artificial Intelligence/Machine Learning | DEFENSE | National Defense Authorization Act | Policy and Legislation
The draft legislation contains AI-related provisions that could result in contracts.
Committees in the U.S. Senate and House of Representatives are busy with the annual process of considering amendments to the current 1,242-page draft of H.R. 2670, the National Defense Authorization Act (NDAA) for Fiscal Year 2024. While certain politically charged amendments have grabbed headlines in recent weeks, the bill also contains several provisions related to artificial intelligence/machine learning (AI/ML).
Today’s post looks at two AI/ML-related provisions in the draft bill. There is no guarantee they will make it into the final legislation, but the odds are good enough that they bear examination, and, if enacted, they could generate a couple of business opportunities.
Section 220. Process to Ensure the Responsible Development and Use of Artificial Intelligence
Concerns about the ethical development of AI/ML, as well as its application to a broad range of subjects, are top of mind across the global technology sector. Congress has apparently noticed this and is trying to get ahead of rapidly developing AI/ML capabilities.
Section 220 requires the Department of Defense’s Chief Digital and Artificial Intelligence Officer (CDAO) to develop and implement a process for assessing whether an AI solution used by the DOD is functioning responsibly. The CDAO is required to report on and remediate any AI/ML solution that is determined not to be functioning responsibly, and to discontinue use of the solution if the flaws in it cannot be remediated quickly.
Implications: Given the complexity of AI/ML capabilities, the CDAO’s process is likely to require an automated solution that enables one AI application to police others. Elon Musk mentioned at the start of the ChatGPT hubbub that he had started his AI company with the intent of ensuring the AI it developed acted ethically, meaning it always told “the truth” and could be used to identify mis- and disinformation. To ensure that AI solutions used by the DOD meet responsibility criteria, the CDAO will therefore likely need to contract with industry for an “AI cop” of sorts to police the operation of AI/ML capabilities in use across the department. The job is simply too big and too complex for a human team to accomplish.
Section 345. Pilot Program on Optimization of Aerial Refueling and Fuel Management in Contested Logistics Environments through Use of Artificial Intelligence
This provision charges the CDAO with collaborating with the Under Secretary of Defense for Acquisition and Sustainment and the Chief of Staff of the Air Force on a pilot program to optimize the logistics of aerial refueling and fuel management in contested logistics environments. The prototype program must:
- Evaluate the interoperability and compatibility of AI-enabled systems with the DOD’s existing logistics infrastructure.
- Study ways to enhance situational awareness and decision-making capabilities through real-time data analysis and predictive modeling.
- Address potential challenges and risks associated with integrating AI into refueling processes, including cybersecurity vulnerabilities.
Implications: This provision will almost certainly result in the award of an Other Transaction Agreement for an AI-enhanced fuel management system. The provision’s focus on contested logistics environments will also introduce tricky electronic warfare considerations.