Challenges in NASA’s Governance of AI Programs
Published: May 10, 2023
Federal Market Analysis | Artificial Intelligence/Machine Learning | NASA | Spending Trends
According to NASA’s IG, the lack of a unified AI definition jeopardizes the agency’s progress in implementing AI/ML.
NASA has long been a trailblazer in innovative technologies, particularly when it comes to artificial intelligence and machine learning (AI/ML). The agency uses AI/ML in a range of applications, from low Earth orbit weather models to deep space terrain mapping of landing sites.
In fact, NASA is a federal leader in AI, with more published AI use cases than any other federal agency, according to a recent Inspector General (IG) report. Deltek’s analysis of AI contract obligations from FY 2020 to FY 2022 also found that NASA led civilian agencies in total AI spending, with $233M over the three-year period.
(Chart: NASA AI contract obligations, FY 2020-2022. Sources: FPDS, Deltek)
Additional Observations:
- Spending peaked in FY 2021 due to an increase in task orders at the Johnson Space Center for machine learning to advance risk precursor identification tools in the commercial airline terminal area
- The Mission Support Directorate led AI spending at NASA with $102M, followed by the Johnson Space Center ($43M) and the Jet Propulsion Laboratory ($43M)
- The majority of NASA’s AI spending from FY 2020-2022 falls into the R&D category, which totaled $218M over the three-year period
- Sample NASA AI-related obligations from FY 2020-2022 include:
  - Task orders from the Intelligent Systems Research and Development Support – 3 contract
  - Cooperative Autonomous Distributed Robotic Explorers Tech Demonstration
  - Autonomous Rotorcraft Flight Technology Demonstration for Mars 2020 mission
History of NASA AI Governance
Given the agency’s leadership in AI, NASA’s IG reviewed the agency’s progress in AI governance and its approach to the risks of the technology.
Since 2018, NASA has included AI/ML as a key pillar in its digital transformation efforts. In April 2020, NASA created the role of Artificial Intelligence Machine Learning (AIML) Transformation Lead, responsible for implementing the requirements of federal AI-related executive orders (e.g., EO 13960, Promoting the Use of Trustworthy Artificial Intelligence in the Federal Government), including the development of an AI use case inventory at NASA. The AIML Transformation Lead also helped author NASA’s Framework for the Ethical Use of AI in April 2021. In November 2021, NASA established the Office of Technology, Policy, and Strategy (OTPS) to drive data- and evidence-based recommendations in agency strategic planning. OTPS also created NASA’s Responsible AI Plan.
Challenges
Upon review of NASA’s AI governance framework, the IG found that the agency lacked a unified definition of AI across its programs. The IG uncovered three separate definitions of AI: one in the NASA Framework for the Ethical Use of AI, one in NASA’s Responsible AI Plan, and one on NASA’s AIML SharePoint collaboration website. Though the three definitions are similar, the agency watchdog warned that the absence of a single, identical AI definition could hinder NASA’s AI efforts.
For example, separate definitions may result in duplicate reporting of AI-related hardware and software at the agency. The lack of a thorough inventory would impair NASA’s “ability to effectively monitor and respond to AI specific cybersecurity threats or vulnerabilities,” according to the IG report.
Moreover, implementing future federal AI cybersecurity controls will be more difficult without a singular classification for AI within NASA’s system of record.
Recommendations and Responses:
The IG provided four recommendations in response to its findings. NASA management largely concurred with the recommendations.
1. Establish a standardized definition for AI within the Agency, to include harmonizing the definitions in the NASA Framework for the Ethical Use of Artificial Intelligence (AI), NASA’s Responsible AI Plan, and NASA AIML SharePoint.
Management Response: Partially concur. NASA will develop a unified definition when NIST releases standard definitions and incorporate it into NASA AI policy by April 30, 2024.
2. Ensure the standardized AI definition is used to identify, update, and maintain the Agency’s AI use case inventory.
Management Response: Concur. Once the standardized definition is in place, it will be incorporated into NASA policy by July 31, 2024.
3. Identify a classification mechanism to assist in the rapid application of federal requirements for cybersecurity controls and monitoring practices.
Management Response: Concur. NASA will further refine classification/identification processes to respond to bugs, issues, or risks in AI by July 31, 2023.
4. Develop a method to track budgets and expenditures for AI use case inventory.
Management Response: Partially concur. As the federal government and NASA refine definitional issues, NASA will continue to evolve cost reporting accordingly.