In July this year, the U.S. House of Representatives passed the 2023 National Defense Authorization Act (NDAA), authorizing $839 billion in defense spending for the 2023 fiscal year, which begins on October 1, 2022.
As part of this approval, the House Committee on Armed Services placed particular emphasis on emerging technologies and on supporting data-driven practices across the U.S. Department of Defense (DoD). The House instructed the DoD to incorporate a standardized, independent testing and validation process into the lifecycle of AI-enabled models, systems, and applications.
As the committee states: “The committee commends the [DoD] for its progress in working to integrate [AI] into major weapons platforms. The committee, however, is concerned by the low number of AI models that are developed and fielded in operational environments, which hinders the Department’s ability to harness the power of AI.” The legislation then directs the Secretary of Defense to provide a briefing to the committee before January 1, 2023, on the DoD’s efforts to implement this independent testing and validation process.
While the need for testing and validation of AI systems has appeared in various strategies and roadmaps around the world, the language in the FY23 NDAA is notable for identifying 'independence' as a key aspect of effective, responsible AI deployment. It acknowledges that as more AI systems are developed by the U.S. government and by third parties, independent testing and validation will be critical to ensuring their robustness and security.
Read more on how independent testing and validation supports the DoD’s responsible AI strategy.