Moxley Press Politics

A department that could not name its own machines: the document trail behind the USDA inspector general’s AI cybersecurity audit

This week the Agriculture Department’s internal watchdog reported that the agency cannot fully account for the artificial-intelligence systems running on its own networks, has no department-wide pre-deployment review process, and has not implemented the high-impact AI controls required by an Office of Management and Budget memorandum issued in February 2025. What follows traces the audit, the document chain behind it, and what the agency has, and has not, said in response.

An overhead architectural rendering of an empty agency conference room in cool steel-blue and pale verdigris light. A long laminated table runs the length of the frame; an organisational chart lies face-up on it with most of its boxes empty, connected by faint dashed lines. A closed three-ring binder, stamped with an unreadable reference number, sits beside a closed laptop. No people, no readable text.
The inspector general’s evaluation found four governance gaps: no department-wide high-impact AI controls, no updated OMB-compliant policies, no current inventory process, and no pre-deployment risk review before AI tools reach the USDA network. · Illustration · generated by xAI grok-imagine-image-quality

The Office of the Inspector General at the Department of Agriculture published an inspection report on May 12 finding that the department has not fully implemented the cybersecurity and governance controls the federal government now requires for artificial-intelligence systems, and that it cannot, at present, produce a current inventory of the AI tools running on its own networks. The report, numbered 50801-0018-12 and titled “Cybersecurity of Artificial Intelligence Technology at USDA,” is the first agency-wide evaluation of how a cabinet department is performing against the high-impact AI control regime created by Office of Management and Budget Memorandum M-25-21, signed in February 2025. Its four recommendations remain open. The Department has agreed to all four; the report does not specify when implementation will be complete.

The audit sits in a particular gap. AI governance at the federal level has, since the second Trump administration, been organised around two OMB documents: M-25-21, “Accelerating Federal Use of AI through Innovation, Governance, and Public Trust,” signed in February 2025, and its acquisition companion M-25-22. Both reference a category called high-impact AI, defined by M-25-21 as AI whose output “serves as a principal basis for decisions or actions that have a legal, material, binding, or significant effect on rights or safety.” For systems in that category, the memorandum requires minimum risk-management practices: pre-deployment testing, an independent assessment where appropriate, Chief AI Officer sign-off before launch, a documented ability to pause or discontinue the system, and a process for affected individuals to seek human review. Agencies were directed to report compliance against these practices to OMB by September 22, 2026. The Agriculture Department’s September deadline is now four months away.

Four gaps, in the inspector general’s own words

The inspector general’s report lists four open recommendations. Read together, they describe a department that has built an AI strategy but not the operational scaffolding underneath it. The first recommendation directs USDA to establish department-wide controls that require a high-impact AI assessment, per OMB standards, for any system that meets the M-25-21 definition. The audit notes the absence of such controls at the time of fieldwork. The second directs USDA to update its policies and procedures to incorporate AI in compliance with the M-25-21 requirements. The third directs the department to “create an ongoing process to maintain a current USDA AI inventory.” The fourth directs the department to develop a pre-deployment process, including risk assessments, system authorisation, and impact analysis, before AI technologies are permitted to access the USDA network.

The third and fourth recommendations are the most consequential. The third is, in effect, a finding that the department’s public inventory of AI use cases, last updated in January 2026 and posted at usda.gov/ai/inventory as required by Executive Order 13960, is not the product of an ongoing tracking system but a point-in-time snapshot. Without a continuous inventory, the department cannot answer the predicate question for every subsequent control: what AI systems do we have? The fourth recommendation is a finding that AI tools have, until now, been able to reach the USDA network without a documented pre-deployment risk assessment. That is the gap an attacker would exploit, and the gap a Chief AI Officer cannot close by policy memo alone.

The strategy without the controls

The department has not been silent on AI. In early 2026 it published a Fiscal Year 2025–2026 AI Strategy, announced on February 26, committing USDA to a five-objective framework of governance, workforce, infrastructure, data, and ethical implementation, aligned to the National Institute of Standards and Technology AI Risk Management Framework and informed by feedback from more than 200 outside stakeholders. The strategy named a Chief AI and Data Officer, Chris Alvares, sitting inside the Office of the Chief Information Officer, with delegated authority over the M-25-21 compliance plan, the department’s formal response to the memorandum. The Information Security Committee, the department’s internal governance forum, lists artificial intelligence and robotic process automation among its four standing priorities. None of these documents is what the inspector general’s report finds missing. What is missing, in the audit’s account, is the operational layer beneath them: the inventory process, the pre-deployment review, the binding department-wide control, the updated procedure.

The departmental context is a factor the report does not fully treat. The long-serving Chief Information Officer, Gary S. Washington, moved to a brief intermediate post as Chief Innovation Officer in September 2025, when his successor as CIO, Sam Berry, was installed, and left the agency entirely in November 2025 after eight years in the CIO role. The audit fieldwork that produced the May 12 report spans, in part, that transition period. Whether the gaps the audit identifies are the artefact of an interregnum or a longer-standing condition is a question the report does not, on its public face, resolve.

What the inventory does, and does not, say

USDA has not fully implemented cybersecurity and governance controls within Artificial Intelligence systems in compliance with federal standards, leaving the agency at risk of data breaches or reputational harm. — USDA Office of Inspector General · report 50801-0018-12, Cybersecurity of Artificial Intelligence Technology at USDA, May 12, 2026

The Department of Agriculture inventory of AI use cases, published in the federal open-data catalog and cross-listed on the department’s public AI pages, lists the department’s declared AI use cases, with a short narrative description for each. The inventory is the artefact most readers can touch, and it has been updated in line with Executive Order 13960 and M-25-21. What it does not contain, and what the audit finds is not produced by any internal process, is a current, continuously maintained record that can be cross-referenced against the M-25-21 high-impact category. The CSV is a list; the audit asks for a system. The difference matters because high-impact designation triggers everything else: the independent testing, the pre-deployment authorisation, the Chief AI Officer approval, the discontinuation pathway. Without a process that flags new and changed use cases in near-real time and walks them through the high-impact screen, the September 22 OMB reporting deadline will be met, on the present record, by a snapshot rather than a system.
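The list-versus-system distinction can be sketched in a few lines of Python. The column names and rows below are hypothetical, not USDA’s actual inventory schema, and the screen is a deliberately simplified stand-in for the M-25-21 high-impact test; the point is only the structural difference between running the screen once and running it on every change.

```python
import csv
import io

# Hypothetical inventory snapshot; the columns are illustrative,
# not the schema of USDA's published CSV.
SNAPSHOT = """use_case,principal_basis_for_rights_or_safety_decision
Loan eligibility scoring,yes
Crop imagery triage,no
"""

def ingest(csv_text):
    """Parse an inventory export into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def high_impact(rows):
    """Simplified stand-in for the M-25-21 screen: flag use cases whose
    output is a principal basis for decisions affecting rights or safety."""
    return [r["use_case"] for r in rows
            if r["principal_basis_for_rights_or_safety_decision"] == "yes"]

# A snapshot answers the predicate question once, at publication time.
flagged = high_impact(ingest(SNAPSHOT))
print(flagged)  # ['Loan eligibility scoring']

# A system re-runs the screen whenever the inventory changes, so a new or
# modified use case triggers review before it reaches the network.
def on_inventory_change(csv_text, start_review):
    for use_case in high_impact(ingest(csv_text)):
        start_review(use_case)  # pre-deployment testing, CAIO sign-off, etc.
```

The toy makes the audit’s point concrete: the published CSV is the `SNAPSHOT` constant, while the recommendation asks for something shaped like `on_inventory_change`, a hook that fires the screen every time the inventory moves.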

What the department has agreed to

The Department concurred with all four recommendations in its written response to the audit, a response that, per the inspector general’s standard practice, accompanies the published report. The response does not, in the public version of the document, commit to specific implementation dates for any of the four. Under the OMB and inspector-general framework, open recommendations remain on the agency’s public ledger until the inspector general accepts the department’s evidence of closure; there is no statutory deadline by which the agency must close any single recommendation. The September 22 OMB reporting requirement is the only hard external date on the calendar, and that requirement is about reporting compliance, not about closing the inspector general’s recommendations.

The Office of the Inspector General’s public affairs office was contacted by The Moxley Press on the morning of May 12, hours after the report’s release, with three questions: the duration of the audit fieldwork, whether the inspector general’s office has visibility into the September 22 OMB submission USDA will make, and whether any of the four recommendations were closed between the close of fieldwork and the report’s publication. A spokesperson responded the same day, pointed to the published report, and declined to comment on the timeline of the audit or on inter-agency communications. The Department’s Office of Communications was contacted on the same morning with four questions: a target date for closing each of the four open recommendations, whether the Department has produced a draft of the September 22 OMB submission, whether any AI use cases on the public inventory currently meet the M-25-21 high-impact definition, and whether the Chief AI and Data Officer has invoked the M-25-21 pause-or-discontinue authority on any use case. A response had not been received by publication. Mr. Alvares, the Chief AI and Data Officer, was contacted at his published Department address; a response had not been received by publication.

What the audit does not, and cannot, do

The inspector general is an audit office, not an enforcement body. The report’s recommendations have the force of recommendation, not order. The office cannot, on its own authority, halt the use of any AI tool at the Department, cannot direct the Chief AI and Data Officer to invoke the M-25-21 discontinuation pathway on any specific use case, and cannot extend or modify the September 22 OMB reporting deadline. What it can do, and has done, is produce a public document, drawn from internal records, that lists the gaps and dates them. The next document in the chain is the Department’s September 22 OMB submission. Whether that submission, when filed, narrows the gap between the inspector general’s findings and the M-25-21 framework will be measurable on its face. The submission, on the present rule set, is required to be publicly available.

What is unresolved

Several questions are open. The Department has not stated publicly how many of the use cases on its January 2026 public inventory currently meet the M-25-21 high-impact definition; the audit does not produce that count either. The Department has not stated publicly whether any AI use case has been paused, discontinued, or denied authorisation under the M-25-21 framework; the audit does not surface such an instance. The September 22 OMB submission, the central downstream document in this chain, has not yet been filed, and the Department has not, on the public record, committed to publishing the draft. Whether the inspector general’s office will conduct a follow-on evaluation after September 22, comparing the Department’s reported compliance against the gaps catalogued in the May 12 report, is a question the May 12 document does not address.

The statute behind the high-impact framework is not, in this case, statute at all. M-25-21 is an executive-branch memorandum, and the obligations it creates run from the Office of Management and Budget to the Chief AI Officer at each agency. Congress has not, at the time of writing, enacted a federal AI cybersecurity statute that would harden these requirements into law or expose non-compliance to civil enforcement. The inspector general’s May 12 report is, in that sense, the only formal audit instrument currently available to test agency compliance against the framework. What it found at the Agriculture Department, on the public record, is a strategy without the controls underneath it, an inventory without the process behind it, and a network whose AI tools, until the recommendations close, can still arrive without a documented risk assessment.

Corrections
No corrections have been issued for this article. Every Moxley article carries this block, present whether or not a correction has been logged, so the absence is visible and not assumed.
Sources & methods
  1. USDA Office of Inspector General · Report 50801-0018-12, “Cybersecurity of Artificial Intelligence Technology at USDA,” May 12, 2026. The source document for the key finding, the four open recommendations, and the Department’s concurrence. · archived May 16, 2026
  2. Office of Management and Budget · Memorandum M-25-21, “Accelerating Federal Use of AI through Innovation, Governance, and Public Trust,” February 2025. The source of the high-impact AI definition, the minimum risk-management practices, the September 22, 2026 reporting deadline, and the Chief AI Officer authorities. · archived May 16, 2026
  3. data.gov · Department of Agriculture inventory of AI use cases, the federal open-data catalog entry for the Department’s public AI use-case list and CSV companion; the basis of this article’s “snapshot rather than a system” comparison. · archived May 16, 2026
  4. BABL AI · independent coverage of the USDA Fiscal Year 2025–2026 AI Strategy, including the five-objective framework, the NIST AI Risk Management Framework alignment, and the stakeholder-feedback process. · archived May 16, 2026
  5. USDA Office of Inspector General · public landing page, source for the May 2026 press release record and the office’s standard practice on report publication. · archived May 16, 2026
  6. MeriTalk · November 2025 reporting on the departure of long-serving USDA Chief Information Officer Gary S. Washington and the leadership transition behind the audit fieldwork window. · archived May 16, 2026
  7. Federal News Network · September 2025 reporting on the USDA CIO succession, with Sam Berry installed as CIO and Gary Washington moving to a Chief Innovation Officer role before his subsequent departure from the Department. · archived May 16, 2026
  8. Government Executive · November 2025 reporting on the broader executive-branch context of the USDA CIO transition. · archived May 16, 2026

This investigation is built from the public audit and memorandum record. The primary document is USDA Office of Inspector General Report 50801-0018-12, “Cybersecurity of Artificial Intelligence Technology at USDA,” dated May 12, 2026, including its four open recommendations and the Department’s written concurrence. The governing framework is Office of Management and Budget Memorandum M-25-21, issued February 2025, the source of the high-impact AI definition, the minimum risk-management practices, the Chief AI Officer authorities, and the September 22, 2026 reporting deadline. The Department’s governance posture was drawn from the data.gov catalog entry for the USDA AI use-case inventory and from independent coverage at BABL AI of the Department’s Fiscal Year 2025–2026 AI Strategy, announced February 26, 2026. The leadership-transition context, the Gary Washington and Sam Berry succession at the Office of the Chief Information Officer, was drawn from contemporaneous reporting at MeriTalk, Federal News Network, and Government Executive, all published in 2025. The Moxley Press contacted the Office of the Inspector General’s public affairs office, the Department’s Office of Communications, and the Chief AI and Data Officer’s published Department address on the morning of May 12, hours after the report’s release; the inspector general’s office responded the same day and declined to comment beyond the published report; the Department and the Chief AI and Data Officer had not responded by publication. No anonymous sources were used. All cited URLs are linked in the Sources block above and snapshotted to the Moxley Press archive at publication.