Procurement Cannot Carry the Weight of Military AI Governance
In her March 10, 2026, Lawfare essay, Jessica Tillipman argues that the United States is increasingly relying on procurement instruments and vendor-specific agreements to govern military AI use, even though procurement was not designed to answer foundational questions about surveillance, targeting, accountability, and the lawful limits of state power. She uses recent disputes involving Anthropic, OpenAI, and the Pentagon to show why “regulation by contract” is too narrow, too contingent, and too fragile to function as a durable public-law framework for military AI.
Tillipman’s central insight is that bilateral contracting is being asked to perform a constitutional and institutional role that it cannot reliably fulfill. As she explains, the rules governing military AI are increasingly emerging not from statutes or regulations of general applicability, but from negotiated terms between the government and individual vendors. That approach may offer speed and flexibility, but it lacks the democratic accountability, transparency, and durability associated with legislation and formal rulemaking. It also binds only the parties to a given agreement, which means governance can become fragmented across vendors, platforms, and contracting pathways.
The article situates this problem in a recent policy confrontation. Tillipman explains that a January 2026 Pentagon AI strategy memo pushed an “any lawful use” approach, directing the Department of Defense to remove vendor restrictions not required by law and to use models free from policy constraints that might limit lawful military applications. In that environment, earlier vendor-imposed red lines on matters such as mass domestic surveillance or fully autonomous weapons became points of conflict rather than baseline safeguards. The result, she suggests, is a bargaining structure in which oversight itself must be negotiated case by case, rather than supplied by a stable external governance regime.
A particularly important part of Tillipman’s analysis concerns enforceability. Even where contracts appear to include guardrails, those provisions may do less than they seem. She contrasts a model in which a vendor seeks explicit contractual prohibitions with one in which restrictions are framed only by reference to existing law and government policy. In the latter model, interpretive authority largely remains with the government. That matters because the practical question is not whether a clause sounds reassuring in public, but who decides what it means in real time and what remedy exists if disagreement arises. Her answer is sobering: in federal procurement, remedies are usually reactive, delayed, and often monetary, which makes them poorly suited to preventing contested AI uses before they occur.
Tillipman therefore concludes that contract language, standing alone, cannot substitute for public law. In her view, issues such as domestic surveillance, lethal targeting, and intelligence oversight should not depend on hurried negotiations, opaque agreement structures, or terms later revised under public pressure. For federal government contractors, the article is significant because it highlights a widening gap between acquisition speed and governance maturity. Contractors operating in the AI space may find themselves pressured to absorb policy ambiguity through customized deal terms, technical guardrails, or layered subcontracting structures. Tillipman’s larger warning is that procurement can allocate risk and define performance, but it cannot legitimately resolve society’s deepest questions about how military AI should be governed. Those answers, she contends, must come from Congress, courts, and durable public institutions rather than from contract alone.
Disclaimer:
This summary is provided for informational and educational purposes only. It does not constitute legal advice and should not be relied upon as a substitute for legal counsel. Readers should review the original article by Jessica Tillipman and consult qualified counsel regarding specific procurement, AI governance, or national security matters.