Pentagon’s ‘Attempt to Cripple’ Anthropic Is Troubling, Judge Says

U.S. District Judge Rita Lynn said during a court hearing on Tuesday that it appeared the Defense Department was illegally punishing Anthropic for trying to restrict the military’s use of its AI tools.

“It feels like an effort to cripple Anthropic,” Lynn said of the Pentagon designating the company a supply-chain risk. “It appears that Anthropic is being punished for trying to bring public scrutiny to this contract dispute, which would almost certainly be a violation of the First Amendment.”

Anthropic has filed two federal lawsuits alleging that the Trump administration’s decision to designate the company as a security risk constitutes illegal retaliation. The government slapped the label on Anthropic after the company insisted on limitations on how the military could use its AI. Tuesday’s hearing took place in a case filed in San Francisco.

Anthropic is seeking a temporary order to halt the designation, relief the company hopes will convince some of its most skittish customers to stick with it a little longer. Lynn may issue a stay only if she determines that Anthropic is likely to win the overall case. Her decision on the injunction is expected in the next few days.

The controversy has sparked broader public conversation about how artificial intelligence is increasingly being used by the armed forces, and whether Silicon Valley companies should defer to the government in determining how to deploy the technology they develop.

The Defense Department, which now calls itself the Department of War (DoW), has argued that it followed procedures and appropriately determined that Anthropic’s AI tools could no longer be trusted to work as expected during critical moments. It asked Lynn not to second-guess its assessment of the national security threat posed by Anthropic.

“The concern is that Anthropic, instead of just raising concerns and pushing back, will say we have a problem with what the DoW is doing and we will manipulate the software … so it doesn’t work the way the DoW expects and wants,” Eric Hamilton, a Trump administration lawyer, said during Tuesday’s hearing.

Lynn said it was Defense Secretary Pete Hegseth’s role, not hers, to decide whether Anthropic was a suitable vendor for the department. But she said it was up to her to determine whether Hegseth violated the law by taking steps beyond canceling Anthropic’s government contracts. Lynn said it was “troubling” to her that the security designation and directives more broadly limiting the use of Anthropic’s AI tool Claude by government contractors “do not seem consistent with perceived national security concerns.”

As Anthropic’s dispute with the government escalated last month, Hegseth posted on Twitter that “effective immediately, any contractor, supplier, or partner that does business with the United States military may not conduct any business activity with Anthropic.”

But on Tuesday, Hamilton acknowledged that Hegseth has no legal authority to prevent military contractors from using Anthropic for work unrelated to the Defense Department. When Lynn asked why Hegseth would have posted such a thing, Hamilton said, “I don’t know.”

Lynn further questioned Hamilton on whether the Pentagon had considered less punitive measures to distance the department from Anthropic’s products. She described the supply-chain-risk designation as a powerful authority typically reserved for foreign adversaries, terrorists and other hostile actors.

Michael Mongan, a WilmerHale attorney representing Anthropic, said it was extraordinary for the government to go after a “stubborn” negotiating partner with the designation.

The Pentagon has said it is working to replace Anthropic’s technology with alternatives from Google, OpenAI and xAI in the coming months. It also said it had taken measures to prevent Anthropic from engaging in any tampering during the transition. Hamilton said he did not know whether it was possible for Anthropic to update its AI models without Pentagon permission; the company says it is not.

In the second case, before a federal appeals court in Washington, DC, a decision is expected soon without a hearing.
