OpenAI Backtracks on Pentagon Deal Amid AI Surveillance Fears

by Daniel Perez - News Editor

OpenAI has amended its recently signed contract with the U.S. Department of War (DoW) after CEO Sam Altman acknowledged the original announcement “looked opportunistic and sloppy.” The move comes in response to growing fears that the company’s technology could be used for domestic mass surveillance, a concern that prompted a swift backlash from employees and the public.

Deal Sparks Controversy

The agreement, reached on Friday, followed the Trump administration’s blacklisting of rival AI firm Anthropic and its designation as a “supply chain risk” after a dispute over safeguards. Anthropic CEO Dario Amodei had refused to concede “red lines” prohibiting the use of its Claude model for “mass domestic surveillance” or “fully autonomous weapons.”

President Donald Trump criticized Anthropic, calling them “leftwing nut jobs” and directing federal agencies to phase out the company’s technology over six months. Secretary of Defense Pete Hegseth also designated Anthropic as a supply chain risk, effectively barring Pentagon contractors from doing business with the company. TechCrunch reported on the escalating tensions.

OpenAI Responds to Criticism

In a post on X (formerly Twitter) late Monday, Altman stated that OpenAI would “explicitly” prohibit its systems from being “intentionally used for domestic surveillance of U.S. Persons and nationals.” He also indicated that intelligence agencies, such as the National Security Agency (NSA), would require a “follow-on modification” to the contract before gaining access to OpenAI’s models.

“We shouldn’t have rushed to get this out on Friday,” Altman wrote. “The issues are super complex and demand clear communication. We were genuinely trying to de-escalate things and avoid a much worse outcome, but I think it just looked opportunistic and sloppy.” The Guardian detailed Altman’s admission and the subsequent changes to the agreement.

Employee Concerns and Industry Rivalry

The deal has ignited debate within the AI community, with over 900 employees from OpenAI and Google signing an open letter expressing concerns about the Department of Defense attempting to pressure AI firms into loosening safeguards. The letter warned against the use of AI for domestic mass surveillance and autonomous weapons systems without human oversight. OpenAI’s agreement with the Department of War outlines safety measures, but concerns remain.

The situation highlights the growing rivalry between OpenAI and Anthropic, and the increasing entanglement of AI with U.S. military strategy. Despite being dropped by the Pentagon, Anthropic’s Claude bot has seen a surge in popularity, topping Apple’s U.S. App Store free app rankings over the weekend. Conversely, ChatGPT uninstalls spiked nearly 300% on Saturday, compared with a typical rate of about 9%, and a “delete ChatGPT” campaign gained traction on social media.

AI in Military Applications

AI is already integrated into Western military infrastructure, with the U.S., Ukraine, and NATO utilizing analytics platforms like Palantir. These platforms analyze large datasets using commercial AI tools, enabling “faster, more efficient, and ultimately more lethal decisions.” While NATO officials maintain that a human remains “in the loop,” concerns persist about oversight and accountability once these models are fully integrated into military networks.

Experts like Professor Mariarosaria Taddeo of Oxford University have warned that Anthropic’s exclusion from the Pentagon’s ecosystem removes a key “safety-conscious actor” from the equation. President Trump has threatened to invoke the Defense Production Act to compel compliance from AI firms and warned Anthropic of potential “major civil and criminal consequences” for non-cooperation. Reports also suggest Google is in discussions to integrate its Gemini AI model into classified Pentagon systems. BBC News provides further coverage of these developments.
