Trump-appointed judges refuse to block Trump blacklisting of Anthropic AI tech
Appeals court denies Anthropic's emergency motion for a stay.
A federal appeals court refused to halt the Trump administration's efforts to blacklist Anthropic yesterday, denying the company's emergency motion for a stay. But the court granted the US-based AI firm's request to expedite the case and will hold oral arguments on May 19.

The ruling by the US Court of Appeals for the District of Columbia Circuit was issued by a panel of three judges appointed by Republicans, including Trump appointees Gregory Katsas and Neomi Rao. Katsas previously served as deputy counsel to the president during Trump's first term, while Rao served in the Trump administration's Office of Management and Budget.

The judges' decision is a setback for Anthropic, but it's only one of two cases the company filed against the Trump administration, and the AI firm has had more success in the other one.

Anthropic says it exercised its First Amendment rights by refusing to let Claude AI models be used for autonomous warfare and mass surveillance of Americans, and that Trump and Defense Secretary Pete Hegseth blacklisted it in retaliation. Trump directed all federal agencies to stop using Anthropic technology, and Hegseth labeled Anthropic a "Supply-Chain Risk to National Security," prohibiting military contractors from doing business with Anthropic.