Peter Thiel quipped in 2024 that artificial intelligence "seems to be worse for the math people than the word people." He probably didn't anticipate that just two years later, his Palantir co-founder and its CEO, Alex Karp, would use decidedly cruder language to describe people he considered stupid.
“If Silicon Valley thinks we’re going to take away everyone’s white-collar jobs… then you’re going to screw up the military — and if you think that’s not going to lead to the nationalization of our technology, you’re retarded,” Karp said during a speech at the a16z American Vitality Summit. “You’re probably particularly retarded because you have an IQ of 160.”
Karp was weighing in on a question taking the world of artificial intelligence by storm: on what terms should AI companies work with governments? A closer look suggests his frustration traces back to a squabble between the Pentagon and two of those companies: Anthropic and OpenAI.
He made the remarks during a breakout session titled "AI Defending the West," moderated by a16z General Partner Katherine Boyle.
"If Silicon Valley thinks we're going to take white-collar jobs away from everyone — basically Democrats, people you probably grew up with, highly educated, who went to elite schools or almost-elite schools of one sort or another — and then you're going to screw the military, and you don't think that's going to lead to the nationalization of our technology, you're retarded."
Wow. So what’s bothering Mr. Karp?
While Karp could have chosen less offensive language to make his point, he struck a raw nerve, one that is deeply personal for Palantir. "You can't have technology that takes away everyone's jobs and at the same time be seen as screwing the military," he said. For Palantir, this tension is not abstract: it is an operational crisis playing out in real time.
Companies such as Anthropic, OpenAI, Google, and xAI all hold contracts with the Department of Defense, and each restricts, through its terms of service, the environments in which its technology can be used. The Department of Defense has been pressing AI companies to lift these restrictions and allow their technology to be used for "all lawful purposes." Karp has little patience for companies that treat that demand as a moral red line:
"There is a difference between the U.S. military and surveillance," he told the summit. "Palantir is an anti-surveillance company, no matter what anyone thinks," he said, pushing back on the claim that the company, named after the all-seeing stones of The Lord of the Rings, is fundamentally in the surveillance business. Karp believes every technologist knows this, but the proverbial "people online" simply have the wrong idea, "so I end up participating in every conversation I don't want to be a part of."
Anthropic CEO Dario Amodei famously said he could not "in good conscience" accept the "all lawful purposes" clause. Then, after the Pentagon reportedly threatened to designate Anthropic a military supply chain risk, the government struck a deal with OpenAI to use its tools in classified missions. (Anthropic is reportedly back in talks with the Pentagon, which identified Anthropic's Claude Opus as key to U.S. and Israeli military preparations for a historic strike on Iran.)
For Palantir, this chain of events is not an abstract concept but a direct operational threat. Palantir's flagship Artificial Intelligence Platform (AIP) relies on plugging state-of-the-art models into its defense and intelligence workflows. Claude Opus is among the most capable of those models, prized for its depth of reasoning and reliability in high-stakes environments. If Anthropic were blacklisted as a military supply chain risk, or if its terms of service effectively barred it from the classified environments in which Palantir operates, Palantir would lose one of its most powerful AI engines. It would be forced to restructure its platform around alternative models mid-contract, a costly and reputation-damaging disruption for a company whose entire brand promise is mission-critical reliability.
"Again, there's a lot of subtlety behind the scenes," Karp admitted. "I've been digging into the subtleties of this — what can be deployed and where it can be deployed."
Karp believes the stakes go far beyond any one Pentagon contract or any one company's policy decisions. He warned: "The danger for our industry is that there will be the famous horseshoe effect, where people agree on only one thing, and that is that this doesn't pay the bills and that people in our industry should be nationalized."
In Karp's telling, if AI companies strip white-collar workers of their livelihoods while simultaneously refusing to serve the military, then a populist convergence, in which both left and right turn against tech, becomes inevitable. He was specific about who those workers are: "Primarily the Democratic type of people you probably grew up with — highly educated people who went to elite schools, or almost-elite schools, of one party or another."
These concerns are already playing out at an economic scale, lending urgency to Karp's argument. Experts warn that white-collar work faces a destabilizing, AI-driven upheaval that could push large numbers of workers out of their jobs. These aren't just doomsday musings; they have real-world consequences, as when a viral report from Citrini Research triggered significant market turbulence.
In Karp's view, governments will not allow AI companies to amass the power they already hold while continuing to operate under self-regulation, free of government oversight, let alone dictate terms of use to the government itself. "This is where the road goes," he said simply. For companies like Palantir, the only way to keep their standing, their contracts, and their access to the cutting-edge AI models that power their platforms is to comply with government rules when required. For Palantir, losing that seat at the table would mean more than bad optics; it would mean losing the technical inputs that make its core product work.
It would be a dramatic reversal for a company that a month ago delivered what Karp called “one of the truly iconic results in the history of corporate performance or technology” in Palantir’s latest quarterly earnings report.
This story originally appeared on Fortune.com