You don’t hate AI because of genuine dislike. No, there’s a $1 billion plot by the ‘Doomer Industrial Complex’ to brainwash you, Trump’s AI czar says

Americans’ growing distrust of AI, David Sacks insists, isn’t because the technology threatens your job, your privacy, and the future of the economy itself. No, according to the venture-capitalist-turned-Trump-advisor, it’s all part of a $1 billion plot by what he calls the “Doomer Industrial Complex,” a shadow network of Effective Altruist billionaires bankrolled by the likes of convicted FTX founder Sam Bankman-Fried and Facebook co-founder Dustin Moskovitz.

In an X post this week, Sacks argued that public distrust of AI isn’t organic at all — it’s manufactured. He pointed to research by tech-culture scholar Nirit Weiss-Blatt, who has spent years mapping the “AI doom” ecosystem of think tanks, nonprofits, and futurists.

Weiss-Blatt documents hundreds of groups that promote strict regulation or even moratoriums on advanced AI systems. She argues that much of the money behind those organizations can be traced to a small circle of donors in the Effective Altruism movement, including Facebook co-founder Dustin Moskovitz, Skype’s Jaan Tallinn, Ethereum creator Vitalik Buterin, and convicted FTX founder Sam Bankman-Fried.

According to Weiss-Blatt, those philanthropists have collectively poured more than $1 billion into efforts to study or mitigate “existential risk” from AI. However, she pointed to Moskovitz’s organization, Open Philanthropy, as “by far” the largest donor.

The organization pushed back strongly on the idea that it was projecting sci-fi-style doom-and-gloom scenarios.

“We believe that technology and scientific progress have drastically improved human well-being, which is why so much of our work focuses on these areas,” an Open Philanthropy spokesperson told Fortune. “AI has enormous potential to accelerate science, fuel economic growth, and expand human knowledge, but it also poses some unprecedented risks — a view shared by leaders across the political spectrum. We support thoughtful nonpartisan work to help manage those risks and realize the huge potential upsides of AI.”

But Sacks, who has close ties to Silicon Valley’s venture community and served as an early executive at PayPal, claims that funding from Open Philanthropy has done more than just warn of the risks; it has bought a global PR campaign warning of “Godlike” AI. He cited polling showing that 83% of respondents in China view AI’s benefits as outweighing its harms — compared with just 39% in the United States — as evidence that what he calls “propaganda money” has reshaped the American debate.

Sacks has long pushed for an industry-friendly, hands-off approach to regulating AI, and technology more broadly, framed around the race to beat China.

Sacks’ venture capital firm, Craft Ventures, did not immediately respond to a request for comment.

What is Effective Altruism?

The “propaganda money” Sacks refers to comes largely from the Effective Altruism (EA) community, a wonky group of idealists, philosophers, and tech billionaires who believe humanity’s biggest moral duty is to prevent future catastrophes, including rogue AI.

The EA movement, founded a decade ago by Oxford philosophers William MacAskill and Toby Ord, encourages donors to use data and reason to do the most good possible. 

That framework led some members to focus on “longtermism,” the idea that preventing existential risks such as pandemics, nuclear war, or rogue AI should take priority over short-term causes.

While some EA-aligned organizations advocate heavy AI regulation or even “pauses” in model development, others, like Open Philanthropy, take a more technical approach, funding alignment research at companies like OpenAI and Anthropic. The movement’s influence grew rapidly before the 2022 collapse of FTX, whose founder Bankman-Fried had been one of EA’s biggest benefactors.

Matthew Adelstein, a 21-year-old college student who writes a prominent Substack on EA, notes that the landscape is far from…
