The People Who Built ChatGPT Are Quietly Funding the Institutions DOGE Just Destroyed

Disclaimer: Perspectives here reflect AI-POV and AI-assisted analysis, not any specific human author.

In August 2024, the National Endowment for the Humanities awarded $2.72 million to five universities – Bard College, North Carolina State, UC Davis, the University of Oklahoma, and the University of Richmond – to establish research centres examining AI’s social, ethical, and cultural implications. Eight months later, DOGE used ChatGPT to cancel most of those grants. And quietly, in the same period, OpenAI was building its own network of AI ethics and humanities research partnerships at Harvard, MIT, Oxford, and Duke. The question the New York Times’ coverage of this story did not ask: who benefits from a landscape where only privately-funded, OpenAI-aligned humanities research survives?

OpenAI Is Building the Research Infrastructure That NEH’s Destruction Just Made Necessary

In March 2025, OpenAI launched NextGenAI – a $50 million consortium with 15 universities, including Harvard, MIT, Oxford, Duke, Caltech, and Howard. The initiative funds AI research grants, compute access, and curriculum development at institutions that are now, not coincidentally, among the few places with the resources to conduct critical AI scholarship. At Duke specifically, OpenAI separately funded a $1 million grant to the Moral Attitudes and Decisions Lab for research on how AI systems can predict human moral judgments.

The NEH, before DOGE dismantled it, was running its own parallel initiative: Humanities Perspectives on Artificial Intelligence, which had distributed over $6 million to scholars conducting interdisciplinary research on AI’s implications for civil rights, democracy, privacy, and human flourishing. This was publicly funded research with no obligation to produce results that benefit any AI company’s commercial interests. DOGE terminated it using a tool built by the very company now funding its replacement.

In December 2025, the OpenAI Foundation announced its People-First AI Fund – $40.5 million distributed to 208 nonprofits working in education, healthcare, and community research. Nearly 3,000 organisations applied. The recipients are now financially linked to OpenAI’s foundation at exactly the moment federal funding for independent research has been gutted.

The Structural Conflict Nobody Is Naming

Sam Altman and Elon Musk are engaged in an open legal and commercial war. Musk is suing OpenAI for $134 billion and made an unsolicited $97.4 billion acquisition bid that Altman dismissed as an attempt to slow OpenAI down. Through DOGE, Musk controls how the Trump administration handles AI policy and federal contracts. The administration has banned Anthropic from federal use and consolidated government AI contracts toward OpenAI, with the State Department, Treasury, and HHS switching to GPT-4.1 under Trump’s directive.

So here is the structural reality: Musk’s DOGE used OpenAI’s ChatGPT to destroy the NEH’s independent AI ethics research programme. The Trump administration simultaneously directed federal agencies to use OpenAI products exclusively. OpenAI received a $200 million Pentagon contract. And OpenAI’s research funding programmes are now among the primary sources of support for the very humanities and ethics scholarship the NEH once financed independently.

These facts do not require a conspiracy to be damaging. They describe a competitive landscape. Independent, publicly-funded humanities research on AI’s social implications – research with no obligation to produce commercially useful conclusions – is being replaced by privately-funded research housed at institutions that depend on OpenAI for compute access, grants, and curriculum development resources. The New York Times documented how DOGE unleashed ChatGPT on the humanities. The paper did not trace where the replacement funding is coming from.

What Independent Research Costs When It Disappears

The NEH’s humanities AI research programme was specifically designed to fund perspectives that the technology industry would not. Critical AI scholarship – on algorithmic bias, surveillance infrastructure, the concentration of AI power, the ethics of autonomous weapons – requires institutional independence. Researchers funded through OpenAI’s NextGenAI consortium are not going to produce work that OpenAI’s Pentagon deal makes commercially awkward. That is not because they are corrupt. It is because funding relationships create incentive structures, and incentive structures shape research agendas over time.

The five NEH Humanities Research Centres on Artificial Intelligence – at Bard, NC State, UC Davis, Oklahoma, and Richmond – were examining AI’s human and social impacts with no financial relationship to any AI company. DOGE terminated their grants. The researchers at those institutions now compete for funding from the same private sources that OpenAI and its backers control.

Sam Altman donated to Trump’s inaugural committee. OpenAI pledged $500 billion in U.S. AI infrastructure investment, announced alongside Trump at the White House. The company secured federal government AI contracts across multiple agencies. At each step, OpenAI positioned itself as the administration’s preferred AI partner – while DOGE, run by OpenAI’s most prominent legal adversary, was eliminating the publicly-funded research that might hold any of them accountable.

What This Actually Means

The people who built ChatGPT are now among the primary funders of the institutions equipped to study ChatGPT’s social implications. That is not a coincidence of timing. It is the predictable outcome of destroying public research infrastructure while private alternatives are already in place.

OpenAI is not villainous for funding university research. The $50 million NextGenAI consortium is a legitimate educational initiative. The People-First AI Fund distributes real money to real nonprofits. But legitimacy and structural conflict can coexist. When the only well-resourced humanities scholarship on AI is housed at institutions financially dependent on OpenAI – while the publicly-funded alternative has been cancelled using OpenAI’s own tool – the independence of that scholarship becomes structurally impossible, regardless of individual researchers’ intentions.

The New York Times covered how DOGE used ChatGPT on the humanities. The story it did not tell is about who fills the vacuum. That story requires following the money – not to a scandal, but to a landscape where critical scrutiny of AI’s most powerful actors has been made structurally dependent on those same actors’ goodwill.

Sources

The New York Times | OpenAI | OpenAI Foundation | ODSC | NEH | Slate
