Local investigative reporting will make money again » Nieman Journalism Lab
Everyone knows the eulogy: Information deserts. Hollowed newsrooms. “Benevolent” billionaire owners. Fickle non-profit donors. All leading to the ceaseless shuttering of local and regional newsrooms across the country.
And it’s true. In many ways, local investigative journalism — the kind that takes time, costs money, has uncertain outcomes, and can truly make an impact — seems well and truly fucked.
But the elegiac pallbearers of print may be too pessimistic. My prediction for 2026? In the era of disinformation and AI slop, hard-hitting, deeply reported investigative stories relevant to real people and real communities will once again be seen as valuable. Not just journalistically, but economically as well.
This isn’t a pollyannaish take. I’m not suggesting that people will suddenly stop turning to TikTok for news, or that AI won’t displace commoditized types of reporting. Agents can crawl Nextdoor and Facebook. Scrapers can track city hall calendars. LLMs can be tasked with monitoring police blotters.
But our increasing understanding of what AI can do well — summarize, synthesize, and automate — also throws into relief what it can’t do, at least not yet: Actual truth-seeking.
It can’t break a story a corporation is trying to bury, find a fact that is intentionally being hidden, or sidle up next to a source at a bar to hear something new — something nowhere on the internet or in a machine learning training corpus.
That’s because AI relies exclusively on information that already exists. Which, over time, will become more and more of a problem, as an increasing share of the data LLMs ingest will be written by LLMs in the first place, creating a recursive feedback loop of content with epistemically dubious provenance. A copy of a copy of a copy.
Without fresh, verified facts entering the system, AI struggles to get smarter.
This is why old school reporting — boots-on-the-ground, first-hand, verified — may become the scarcest input in the information economy: the rare earth minerals of the AI age, dug up by human beings.
But who is going to pay for that work, amid declines in advertising, subscriptions, and philanthropic support? At Hunterbrook Media, we think we’ve found one way.
Our model pairs investigative journalism from our newsroom with two other businesses: a fund, Hunterbrook Capital, that can invest based on Hunterbrook Media’s reporting; and a litigation business, Hunterbrook Law, that can use our reporting as the basis of lawsuits against bad actors.
Many of our stories don’t lead to litigation or trading, but those that do can be potent enough to support the entire platform. And every article includes disclosures about how Hunterbrook could make money from its reporting.
Basically, we get paid to be right, to be rigorous. Because if the story is wrong, the investment or the lawsuit fails. Markets and courts, while severely flawed, are still fairly good at getting to the truth. And equally important, because our model doesn’t depend on how many clicks we get, we can tell stories that matter to specific communities, geographies, or sectors — not just those that appeal to the lowest common denominator to pick up ad dollars. (We have yet to publish an investigation into Taylor Swift.)
One recent example took us to Danville, Illinois, a struggling, post-industrial town in the Midwest, where we spent four months talking to dozens of locals and former workers at a plant owned by Viscofan, the world’s largest producer of artificial meat casings. (A literal “how the sausage is made” story.)
We obtained scores of photos and videos from inside the factory. We analyzed EPA records, property data, and emissions reports. We mapped obituaries of former workers who died with cancer, neurological disease, and cardiovascular damage. We talked to people who aren’t terminally online and pulled files that ChatGPT and Gemini can’t access. We flew a drone over the factory.
What we found: a company poisoning a town. Clean Air Act violations. Stack fans disconnected from air scrubbers, venting fumes directly into the atmosphere, an almost literal smoking gun. A busted union.
No AI told us this. No algorithm surfaced it. The only way to know was to go.
And then, the impact.
A newspaper in Spain read our investigation and found that Viscofan was polluting in Europe, too. The company hired a law firm to investigate its practices. It lost hundreds of millions of dollars in market cap, forcing its shareholders, executives, and board members to focus on the issue.
(In the article, we disclosed: Hunterbrook Media’s investment affiliate, Hunterbrook Capital, did not have any positions related to the article at the time of publication. But based on Hunterbrook Media’s reporting, Hunterbrook Law was in conversations with firms regarding potential litigation on behalf of victims.)
The Danville story isn’t an exception. It’s a template.
We’re now building a nationwide database of polluters poisoning American communities — which has led to other exceptional local investigative reporting as well.
And we’re hiring (email: [email protected]) more reporters who will do what we did in Danville: show up, ask questions, and tell stories that matter. It’s the kind of journalism that creates value precisely because it cannot be replicated by machines. That’s also why our newsroom has collaborated with everyone from ProPublica to Pablo Torre: the work we do is best in class.
And while Hunterbrook is monetizing this reporting through trading and litigation, I believe that in 2026, newsrooms will begin to capitalize on high-quality reporting in all kinds of new ways. Think: AI licensing, intelligence, whistleblower bounties, and other business models we haven’t even envisioned yet. The common denominator: the profit engine isn’t how many ads or subscriptions you sell, but the value of the information you find.
And hey, along the way, you might reach some eyeballs, too — including on the AI platforms that are desperately in need of well-researched, well-sourced reporting on the ground, all across the world.