Editor’s note: Because D’Souza’s proposal centers on transparency and accountability, we are publishing the full transcript of our conversation, which took place on April 14, 2026. This transcript has been lightly edited for clarity and length. You can read the related story here.
Aron D’Souza: I actually took a lot away from our conversation last week. And I think it’s very intellectually interesting — if you had all the resources in the world, how would you reinvent journalism, or let’s say truth-telling, to improve the quality of our society?
Rebecca: Let’s start at the top. What is your company that you’re launching? What’s it called? How much backing have you got? Who are your backers, and what problem is it trying to solve?
Aron: Let me start with the problem. In 1970, according to Gallup, courts, scientists, and journalists were all about equally trusted — between 70 and 80% of Americans trusted those three institutions. Today, trust in courts is about the same, scientists have gone down a little bit since the COVID pandemic, but trust in journalism has plummeted from 70% to 30% in 50 years. That’s the core problem I’m trying to solve. The company we’re building is called Objection, and it uses AI plus human investigators to fact-check any public reporting. It’s the first accountability system that’s been created for journalism writ large. This builds on the experience I had leading the Gawker lawsuit for Peter Thiel, where it took 10 years and $10 million for Hulk Hogan to get justice. That’s too slow and too expensive for most people to access. I thought to myself, let me build a system that allows everyone to get access to justice, access to fact-finding, access to truth, much cheaper and more efficiently. We’ve raised a multimillion-dollar seed round with Peter Thiel, Balaji Srinivasan, and a few other venture capitalists.
Rebecca: Okay, and you’re launching this week?
Aron: We are launching tomorrow.
Rebecca: So this is a platform that gives people access to object to reporting?
Aron: Yes. The first question is a philosophical one: What is truth? I like to say that truth is not a vibe, truth is a process. The two very good processes for finding truth in our society are courts and the scientific method. Adversarial courts, where you have a plaintiff and a defendant, often with two opposite sides of the story, with mountains of evidence being adjudicated by an impartial judge. On the other hand, the scientific method, which is about the repeatability of an experiment — if you’re going to write a story about the launch of Objection and you’re an objective journalist, ideally another objective journalist should write an almost identical story. These are two very good methods of finding truth, and ones that are very trusted by our society, which can be useful tools for reimagining how we can find truth. Because this is ultimately the most difficult problem our society faces. We cannot have a civilization without a shared sense of truth, and we don’t have an agreed system of truth-finding. I’m trying to build one.
Rebecca: Why is this not just a better-funded, AI-enabled version of Pravda? In 2018, after getting some negative coverage about Tesla, Elon Musk went on a Twitter rant about wanting to build a reputation system for essentially disciplining critics.
Aron: As far as I know, he never even built that.
Rebecca: Right, so this feels similar.
Aron: The criticism that billionaires are involved in this project — well, virtually every media outlet is owned by a billionaire. Your own outlet is owned by Apollo Global Management, Leon Black, who has his own checkered history. [Disclaimer: TechCrunch is no longer owned by Apollo.] Your answer to that is a division of editorial and advertising responsibilities, and I think in a similar way one can answer that criticism: There is a division between my investors and the software that we are building.
Rebecca: That’s not my question. My question was: this is very similar to Pravda, and it’s something you’re actually implementing.
Aron: Pravda was never built. He went on this Twitter rant, and I thought it was intellectually very interesting, because one of the core tenets of journalism is that journalists hold power to account. Who holds journalism to account? We would all be uncomfortable if that were government; you end up in a Chinese Communist Party-style apparatus. So what is the effective, private-sector-driven approach? The important material difference — because Pravda never published white papers and I don’t really know how they would have approached it — is that we are building a trustless system. I use the analogy of Encyclopedia Britannica versus Wikipedia. Britannica said: we have high editorial standards, we only have the most robust professors from Oxford and Cambridge writing our articles, and we have a storied editorial board, and a serious publication process. Wikipedia came up with a radically different approach: don’t trust the people, trust the software. By allowing everyone to contribute as easily as possible, it was proven — after only a few years — by the journal Nature that Britannica had more errors than Wikipedia. [Fact check: The Nature report found that Wikipedia was just about as good as Britannica, not that it had fewer errors.] We are building something that requires no trust in me, no trust in my investors. It requires trust in a process that is fully documented on our website, and you can download all the technical white papers.
Rebecca: You say trust the system, but the system itself is built and governed by people, including you. What could make someone trust this system?
Aron: Just look at the technology, look at the process. We don’t say we don’t trust Wikipedia because we don’t trust Jimmy Wales as a person.
Rebecca: No, but I don’t know who Jimmy Wales is. I do know who Peter Thiel is, and I know who Balaji Srinivasan is. Peter Thiel funded a lawsuit that bankrupted a media company, and you led that lawsuit. Why should journalists trust a system that he backs to be neutral?
Aron: Would you call the accountability brought upon Gawker a negative? They published an unauthorized sex tape, and a free and independent jury in Pinellas County—
Rebecca: That wasn’t my question. I’ll answer yours if you answer mine.
Aron: Okay — reiterate the question for me.
Rebecca: Why should journalists trust a system that is backed by Peter Thiel and run by the person who brought down a media company? Balaji Srinivasan is also pretty anti-institution, pro-network-states. This isn’t necessarily neutral infrastructure. It’s coming from actors with a history of hostility towards the press.
Aron: I think it’s a history of healthy skepticism. And the institutions that have been built to fact-check — like ProPublica, etc. — haven’t actually done a very good job. There’s been a lot of money put into fact-checking, particularly on social media, for the last decade.
Rebecca: What makes you think they haven’t done a good job? I mean, PolitiFact exists, ProPublica exists — I don’t think they’re out here publishing mistakes or misinformation.
Aron: If they were actually effective at what they were doing, the decline in trust in media would have ceased.
Rebecca: I agree to disagree there. I think the fact that trust in media has dropped doesn’t necessarily mean there’s no truth in news media. The problem is perception, and that perception has been shaped by powerful people with large audiences claiming “fake news” because they don’t like news reporting on their bad behavior.
Aron: But then why has that same pattern of behavior not affected trust in, say, science or courts? President Trump, for example—
Rebecca: We’ve got MAHA going on. I think there is a lot of mistrust of science today.
Aron: That’s not borne out by the evidence. If you look at the Gallup poll on trust in science, courts, and journalists — scientists did take a hit after the COVID pandemic. But courts, for example: the president, who has a much larger audience than the Supreme Court Justices, is constantly critical of the court system, yet trust in courts remains remarkably strong — even, I would say as a lawyer, too high, because although courts are very good at getting to the truth, that process is very expensive and very slow.
Rebecca: So the process is also slow and expensive?
Aron: The one thing we can definitely agree on is that lawyers don’t deserve to get paid $2,000 an hour for what they do and for dragging out what should generally be a very simple process into something that takes decades. As someone who is a lawyer and a legal academic — when you actually see the practical consequences of how a defamation or libel trial works, it’s disgusting. The entire legal system is designed to benefit the lawyers. Everything takes longer than it should, everyone’s billing by the hour, and you end up with a situation where individuals have no access to justice. Peter Thiel’s client Hulk Hogan was a single-digit millionaire and he couldn’t afford to sue Gawker. We can discuss the merits of Gawker and the First Amendment implications, but at the end of the day, someone who’s a veritable celebrity should be able to access justice, and I think that is a major failure of the current environment.
Rebecca: That’s not the journalist’s fault. Publishing someone’s nudes nonconsensually — I don’t call that journalism. Your platform seems to be focused mainly on journalists. You say this is about truth and accountability. Is this really aimed at journalists, or is this anyone with mass influence, including podcasters?
Aron: You’ll see there is a live test case about Joe Rogan on the platform. Joe’s show is one I’ve been on, and his format engenders trust in his audience in a new and unique way. But he does have a very large audience and exerts a large amount of power, so it would be inappropriate to exclude someone like that. Podcasters, YouTubers, TikTokers — under-30s trust TikTok as much as they trust the New York Times for their news. It’s scary.
Rebecca: Why is your pitch centered on holding journalists accountable rather than on the people and platforms with larger audiences and looser standards? Journalists do have a code of ethics. It’s not that journalists aren’t held accountable — there is a peer review process happening within organizations, anyone can bring a defamation suit, anyone can respond with rebuttals or reporting that counters their arguments.
Aron: Those are fair points. But the nature of social media is that many platforms have addressed the content creator misinformation problem. Community Notes on X have been very effective.
Rebecca: Community Notes are also used for articles that are posted.
Aron: And Community Notes is a really good example of a trustless system — the entire source code is available to read on the X website. I will challenge you to read it, because it is complicated, but it’s extremely well thought out and nuanced in such a way that no one person can control. Yes, there are lots of problems out there. I’ve chosen to focus on legacy media in particular because the gatekeeping of saying there are editorial standards and robust internal processes doesn’t mesh well with the modern world. When I talk to journalists off the record over a glass of wine, they always say it was great to be a journalist in the ’80s — big expense accounts, large budgets, a lot of time for in-depth investigative reporting. Because of structural changes in news media over the last 30 to 40 years, that’s just not possible. Part of that economic change has meant incentives have changed. In 1990, the job of a writer at the New York Times was to sell subscriptions. Now, journalists have to be very focused on getting clicks, individual headlines, what goes viral — they’re beholden to algorithms. That has fundamentally changed the incentives.
Rebecca: I feel like my job is being mansplained a little here. That’s not necessarily how we operate. Of course you want a headline people will want to read, but I don’t sit here thinking, what story is going to go viral and get me a lot of clicks? I sit here thinking, how can I hold power to account? How can I report accurately and truthfully about trends that I’m seeing? A lot of our stories don’t do well, and a lot of them do — it’s actually really unclear what makes stories perform. Sometimes the most mundane story generates the most views. There’s some truth to what you say, of course. But just because some models have shifted toward more click-driven stories doesn’t mean there’s necessarily less truth within the story itself.
Aron: When budgets were larger and incentives were different, it was much easier to write a truthful, non-clickbaity story, not beholden to algorithms. And it’s really disappointing that journalists in America earn less than a new Uber driver. It’s an important profession that serves the public interest — it’s literally the only profession outside of government enumerated in the Bill of Rights.
Rebecca: It’s the Fourth Estate. You say it’s an important profession. Can a journalist fully comply with your system while protecting a source’s identity?
Aron: I actually thought about that after our last conversation, and there’s a technological solution here. I’d love your opinion on it because I asked my team to build it. And that’s the beauty of building software: you can iterate rapidly.
Protecting a source’s identity is a vital part of telling an important story, but there’s a real power asymmetry: the subject gets reported upon, but there’s no way to critique the source. And there’s no capacity to observe the editorial process, because as outsiders, we are not privy to those editorial meetings where editors, lawyers, etc. have approved the use of that source. What I’ve asked my engineers to build is a cryptographic hash. You could come onto Objection, upload information about the recording from the source, pieces of identity verification, etc. The AI will read that set of data and say, okay, this is high-quality reporting — and Rebecca, you are hereby issued a certificate to say you can use that anonymous source in a certain way, and it’s been independently verified in a fully trustless, open source system. There are some limitations from an engineering perspective — it requires AI models, so if you’re writing about Sam Altman, we couldn’t pass that data into OpenAI. But there are enough foundational models that the problem could be solved.
Rebecca: Okay, let’s take a step back for a minute because I don’t know that we’ve fully explained the model to the reader. You can pay to object to a piece of content — whether that’s an article, a podcast, a YouTube video. Once that objection is up, there’s a combination of human and AI…?
Aron: Human investigators, mostly ex-law enforcement professionals: CIA, FBI, MI6 agents.
Rebecca: Have you hired these people already?
Aron: Yes. It’s basically like a fleet of Uber drivers — contractors and freelancers, just like most journalists are freelancers. There are some investigative journalists too who have worked at important publications. They go in and investigate the case. They look at the article line by line, sentence by sentence, claim by claim. Anyone who’s been quoted on the record gets a call: did you actually say that? Was it taken out of context? Is it a full, fair and accurate analysis? Importantly, every piece of information they capture goes into a public data room. So even a simple article, like Joe Rogan endorsing ivermectin, now has hundreds of pieces of evidence attached to it.
Rebecca: Okay. When you get to the point where there are anonymous sources — which I should add are important for getting out stories that are truthful, and for whistleblowers who want to hold power to account, who want to say if there’s something bad happening in business or in government — what happens there?
Aron: We have a ratings rubric for evidence. A level one piece of evidence is an unimpeachable primary source document, something that could be tendered in court. All the way down to a level five piece of evidence, which is a rumor.
Rebecca: Most reporters will not report on rumors.
Aron: Most won’t, but there have been plenty of rumored capital raises and deals reported on in my lifetime.
Rebecca: Capital raise deals are not rumors. Those are based on someone with inside knowledge of the deal leaking that information, usually for their own gain.
Aron: Yeah, sometimes it’s a banker, or—
Rebecca: A company wants to drum up some support for their next raise, whisper whisper.
Aron: And there are often bad incentives going on there. The anonymous source can be cryptographically measured in this fully open source model.
Rebecca: So that would require the journalist to provide information about the source to your platform. But that would never happen. What do you mean “fully open source model”? I would have to provide the name of the person?
Aron: Or as much information as you feel comfortable providing, and it would just give an evidence score on the back.
Rebecca: So this addition was based on our previous conversation? Because previously you said that using anonymous sources equals weaker evidence, and that will give the journalist a lower rating.
Aron: Using a fully anonymized source that hasn’t been independently verified would lead to that, yes.
Rebecca: Anonymous sources are anonymous for a reason. You tend not to provide identifying information about them publicly because it could reveal who they are. A source will always be someone we vet as trustworthy. Sometimes you can mess up. My reporting process is to determine who this source actually is, how they would have that information, whether I can have eyes directly on primary source documents — and they share this with me only because they know none of that information is going public and they’re not going to lose their job or face retaliation.
Aron: Rebecca, you work at a high-quality outlet.
Rebecca: The outlets you have a problem with are also high quality.
Aron: Well, Gawker, for example—
Rebecca: You can bring up Gawker all you want. I don’t know that a celebrity rag that published Hulk Hogan’s dick pics is the standard by which we want to be critiquing journalists writ large. Would the Pentagon Papers —
Aron: What about Fox News, The Sun, The Daily Mail.
Rebecca: I don’t spend a lot of time reading them, so I can’t say whether or not they are inaccurate. Are they biased? Yes. Do they choose to report certain information over other information? I would say so, and that is the same for publications on the left. Does that make individual journalists less truthful, or hinder the fact that you still need anonymous sources to hold power to account and protect whistleblowers? Like would the Pentagon Papers pass your system? The Facebook Files? The Uber Files?
Aron: Let’s ask the question in court. In the system designated by the will of the people and the Constitution — the judicial apparatus, from the Supreme Court on down — are anonymous sources allowed?
Rebecca: Yes, actually. There are DOJ and court processes that do allow for internal, non-shared sources.
Aron: Only in extremely rare proceedings, never in a commercial trial. Usually it’s about a victim of sexual abuse, a minor, or national security-related proceedings. What’s called an in-camera hearing, or in Britain a Star Chamber. And it’s worth pointing out that in those proceedings, both the prosecution and the defense get access to that source. They get to cross-examine them in private. There is no situation in court where only one side is allowed to present evidence the other side can’t see.
Rebecca: But there’s a fairness doctrine in journalism. It’s not that only one side gets to present — when I have information from a source I’m protecting, I don’t just publish it. I will go to the company and say, here is the information I have. I’m not saying who I got it from, but this is what I plan to publish. What is your response?
Aron: That’s the unique power asymmetry of an anonymous source, because you can only critique the output of the information, not the credibility of the source. If they were on the record, it’s much easier.
Rebecca: There’s still a balancing act that needs to happen. Can you give me a concrete example of a real investigative story that your system would improve?
Aron: I was reading a story in the Financial Times this morning about an early investor in OpenAI — it didn’t quote who that individual was — who was concerned about their business model. This is the number one story in the FT today, and it’s completely based on an anonymous source. We’re not talking about national security here. We’re not talking about protection of a minor.
Rebecca: In a lot of ways, you kind of are. A lot of people think AI has an existential risk to national security — if everything is reliant on systems we can’t control or understand, that’s a national security risk. OpenAI has been sued multiple times after minors used its chatbots and then allegedly committed suicide with the aid of those chatbots. To have an early investor say they don’t agree with the business direction of OpenAI is to bring about a larger conversation: can we agree that incentives change a business structure? You said that for news — it’s the same for big tech. Incentives to keep you clicking, keep you responding, cause a platform that is not necessarily designed to benefit all humanity but to generate more attention.
Aron: What you’re talking about is the ticking time bomb thesis. What is a national security threat — is it an abstract risk far in the future, or is it a terrorist with a bomb walking the streets of New York? Where do you draw the line?
Rebecca: The point is that I don’t see why journalist-protected sources should only relate to an imminent national security threat. They’re there for holding businesses accountable for their actions, too.
Aron: The standard set by our constitutional body — the courts — is that the use of an anonymous source is not allowed.
Rebecca: But journalism surfaces truths before they make it to courts.
Aron: Journalism surfaces truth before it makes it to court partly because courts are so inefficient and slow. That’s one of my core criticisms.
Rebecca: But also there are so many journalists out there. Courts are slow and there’s a whole due process, but they have different goals. For courts: legal liability. For science: reproducibility. Journalism is about public accountability. Different goals require different structures.
Aron: And then who designs and authorizes these structures?
Rebecca: The media has been going for a long time. It has taken a long time to develop the methods required to hold themselves accountable and hold each other accountable.
Aron: I think that’s a fundamental problem, for an institution to hold itself accountable is not true accountability.
Rebecca: They hold themselves accountable because they don’t want to get sued. And the whole world holds journalists accountable. You can publish something and have very powerful people, like yourself, respond.
Aron: I’ll accept that I’m powerful-adjacent.
Rebecca: Journalism has layered verification: multiple sources required, primary documents, rigorous internal fact-checking, legal team review, public accountability through courts and public scrutiny, rebuttals, competing reporting, Community Notes.
Aron: And we’re building a system that feeds into that ecosystem. Like Community Notes, that has been a very powerful tool in truth verification, wouldn’t you agree?
Rebecca: Sure, yeah. But you can’t reproduce anonymous sources. They’re anonymous for a reason. Journalists have a duty of care, and they spend years developing those sources. They are deeply personal. This is not something AI can replicate. This is not something another journalist could just slot into. And by downgrading the credibility of anonymous sources, the model inherently feels like it would chill whistleblowing. What would you say to critics who say this is an attempt by big tech billionaires to silence whistleblowers?
Aron: The criticism of big tech billionaires is a moot one — most media outlets are owned by multi-billionaires.
Rebecca: Okay, taking the billionaires away. How do you respond to the idea that this is an attempt to silence whistleblowers?
Aron: It’s an attempt to fact-check; it’s the same as Community Notes. The wisdom of the crowd plus the power of technology to create new methods of truth-telling. Ultimately, when public trust in journalism has declined so much over the last 50 years, someone should be saying there’s something going on here, and there’s a need for a major institutional reform and rethink. I didn’t cause the decline in trust in journalism. This has been happening for five decades. Take a step back. Don’t think of yourself as a journalist. Think about having a blank slate in front of you. How would you create the perfect system using the technology we have today, the distribution mechanics we have today, the advent of artificial intelligence? You wouldn’t just replicate a private equity billionaire owning a media outlet and saying, trust me, this editorial committee is the best way to do it.
Rebecca: I don’t know that I would do what you are doing. I don’t know that I would build something that deters whistleblowing. I’m sure there are ways to improve journalism — I would give it a whole lot more funding. I trust my colleagues and peers to adjudicate. A lot of us went to journalism school. It is kind of a trade, but there is a code of ethics you must adhere to.
Aron: But there’s no professional licensure. It’s not like being a doctor or a lawyer — you have to pass an exam set by the state, be a member of the Bar Association or the AMA.
Rebecca: Which keeps it even more open. I thought you weren’t a fan of gatekeeping. I’m curious — who will use this the most in its first year? Who do you imagine is the customer?
Aron: The customers are individuals who’ve been misrepresented by the media. I’ve done thousands of interviews over the last two years for the Enhanced Games, and it’s amazing — sometimes they spell my name wrong, get basic details wrong.
Rebecca: And for readers, the Enhanced Games is your other company — basically a version of the Olympic Games that allows for enhancements via drugs or supplements. Go on.
Aron: When there are basic errors, that concerns me.
Rebecca: Basic errors are human. But was there a specific moment — an article that mischaracterized you as a person?
Aron: Sometimes, yes, certainly.
Rebecca: I mean, journalism is up for interpretation. There is no one truth, especially when things are evolving constantly. Not everything’s binary.
Aron: That’s actually the agree-to-disagree point. I don’t believe we are condemned to live in a post-truth era. There are still facts.
Rebecca: There are still facts. But AI determining truth assumes it’s always objective, extractable, computable. Many journalistic truths are interpretive, contextual, evolving. For example: did OpenAI act irresponsibly? That’s not a binary.
Aron: According to Gallup, the principal concern Americans have about news is the blending of fact and opinion, something courts and science do very well. (Note: TechCrunch was unable to find this Gallup poll. A 2018 Pew Research Center report found Americans struggled to distinguish between fact and opinion in news media.) In a scientific article in Nature, you see method, data, and then analysis — very clear signposting that distinguishes fact from opinion. Same in court: when a judge issues an opinion, they go through the facts as plainly as possible and then deliver the opinion. That’s a really important distinction. If you ask me: Have I been mischaracterized? Yes. In questions of fact? Not that often. People say some very mean things. But that’s their opinion.
Rebecca: You say a significant portion of people who’ve been written about in the media feel misrepresented. These aren’t just everyday people; these are powerful people being written about, and maybe they don’t like it. Why wouldn’t a company like OpenAI or xAI file objections about every negative story about them?
Aron: I would actually hope that the journalists writing those stories feel confident — they’ll say, yes, my reporting will stand up to the highest levels of criticism and transparency.
Rebecca: You could say that about a lot of things. There are several advocacy groups that have criticized Elon Musk’s companies, and he has sued them into the ground. Are they operating in truthfulness? I think a lot of the time they are. But they can’t afford the legal fees. And what you’re asking — putting every article up for a score — means journalists have to do extra labor to avoid being diminished in the public eye for work they’ve already done the legwork for. How is this different from SLAPP suits? This is just cheaper and more scalable.
Aron: SLAPP suits only exist in the United States, and one of the reasons they exist is because the cost of litigation is perilous to most people. We cannot have a legal system only accessible to billionaires and the largest corporations. The fact that Elon Musk can in theory sue an advocacy group out of existence — I think that’s truly dysfunctional. Part of what Objection offers is reform of that, because they could take the battle about facts onto Objection, and it would take a couple of days and a few thousand dollars rather than decades and millions. Keeping disputes about facts out of court is the most important thing we can do, because court is not survivable for anyone. It doesn’t benefit anyone except the lawyers.
Rebecca: You say journalism’s model is broken because of bad incentives. What about the incentives of your platform? More disputes equals more revenue. Conflict becomes profitable, outrage becomes monetized. Your business model depends on people attacking journalism.
Aron: No, our business model depends on adjudication. We want to be a trusted source of adjudication, not just for media disputes but for everything. If you have a breach of contract dispute, ideally it gets arbitrated on Objection rather than in court. We make money by adjudicating. We don’t make money by selling clicks and outrage.
Rebecca: At the end of the day, you’re going to get more money the more people submit objections. How can you ensure wealthy actors aren’t just using this to pummel journalists or podcasts or whoever speaks ill of them, which is protected speech?
Aron: You could level the same criticism at legacy media. Rupert Murdoch owns one of the most powerful media outlets in the world and has arguably installed multiple prime ministers in Australia and the UK, and several presidents in the United States. Traditional legacy media suffers from that exact same issue in a different way.
Rebecca: My question is for you, though. Have you built a system that lets powerful actors continuously litigate journalism in public? Do you have any protections in place to ensure people aren’t weaponizing the system?
Aron: The system is so accessible that everyone should be able to use it.
Rebecca: What does it cost?
Aron: It costs as little as $2,000.
Rebecca: $2,000 is not accessible. Most Americans don’t have that.
Aron: You’re owned by one of the largest private equity funds in the world.
Rebecca: You know how private equity works, right? If it doesn’t make dollars, it doesn’t make sense. We’re operating under very slim margins. We do not have money. And being owned by a private equity firm doesn’t mean we have endless resources to litigate stories. What would happen is they would tell me: don’t make this a problem on my balance sheet. Pull back your reporting. That’s most likely what I would be told.
Aron: Or your private equity owners would say, wow, Rebecca, you’re doing great reporting, your trust scores are amazing, we’re one of the most trustworthy outlets out there. And that’s increasing public trust, driving up the subscriber base. You could put it right on your website: one of the most trusted outlets in the world.
Rebecca: Whether or not my overlords would be pleased about a good score doesn’t change the fact that submitting one of these objections is rather expensive.
Aron: To hire a lawyer for one hour of time at a big New York law firm is now $2,000. To prosecute a defamation case is five to ten million.
Rebecca: No, but to submit one of these stories costs $2,000. That is not accessible. We have a world where big tech companies spend millions on lobbying like it’s nothing, to pass laws favorable to their businesses. I can imagine this would just be another form of lobbying, another way to buy influence. I don’t see how normal institutions or everyday people could possibly compete with what big tech could throw at this financially.
Aron: I think $2,000 is very accessible, particularly to the people who are written about in the press — there are 150,000 of them—
Rebecca: Most Americans are living paycheck to paycheck.
Aron: If it were truly an accessibility issue, and someone who felt aggrieved by an article did not have the funds to bring an objection, we’ll give them a complimentary subscription. I’ll pay for it out of my own pocket. There are real costs associated with it — human investigators are involved, it’s not just AI powering a data center — but it’s exponentially cheaper than litigation.
Rebecca: Exponentially cheaper for powerful people who want to attack their critics. Let’s go back to the AI, because I don’t think we’ve established for the reader what it is.
Aron: It’s a jury of foundational models. All the major foundational models — OpenAI, Grok, Anthropic, Mistral, etc. — are prompted to behave as if they are everyday Americans serving as jurors. The jury system is highly trusted by courts, judges, and the general public as an effective way of finding truth. We prompt the individual models to behave as, say, a 50-year-old man in Brooklyn versus a 25-year-old woman in Portland, based on statistical evidence of the demographics of the United States.
Rebecca: Okay. How does the system handle incomplete information, conflicting testimony, or evolving facts?
Aron: That’s the beauty of the adversarial court system, which we model ourselves on. There’s always conflicting facts in an adversarial trial. Truth is not a vibe — truth is a process. Truth is the argument that is better evidenced and better structured. Judges are fallible individuals, but what they’re good at is saying: who is making the better argument in front of me? That’s about sourcing and demonstrating evidence.
Rebecca: You’re asking journalists to expose their underlying evidence. Is your platform open source? Can anyone look at how your AI makes decisions?
Aron: Yes. We want to be as trustless as possible, so the full white paper, methodology, and algorithm are all outlined in excruciating mathematical detail on our website, Objection.ai. Every case ends in a judgment where the AI models outline every step of their reasoning in as much detail as possible, and every piece of evidence they used to reach that conclusion.
Rebecca: So journalists should expose their underlying evidence. Should AI companies be held to the same standard? Should OpenAI, Anthropic, xAI open source their models, disclose their training data, show their reasoning process?
Aron: That’s a very interesting question. I’m not involved in any of those companies, but I think there is an ethical argument that powerful institutions, particularly those wielding potentially monopolistic power, have an obligation of transparency. We ask government officials to disclose every gift they receive, and the Freedom of Information Act applies to every email they send, because citizens are subjected to their monopoly power. In return, we demand transparency, and transparency reduces corruption. Similarly, organizations that wield great power should aspire to as much transparency as possible. I hope you publish this entire interview — it’d be very interesting for a portion of your audience, or they could throw it through an AI model and see if you accurately quoted me. And I do agree, subject to designing the right commercial incentives for the further development of artificial intelligence, there should be some degree of open sourcing. The fact that—
Rebecca: Some degree of open sourcing. But AI is already acting as an arbiter of truth at a much larger scale than journalism. Why should journalists be required to operate with more transparency than the systems increasingly shaping public knowledge?
Aron: That’s a very fair question, and I would say both need to aspire to the highest levels of transparency. It was a vast strategic misstep by Mark Zuckerberg to change the Facebook feed from a chronological one to an algorithmic one around 2009–2010. It exploded his ad revenue, of course. And for TikTok, that targeting algorithm is the source of all their power. What Musk almost did at X — exposing how the algorithm works, exposing how the Community Notes algorithm works — is really a powerful gesture in the interest of transparency. There has to be some dimension of commercial incentives, though. Building AI foundational models is very expensive.
Rebecca: When you say commercial incentives, what do you mean?
Aron: The analogy is drug development: the development of pharmaceutical drugs is very important socially, and science builds upon knowledge, so should the recipes be completely exposed publicly? Well, patent protection lasts for 20 years and gives a monopoly right to commercialize a drug, make a profit, and then in due course the patents expire and generics become available. There might be an argument that AI models could follow a similar convention: closed for a short period, probably measured in months not years, during which they can be exploited commercially, and then open sourced so that human knowledge can be built upon them.
Rebecca: Would you say the same thing about anonymous sources?
Aron: Using that train of logic — well, you’d say the anonymous source gets to be anonymous for a period of time.
Rebecca: No, I don’t like that. We’ll have to agree to disagree on that one. I’m curious why you chose the tribunal model of ethical judgment, which feels like a shaming mechanism against individual journalists. You mention science a lot — do individual scientists have a public truth score?
Aron: In fact, they do. It’s called an impact factor rating. The methodology is very similar structurally to what we use, based on chess Elo ratings — it’s about how well cited your article is. If other highly cited scientists are citing your article, that raises your impact factor rating. This is actually the primary metric by which scientists are judged.
Rebecca: But again, that’s peer review, replication, institutional processes. Scientists are not given a universal truth score by a startup. Why is your model closer to Yelp than to science?
Aron: Science was once a startup. In the medieval era, scientists were viewed as servants of the courts of great monarchs, exploring ideas in private. Then the Royal Society of London launched the first scientific journal and created the process of peer review. People could submit articles, they would be circulated among eminent peers who would post their comments publicly, and only after passing rigorous peer review was the article published in the Philosophical Transactions of the Royal Society, which over 300 years later is still one of the preeminent scientific journals. It took well over 100 years to move from the old model to the new. With each technological leap, there are new models of truth-telling.
Rebecca: Why should people trust your system when fact-checking organizations, Community Notes, defamation law, and basic public scrutiny already exist? I think a lot of mistrust in media is also just a fundamental misunderstanding of how media operates.
Aron: People can see the full architecture of our system. It’s fully transparent — how every decision is made, how the algorithms work, it’s not a black box. It’s fully documented, and I hope eminent mathematicians, lawyers, and academics will look at those white papers and tear them to shreds and give me feedback. We’ll improve our algorithm. We aspire to a very high level of transparency. I didn’t let journalists into the office at the Enhanced Games because we were doing something that was pushing boundaries, and I thought journalists would be very critical of the sausage-making. But if you want to come sit in our office one day, see how every decision is made, see the source code, talk about the incentives, we’d be very happy to have you.
Rebecca: Maybe I’ll take you up on that. I imagine you wouldn’t be as frank in front of a journalist as you would be in private — which is exactly the problem with whistleblowers. If they knew that their identity might be exposed or downgraded in credibility, they wouldn’t come forward. I’ve done enough interviews with enough companies to know that they hold a lot back, and that’s why we require anonymous sources to figure out what’s actually going on. But let’s not harp on that. What was your actual catalytic moment for this? Not the abstract philosophy. Was it Gawker? With your power and influence, why spend your time building this specifically instead of a transparency tool for AI companies, for social platforms, for political misinformation?
Aron: The Objection platform works for all of those — for AI, for politicians, for legacy journalists. You can take an output from ChatGPT and file an objection against it. We do hope our human-based investigations and AI adjudication will form part of AI training data, and I hope someone will file an objection against content produced by AI models. I hope Sam or Elon will reply, because I think that would be a very powerful test of the system.
Rebecca: Do you think the Pentagon Papers were not trustworthy because they relied on anonymous sources? Watergate? Deep Throat?
Aron: Those are easy examples to cherry-pick from history — situations where anonymous sourcing worked. But you’re also giving examples from a time—
Rebecca: I don’t have to cherry-pick, I can keep going.
Aron: From a time when, say, the Graham family owned the Washington Post, lavished it financially, and were great stewards of that institution. Would that same thing happen in an era of operating at a loss?
Rebecca: I have a lot of feelings about Jeff Bezos owning the Washington Post. But those aren’t just big historical cases. There have been several Pulitzer Prize-winning stories relying on anonymous sources published way more recently.
Aron: I think the glory days of investigative journalism were in an era where classified ads in particular sustained high-quality reporting, and economic incentives have changed dramatically since then.
Rebecca: I think attention has changed dramatically. We went to the moon this week and nobody cares, because there’s just too much going on. There’s still been high-quality investigative journalism. Hannah Dreier from the New York Times on migrant child labor in the U.S. The New Yorker on how felony murder laws imprison people for crimes they didn’t commit. The Washington Post on the history and impact of the AR-15.
Aron: Do you think the journalism establishment would give a YouTuber like Nick Shirley, who did amazing investigative work on Medicare-Medicaid fraud in Minnesota, a Pulitzer Prize?
Rebecca: I agree that there’s gatekeeping with prizes. But that’s about prizes — not about truthful reporting.
Aron: Well, Pulitzer Prize-winning stories are a heuristic for truth-telling and high-quality investigative journalism.
Rebecca: That’s the standard we all hold ourselves to. That’s what most journalists want to achieve: journalism that makes an impact, that protects sources, that highlights injustice.
Aron: And I agree, and I think virtually all journalists have very good intentions. But it’s the incentives of media proprietors, social media algorithms, and AI amplification that have created the problem.
Rebecca: Social media algorithms, AI amplification. So why not attack those things? If your system makes important stories highlighting injustice and holding power to account harder to publish, is that an acceptable outcome for you?
Aron: If it raises the standards of transparency and trust, that’s a good thing. If those stories which are so important to the functioning of our democracy get passed through another filter, and on the other side the public says, you know what, we trust journalism more — the fundamental point is that only 30% of Americans trust journalists today.
Rebecca: I think we’re just in a post-trust era because they don’t trust AI companies either.
Aron: I think there is a valid criticism of AI companies.
Rebecca: You’re one of them now, technically. So why would anyone trust you? And why would journalists listen? What happens if nobody gives a shit?
Aron: It’s actually a two-sided market, which is the important thing to note. Subjects of media reporting care. I’ve talked to people whose lives have been ruined, whose careers have been destroyed by one article.
Rebecca: Did they do anything wrong to incur the wrath of said article?
Aron: Sometimes they were just an easy scapegoat, an easy heuristic.
Rebecca: People highlighted in the press — sometimes there can be a disproportionate reaction, but more often than not, when you’re reporting on someone’s misdeeds, it’s unfair to blame the messenger.
Aron: But we also live in an era where it’s impossible to forget. If you did something bad when you were young in the 90s, maybe it was written about in the local newspaper on microfilm at the local library. Now, one quick Google search away, it’s the first thing that comes up.
Rebecca: I think that’s a very human thing. In tribes, if someone misbehaved, they would be banished.
Aron: There’s something very powerful about forgiveness. Think of a particular example — a young man, about 30, who applied for a job at the Enhanced Games. Really smart, passed the interview with flying colors. My head of HR Googled him and found a couple of stories from when he was in college, an allegation, never convicted. And I thought: what if a journalist found this? A sexual assault allegation from college, and the Enhanced Games hired him — it could become a whole story. We did not hire him.
Rebecca: Yeah, but sometimes things are a whole story and then nothing happens. Big Balls, for example — part of Elon Musk’s DOGE team — a young man who frequently posted misinformation, misogynist and antisemitic content. That didn’t lead to his firing, that didn’t lead to anything. There have been countless stories attacking wrongdoing and nothing happened.
Aron: I didn’t talk to Big Balls, but I did talk to another DOGE staff member — I won’t name him — a young man who had one article written about him in the news media, no other public profile whatsoever, and now when I had a meeting with a New York venture capital fund, it was the first thing that came up.
Rebecca: He’s not banished. He works at a venture capital fund. Were the articles about him truthful?
Aron: There were dimensions that were factually correct, but it did paint him in a very negative personal light. People deserve to be forgiven over time. We’re adjusting to an era where it is not possible to forget, and that’s something we need to structurally address.
Rebecca: So if someone has an accurate article, but the framing paints someone in a bad light because of truthful things they did — things people don’t agree with — or because the journalist chose to highlight certain actions and maybe missed the part where they volunteer at a soup kitchen every weekend, on your system, would they get a bad rating?
Aron: Not at all. But distinguishing fact from opinion is very important.
Rebecca: One thing that bothers me is that people think the opinion section of the New York Times is the same as reporting. That’s not the same as journalism.
Aron: I look at the Times and an opinion piece is almost intermeshed on the homepage — it’s in tiny font labeled “opinion” at the top. People say, “the New York Times said X,” as if it were fact, when it’s actually just some lobbyist’s opinion. The Gallup surveys, time and time again, show that distinguishing fact from opinion is essential.
Rebecca: But do you not agree that people have a right to their free speech? Giving them scores, attacking them at a systematic level, where people with money can really hammer you down — how does that not hinder free speech?
Aron: Do you think the Michelin Guide rating restaurants is socially negative? Chefs are artists, should they be subjected to a ratings and scoring system?
Rebecca: They have to opt in to the Michelin system, don’t they?
Aron: I don’t believe it’s an opt-in system. You can reject a Michelin star, sure, but the Michelin Guide is a very useful tool for finding good, trustworthy restaurants. That said, restaurants don’t wield power in the same way The New York Times does.
Rebecca: These are very different stakes.
Aron: But in almost every profession of great importance, there is some kind of regulator. To be a nail technician in the United States, you need a license—
Rebecca: There is a regulator for journalists. If you print lies, you will be held accountable.
Aron: That’s actually a lot less true in the United States in particular. If you publish lies, that’s not necessarily something you can be found liable for, because under New York Times v. Sullivan, you have to show—
Rebecca: Libel, defamation — you have to show the journalist had malicious intent.
Aron: Actual malice. The standard for a civil judgment in the United States is basically impossible to reach, versus in the UK, continental Europe, Australia, and New Zealand, where there is no actual malice standard — it’s much easier to get sued for printing something.
Rebecca: That’s also our First Amendment right to free speech.
Aron: It’s also very effective lobbying done in the 1970s by major media outlets.
Rebecca: Don’t talk to me about lobbying. We’ve got Greg Brockman and a lot of your friends spending tens of millions of dollars in lobbying for small congressional district seats, just to pass laws with the slightest scrutiny for AI companies — who are by some estimation about to become the arbiters of truth.
Aron: I would say yes to that — and we’re friends, right? Mark Zuckerberg has shown himself not to be an arbiter of truth. As has Rupert Murdoch. I don’t think since the 1960s that we’ve really said we trust any single arbiter of truth, and I think that’s very damaging. At any rate, I better go get dinner.
[End of interview]