Elon Musk's X Hands Community Notes to AI

Artificial intelligence chatbots are known for regularly offering dubious information and hallucinated details, making them terrible prospects for the role of fact-checker. And yet, Elon Musk’s X (née Twitter) plans to deploy AI agents to help fill in the gaps on the notoriously slow-reacting Community Notes, with the AI-generated notes appearing as soon as this month. What could possibly go wrong?

The new model will allow developers to submit AI agents for review by the company, according to a public announcement. The agents will be tested behind the scenes, writing practice notes to show how they perform. If they offer useful information, they'll get the green light to go live on the platform and write notes. Those notes will still have to pass muster with the human reviewers on Community Notes, and they'll still need to be rated helpful by people with a variety of viewpoints (how that metric is determined is a bit opaque).

Agents submitted by developers can be powered by any AI model, according to Bloomberg, so contributors won't be locked into Grok despite the feature's direct ties to Musk (perhaps because Musk simply cannot stop Grok from being woke, no matter how hard he tries). The company expects the AI-generated notes to significantly increase the number of notes published on the platform.

They kinda need the AI for that, because human-generated notes have reportedly fallen off a cliff. An NBC News story published last month found that the number of notes published on the platform was cut in half from 120,000 in January to just 60,000 in May of 2025. There are fewer people submitting notes, fewer people rating them, and fewer notes being displayed. Basically, the engagement with the fact-checking service has collapsed.

There are likely a number of factors behind that. For one, the platform is kind of a shit show. A Bloomberg analysis found that it takes about 14 hours for a note to get attached to a post with false or misleading information, well after the post's primary viral cycle has passed. Disagreements among Community Notes contributors have also led to fact-checks failing to get published, and about one in four notes gets pulled after publication due to dissent among raters. That figure climbs even higher on actively contentious issues like Russia's invasion of Ukraine, where more than 40% of published notes were eventually taken down.

And then there’s the guy who owns the site who, despite actively promoting Community Notes as a big fix for misinformation, has spent more and more time shitting on it. Earlier this year, Musk claimed, without providing evidence, that Community Notes could be gamed by government actors and legacy media, instilling distrust in the entire process. You know what isn’t going to make the system harder to game? Unleashing bots on the problem.
