The feverish search for the next “disinformation” silver bullet continues as several elections are being held worldwide.

Censorship enthusiasts, who habitually use the terms “dis/misinformation” to go after lawful online speech that happens not to suit their political or ideological agenda, now feel that debunking has failed them.

(That can be yet another euphemism for censorship, as when “debunking” political speech means removing information that those directly or indirectly in control of the platforms don’t like.)

Enter “prebunking.” However risky the method may be, especially when applied in a democracy, its supporters are not swayed even by the possibility that it may not work.

Prebunking is the distinctly dystopian notion that audiences and social media users can be “programmed” (proponents prefer the term “inoculated”) to reject information as untrustworthy.

To achieve that, speech must be discredited and suppressed as “misinformation” (via warnings from censors) before, not after, people see it.

Some legacy media reports call this “a radical playbook,” while at the same time implicitly justifying it as a necessity in a year that has been systematically hyped as particularly dangerous because of the elections taking place around the globe.

The Washington Post disturbingly sums up prebunking as exposing people to “weakened doses of misinformation paired with explanations (…) aimed at helping the public develop ‘mental antibodies’.”

This type of manipulation is supposed to steer the “unwashed masses” toward the right conclusions (that is, the ones desired by the “prebunkers”) as they decide whom to vote for.

Even as this is seen by opponents as a threat to democracy, it is being adopted widely – “from Arizona to Taiwan (with the EU in between)” – under the pretext of actually protecting democracy.

Where there are governments and censorship these days, there’s inevitably Big Tech, and Google and Meta are mentioned as particularly involved in carrying out prebunking campaigns, notably in the EU.

Apparently, Google will not be developing Americans’ “mental antibodies” ahead of the US vote in November; that might prove too controversial, at least for now.

The risk-reward ratio here is also unappealing.

“There aren’t really any actual field experiments showing that it (prebunking) can change people’s behavior in an enduring way,” said Cornell University psychology professor Gordon Pennycook.