Privatization of propaganda. How disinformation became a global business

Operations to influence public opinion in other countries were traditionally strictly controlled by states. Over the past decade, disinformation and manipulation have become services offered for a fee by companies with expertise in fields such as intelligence, military or marketing, an analysis shows.

What used to be a “troll factory” can be done today with a laptop. PHOTO: Profimedia

“A discreet revolution has taken place in the world of propaganda. Operations that were once run by authoritarian governments and intelligence agencies are now being outsourced to private firms that sell disinformation and manipulation as services,” say specialists from EUvsDisinfo, the European External Action Service's portal tasked with dismantling pro-Kremlin disinformation operations.

From fake armies on social media to smear campaigns driven by artificial intelligence, disinformation and Foreign Information Manipulation and Interference (FIMI) have become a global business, giving authoritarian regimes new ways to influence others – and deny it all, the analysis says.

From state propaganda to on-demand disinformation

For decades, information operations were strictly controlled by states. The Soviet Union perfected the art of disinformation; later Russia institutionalized it through modern digital operations such as the Internet Research Agency (IRA).

But in the last decade, this model has been commercialized. Disinformation and deception have become services offered for a fee by companies with experience in fields such as intelligence, military or marketing. These firms, which operate worldwide, sell complete packages of FIMI campaigns that include fake social media campaigns, hacking, data leaks and “narrative management” to spread fake and manipulated content in democratic countries.

This outsourcing provides both efficiency and plausible deniability. Authoritarian states actively seek to transfer information operations to private intermediaries while shielding themselves from diplomatic and legal consequences.

Through this model, malicious actors can experiment with risky tactics such as AI-generated content, hacking or deepfakes – operations that would be politically or diplomatically explosive if carried out directly by state institutions. Thus, they can target foreign populations with customized influence campaigns while maintaining plausible deniability by declaring that they have no ties to the private entities running them.

Outsourcing also enables information laundering — concealing the true origin of disinformation through private firms, fake accounts, and proxy media. As these actors repeat and amplify the message, it begins to feel organic and local. Thus, malicious actors can spread targeted narratives denying any involvement.

It is the informational equivalent of hiring mercenaries: the client enjoys the results without taking the blame.

Team Jorge and the commercialization of manipulation

The 2023 Forbidden Stories press investigation into an entity called “Team Jorge” exposed the inner workings of this new ecosystem of bespoke influence. The firm claimed to have interfered in 33 presidential elections, winning 27 of them. Its clients included political parties, corporations and, apparently, actors with state ties.

At the heart of “Team Jorge's” system was Advanced Impact Media Solutions (AIMS), software capable of creating and coordinating thousands of fake social media accounts, complete with synthetic photos, biographies and personal histories. These avatars could be mobilized to flood debates, spread narratives or harass opponents.

Russia continues to be a major player in this outsourced ecosystem. Private companies like the Social Design Agency (SDA) and Structura now run large-scale influence operations that mirror and, in many ways, replace the functions of the old St. Petersburg troll factories. These firms manage cloaked online assets, promote state-aligned narratives, and provide the Kremlin with an additional layer of plausible deniability.

Undercover journalists recorded Team Jorge demonstrating hacking techniques, media infiltration and the planting of fake news. The scale of these operations, and their accessibility to paying customers, showed how disinformation has become a global commodity.

“What once required a troll factory and an entire building in St. Petersburg now only requires a laptop”

Modern influence campaigns no longer exist only online; they operate in the hybrid space between digital and physical reality.

The Internet Research Agency (IRA) demonstrated this in the 2016 US election, when Russian operatives posing as US activists organized real rallies, paid participants and coordinated online amplification. What started as a war of memes ended as physical mobilization.

Today's hybrid operations combine hacking, covertly funded local influencers and covertly controlled media platforms. The operators of these campaigns build credible-looking news sites and influencer personas to introduce tailored narratives into the public sphere. Once released, these narratives mix with authentic content and spread across both digital and traditional media, making the manipulation difficult to detect.

The original model of troll factories – hundreds of young people posting manually in shifts – is being replaced by AI-based automation.

Systems like Team Jorge's AIMS, or newer tools powered by large language models, can now manage thousands of fake accounts and generate multilingual content tailored to target audiences in real time. AI allows operations that once required hundreds of people to be run by a few operators, or even a single person. What once required a troll factory and an entire building in St. Petersburg now requires only a laptop.

Asymmetric information warfare

The emergence of these influence-for-hire firms has created a new strategic imbalance: asymmetric information warfare.

In this asymmetry, autocracies get maximum reach with minimum risk. At home they are shielded by censorship and state control; abroad, by plausible deniability. Democracies, however, are more exposed: bound by transparency and the rule of law, they face maximum vulnerability with limited defensive means.

This imbalance is not only political but also structural. Authoritarian regimes can use disinformation and AI tools to shape global narratives, influence elections abroad, and undermine trust while avoiding direct accountability. Democracies, on the other hand, must defend themselves on open networks designed for freedom of expression.

The stakes for democracy: truth becomes optional and accountability hard to establish

These operations are already reshaping political realities. On-demand influence firms have targeted elections in Africa, Europe and Latin America. Disinformation campaigns amplify polarization, delegitimize media outlets and exploit social divisions to weaken democratic cohesion.

The commercialization of disinformation risks creating a global gray area where truth becomes optional and responsibility hard to establish. As AI tools become cheaper and more capable, these operations will likely continue to grow in scale and sophistication.

Recognizing this asymmetry and responding with resilience and regulation is the only way to prevent truth itself from becoming a commodity.



Ashley Davis