In the platform age, influence is no longer about persuasion alone; it is about reach, engagement, and spectacle.
In March 2026, a series of unusual videos began circulating widely across social media. They depicted American President Donald Trump and Israeli Prime Minister Benjamin Netanyahu as LEGO-style characters, placed in surreal, often disturbing wartime scenarios. Set to catchy, AI-generated music, these clips blended humour, satire, and horror—showing bombed schools, toy soldiers marching into rivers of blood, and miniature coffins draped in national flags.
Some of these videos were even aired on Iranian state television, while others spread rapidly online through accounts claiming to be independent creators. The origin of the content was often unclear: it could have been produced by state-linked entities, loosely affiliated groups, or opportunistic digital actors. What was evident, however, was its impact: the videos travelled fast, reached millions, and sparked widespread engagement.
The ambiguity surrounding such content is not incidental—it is central to how modern propaganda operates. Groups like the so-called “Explosive News Team” claim independence even as their narratives align closely with state messaging. Meanwhile, official accounts increasingly adopt similar visual languages, blurring the boundary between grassroots creativity and government propaganda. This convergence complicates accountability. When platforms remove such content, they must judge whether it constitutes manipulation, propaganda, or simply viral art.
What distinguishes this new wave of propaganda is not just its message but its form. War is no longer communicated solely through speeches, reports, or traditional media. It is packaged as memes, animations, and short videos that borrow heavily from popular culture. The White House itself has experimented with formats merging military footage with video game aesthetics, referencing titles like Call of Duty or Grand Theft Auto. In India, television channels often termed “Godi Media” have adopted similar strategies, reporting war updates with the cadence of sports commentary—sometimes even reducing human loss to scoreboard tallies.
At the heart of this transformation lies the logic of the attention economy. On social media, the value of content is determined not by accuracy or authority but by its ability to capture attention. The most successful content combines familiarity with shock—recognisable formats placed in unexpected contexts. A LEGO animation of a bombing or a meme-styled war clip is more likely to be watched, shared, and discussed than a conventional news report. Crucially, users do not need to agree with such content to amplify it; they only need to find it compelling.
This evolution has been a decade in the making. As early as 2015, ISIS was producing highly stylised recruitment videos that borrowed from gaming and cinematic aesthetics to appeal to younger audiences. In 2020, China’s Xinhua News Agency released a LEGO-style animation critiquing the United States’ handling of the COVID-19 pandemic. Russia later adopted similar strategies in its campaigns around Eastern Europe. In India, the BJP’s IT cell experimented with meme-driven propaganda, soon replicated by mainstream media outlets. Across these examples lies a shared understanding: in the digital age, influence depends less on what content says than on how far and fast it travels.
Satire and humour have become particularly potent tools in this landscape. They are engaging, easily digestible, and difficult to counter. A factual rebuttal to a meme or animated parody often appears slow, overly serious, and mismatched in tone. As a result, spectacle tends to outpace substance. Viral content reaches wider audiences than detailed reporting, shaping perceptions before facts can catch up. Recent figures illustrate this imbalance: social media videos related to ongoing conflicts have generated billions of impressions—far exceeding the reach of traditional news coverage. For many users, war is encountered first as content, and only later, if at all, as verified information.
Generative AI has accelerated this shift. It enables rapid production of high-quality content at minimal cost, allowing both state and non-state actors to flood platforms with competing narratives. Attribution becomes difficult, as governments, proxy groups, and independent creators produce similar content, often reinforcing one another’s messages. For platforms, this raises difficult questions: What constitutes propaganda? Where is the line between creativity and manipulation? And who decides?
The effectiveness of modern propaganda cannot be measured solely by whether it changes minds. Its impact is more subtle: it shapes the environment in which people interpret events. Viral content influences what feels important, what appears credible, and what emotions are associated with a conflict. It creates a shared atmosphere defined by spectacle, repetition, and emotional resonance. In this context, propaganda is not just about persuasion—it is about occupying attention.
As the French sociologist Jacques Ellul argued decades ago, propaganda evolves with the systems that carry it. In today’s algorithm-driven ecosystem, it increasingly takes the form of content designed to travel fast, far, and wide. The implications are profound. When memes, animations, and viral clips become the primary medium through which people encounter complex geopolitical realities, the line between information and entertainment dissolves. The question is no longer just what people believe, but what they see—and how often they see it. In an age where virality determines visibility, the most powerful message is not necessarily the most truthful one. It is the one that travels the furthest.
AI-driven propaganda marks a shift where attention, not accuracy, shapes how conflicts are understood. As war is consumed increasingly through memes and viral content, complex realities risk being reduced to spectacle. This blurs the line between information and entertainment, making it harder to distinguish truth from manipulation. Addressing this challenge requires not only platform accountability but also stronger media literacy among users. Ultimately, the danger is not just that propaganda spreads; it is that it reshapes how reality itself is perceived. And in an age where the meme often becomes the message, the struggle for attention may prove just as consequential as the struggle for truth.
---
*Freelance content writer and editor based in Nagpur; co-founder of TruthScape, a team of digital activists fighting disinformation on social media