September 2025 — e-flux
An image of a downed US F-35 jet—wings destroyed, afterburner still glowing—starts to trend.
Millions of people watch a video of Tel Aviv in smoking ruins after an Iranian attack. Footage of a US bombing of Tehran circulates as breaking news of a strike that has yet to happen. Look closer, and you see strange image errors and artifacts. The grass beneath the F-35 is eerily green and untouched. Things are slightly off. All these images are AI-generated, realistic at first glance, and shared widely on social media and beyond. This is the era of AI slopaganda.
Political leaders are playing a central role in the spread of slop. Supreme Leader Ayatollah Ali Khamenei’s official X account showed an AI image of missile launches rising beneath Quranic verses, viewed over six million times.
President Trump frequently shares generated representations of himself as a rock star, a king, and a gold statue in the center of his Gaza resort.
In a sense, these images are “classic slop”: hyper-photorealistic surfaces, game-engine lighting, and geometrically impossible perspectives. It’s a reconstituted AI mash made from stock photos, commercials, Reddit, and Pinterest. Just add water.
Jean Baudrillard once said the simulation comes first and ultimately replaces the real.
This barely seems controversial. AI slop is full of revisionist realities, drawing mass attention before the real bombs drop. But in another sense, AI slop isn’t invested in the order of events or even looking like reality. The slop is not the territory: it just smothers it in synthetic goop. It’s flooding the zone with AI shit.
The AI slop aesthetic is more than a technical artifact or cultural curiosity. It’s the consumer-facing product of computational capitalism—created by transforming human culture into a standing reserve of data. Each uncanny image, synthetic influencer, and AI-generated article is part of a much wider transformation. Slop is waste, but it’s also fuel.
Slop is the product of a different kind of Krebs cycle.
It begins when AI models are trained by devouring billions of images, videos, and texts from the internet, using neural networks with trillions of parameters that learn to identify patterns, relationships, and structures in the data. To digest all this requires staggering amounts of energy. Synthetic outputs are then excreted and rapidly spread online. Then they are ingested once more, and the cycle continues.
This is the latest in a long evolution of industrial media that has been recognized by theorists and artists over decades. We could think of Vilém Flusser’s “technical images” in the 1980s, or Harun Farocki’s “operational images” in the early 2000s.
Now we have “metabolic images.”
Metabolic images are visual material made by consuming billions of other images to be broken down and absorbed by AI models via a gargantuan infrastructure that absorbs energy and water, and excretes media, atmospheric carbon, and other pollutants. If we consider all the forms of generative AI, including text, images, code, and video, it is fundamentally metabolic media. It has a burn rate that is astronomical and growing, and is now directly competing with humans for basic resources like land, energy, minerals, and fresh water.
I. The Rift
Karl Marx once described the shift from rural to urban living under capitalism as a disrupted metabolic process.
In pre-industrial societies, waste easily cycled back into production—human and animal feces from small farms returned to the earth as fertilizer, maintaining soil fertility and sustaining agriculture. But industrial modernity broke this cycle. As masses of people migrated to cities, the human waste that once nourished fields began accumulating as urban pollution.
People threw excrement into the streets, depriving rural soils of essential nutrients. Soils became depleted, which then required the introduction of artificial fertilizers, exacerbating a cycle of environmental degradation. John Bellamy Foster later called this phenomenon a “metabolic rift”: the systemic disruption of ecological and metabolic processes by capitalist production.
Sociologists like Jason W. Moore trace these metabolic rifts back to capitalism’s origins in the sixteenth century, and they resurface in new forms as each phase of industrial development reorganizes the planet.
The turn toward massive, data-heavy generative AI is driving a new and dramatic metabolic rift. It’s affecting the environment, patterns of work, supply chains, culture, and visuality itself. And it’s filling information ecosystems with torrents of shit, endless digital detritus which is never properly flushed out. Instead, it is fed back into the system as raw material for the next generation of AI models. This is the basic material cycle of AI’s metabolism: consumption and excretion patterns where AI-generated images have material consequences and consequences for our materials.
Now the widespread use of generative interfaces like ChatGPT, Claude, and Gemini is supplanting web-based information sources. Why bother going to a news site or Wikipedia if Claude can summarize it for you? This diverts users away from the web as it was, toward AI-mediated information platforms. The labor of choosing between lists of different websites is supplanted by the convenience of being given a single, synthesized answer.
This shift is cannibalizing the markets that sustain online content production. Human content creators face dwindling readership, reduced engagement, and disappearing ad revenue. It’s another metabolic rift, severing the relationships between search companies like Google, advertisers, and the broader content economy. In its place, the slop tide will rise. Cursed domains of fake news, psyops imagery, and synthetic influencers designed to generate rage clicks will multiply and capture the paid sponsorships that once belonged to humans.
All this poses a problem for AI companies as well. Multiple studies have shown that AI systems degenerate when they are fed on too much of their own outputs—a phenomenon researchers call MAD (Model Autophagy Disorder).
In other words, AI will eat itself, then gradually collapse into nonsense and noise. It happens slowly at first, then all at once. The researchers compare it to mad cow disease.
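The mechanism can be sketched with a toy model. The Python simulation below is an illustration of the general phenomenon, not a reproduction of the cited papers’ experiments: a trivial Gaussian “model” is repeatedly refit to samples generated by its predecessor, and the compounding estimation error drives the distribution’s diversity toward zero.

```python
import random
import statistics

def fit(samples):
    # "Train": estimate a Gaussian's mean and std from the data.
    return statistics.mean(samples), statistics.stdev(samples)

def generate(mean, std, n, rng):
    # "Generate": sample synthetic data from the fitted model.
    return [rng.gauss(mean, std) for _ in range(n)]

rng = random.Random(42)
data = generate(0.0, 1.0, 10, rng)  # generation 0: "human" data, std = 1.0

stds = []
for generation in range(1000):
    mean, std = fit(data)                # train on whatever data exists
    stds.append(std)
    data = generate(mean, std, 10, rng)  # the next model trains on outputs

# Estimation error compounds generation after generation: the fitted std
# drifts toward zero, and lost diversity is never recovered.
print(f"std, generation 1:    {stds[0]:.4f}")
print(f"std, generation 1000: {stds[-1]:.2e}")
```

The collapse is slow at first, then runaway, which is the shape of the degradation the researchers describe.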
The industry is scrambling for solutions: engineering higher-quality synthetic data for their digital feeding operations and supplementing training datasets with content from underpaid human laborers. For humans, a new slop economy is being born. From an infrastructural perspective, the whole cycle is threatening to push past what electrical grids, mines, and water tables can withstand.
II. Ingest, Digest, and Excrete
There’s an Apple commercial for iPads which serves as an unintentionally perfect metaphor for AI content ingestion.
A massive hydraulic press slowly pulverizes musical instruments, video games, books, paint canisters, a typewriter, a camera. Each object collapses under relentless pressure. The ad shows the utter inevitability of the process: no instrument or artwork can tolerate the weight. Centuries of culture and creativity are crushed into a fine paste for AI training. This is the AI maw in action: the mass harvesting of images, words, videos, all compressed into the latent space of neural networks. Everything we’ve created or will create, once digitized, is material for the feedlots.
The volume of data required to train generative AI systems is rapidly approaching the limits of what exists on the open web. Some researchers say this will happen by 2026.
Elon Musk argues that we have already reached “peak data.” The AI industry guards the exact details, but reports suggest earlier GPT-3 models were trained on more than 45 terabytes of compressed internet content, at least 3.2 trillion words scraped from websites, books, articles, and social media posts created by millions of human authors. The latest public datasets like Public Domain 12M contain more than 12.4 million images with text captions.
AI companies now acquire entire social media platforms and negotiate exclusive deals with libraries and academic journal publishers, but it may not be enough to meet the growing appetites of AI models.
Digesting all that media requires the energy of an industrialized country. Estimates of how much electricity will be needed for generative AI are now routinely expressed in the unit of a nation-state. The International Energy Agency says that by 2030, AI will consume as much electricity as Japan today.
Bloomberg predicts that AI will need as much power as India by the same date.
But AI’s demands are growing faster than any individual country.
Each generative AI query burns at least ten times more power than traditional web searches. Text-to-image generation is even more energy-intensive, and video generation is vastly more demanding still. This escalating industrial demand follows the recursive logic of AI development itself: larger models demand more data, which enables even larger models, producing higher resolution outputs, requiring more computing power, consuming more energy—and repeat.
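The scale implied by the “ten times” claim can be made concrete with back-of-envelope arithmetic. The per-query figures below are commonly cited (and contested) estimates, and the query volume is a hypothetical assumption, not a measurement:

```python
SEARCH_WH = 0.3          # assumed energy per conventional web search (Wh)
AI_QUERY_WH = 3.0        # assumed energy per generative AI query (Wh), ~10x
QUERIES_PER_DAY = 1_000_000_000  # hypothetical: one billion AI queries a day

ratio = AI_QUERY_WH / SEARCH_WH
daily_gwh = QUERIES_PER_DAY * AI_QUERY_WH / 1e9  # Wh -> GWh

print(f"per-query ratio: {ratio:.0f}x")
print(f"daily demand at one billion queries: {daily_gwh:.1f} GWh")
```

Under these assumptions, text queries alone draw three gigawatt-hours a day, the output of a one-gigawatt power plant running for three hours, before a single image or video is generated.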
Consider the latest AI hyperscale data center being built in Abilene, Texas for OpenAI, code-named “Project Ludicrous.” When completed, it will span more than seventeen football fields. Elon Musk’s xAI facility in South Memphis, called Colossus, which was built in 2024, has already hit the limit of the local electrical grid. The company installed methane gas generators that emit toxic nitrogen dioxide and formaldehyde into the air. The predominantly Black community living in the region already experiences high levels of asthma and respiratory disease.
And this is merely the beginning. President Trump’s Stargate Project is planning at least twenty giant data centers across the US.
The industry calls them AI factories.
Then there’s water. Like digestive fluid, hyperscale data centers require continuous cooling systems that consume millions of gallons daily. With billions worldwide using generative AI, aquifers of drinking water are being evaporated to cool down AI chips. This represents a vast atmospheric redistribution of hydrological resources taken away from source ecosystems and dispersed as vapor.
Finally, there is a new critical minerals “supercycle” driven by burgeoning AI infrastructure, which is accelerating environmental damage.
Rare earth elements essential for GPUs, predominantly mined and processed in China, leave behind lakes of radioactive waste. Lithium for data center backup systems comes from salt flats in Chile and Bolivia, disrupting local water systems and threatening Indigenous land rights. The cobalt for batteries relies on mining operations in the Democratic Republic of Congo that exploit child labor and operate with minimal environmental oversight.
The expansion of computational capital reproduces the core-periphery dynamics of earlier empires—extracting from the Global South to enrich the North—while excavating billions of years in geological time to increase the speed of algorithmic convenience by milliseconds. After feeding at this planetary scale, the industrial beast now releases a tide of slop.
III. Slop Economics
How are the humans who live downstream of AI models adapting? Do we swim with or against the rivers of synthetic shit? If the metabolic rift decimates our ecosystems—including the information environment—whatever will become of the influencers? They’re transforming into coprophages: digital dung-eaters that consume and monetize slop.
This metamorphosis is already underway. AI-powered personas like Lil Miquela achieve engagement rates far beyond human influencers.
Their creators systematically A/B test visual aesthetics, personality traits, and content forms to maximize dwell time. These characters appear increasingly indistinguishable from living humans while being ruthlessly optimized for audience response and commercial effectiveness. Content creation teams deploy AI tools to generate hundreds of potential posts, images, and videos, then use engagement analytics to identify the most addictive combinations. Instagram accounts are dedicated to AI-generated “rage bait” influencers engineered to provoke, offend, and harvest outrage. These synthetic personalities are continuously evolving based on performance data, with unsuccessful traits eliminated and successful characteristics amplified before being spawned into new accounts.
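The optimization loop described above is, in effect, a crude evolutionary algorithm. The sketch below is purely hypothetical: the trait names, the engagement function, and every number are invented for illustration, and no real platform analytics are involved. It simply shows the shape of the cycle: spawn persona variants, score them on a noisy engagement signal, keep the winners, mutate, repeat.

```python
import random

rng = random.Random(7)
TRAITS = ["wholesome", "outrage", "luxury", "asmr", "conspiracy", "cute"]

def engagement(persona, rng):
    # Hypothetical stand-in for real engagement analytics: some traits
    # "hook" audiences more than others, plus random noise.
    hook = {"outrage": 3.0, "cute": 2.0, "asmr": 1.5}
    return sum(hook.get(t, 1.0) for t in persona) + rng.gauss(0, 0.5)

def mutate(persona, rng):
    # Swap one trait for a random alternative.
    new = list(persona)
    new[rng.randrange(len(new))] = rng.choice(TRAITS)
    return new

# Each round, keep the best performers and spawn mutated copies:
# "unsuccessful traits eliminated and successful characteristics amplified."
population = [[rng.choice(TRAITS) for _ in range(3)] for _ in range(20)]
for _ in range(50):
    scored = sorted(population, key=lambda p: engagement(p, rng), reverse=True)
    survivors = scored[:5]
    population = survivors + [mutate(rng.choice(survivors), rng) for _ in range(15)]

best = max(population, key=lambda p: engagement(p, rng))
print("best-performing persona:", best)
```

Run the loop and the population converges on whatever the metric rewards. In this toy, that is outrage, which is roughly the point.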
This industrialization of viral content is a well-established practice under platform capitalism. Instagram doesn’t reveal how much it pays creators once their content spreads widely, but recent reporting suggests it’s roughly $120 per million views.
The system proves so indiscriminate that AI-generated images now earn creators hundreds of dollars per viral hit, transforming Meta’s $2 billion creator fund into an inadvertent subsidy for synthetic content. The platform has thus engineered the perfect economic incentive structure for slop: rewarding whoever can most efficiently capture human attention, regardless of whether it’s ASMR fruit-cutting videos, bunnies jumping on trampolines, or war propaganda. Meta calls this “supporting creators,” but with more people monetizing generative AI, it’s sponsoring slop factories.
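At the reported rate, the arithmetic of a viral hit is simple. The view count below is a hypothetical example; only the payout rate comes from the reporting:

```python
RATE_USD_PER_MILLION_VIEWS = 120   # reported Instagram payout rate
views = 5_000_000                  # hypothetical viral slop post

payout = views / 1_000_000 * RATE_USD_PER_MILLION_VIEWS
print(f"${payout:.0f} for one viral hit")
```

A few million views yields a few hundred dollars. Generated at near-zero marginal cost, dozens of such posts a month become a meaningful income stream.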
The global geography of synthetic content production reveals the international division of labor underlying these new economic relationships. Content farms in regions with lower labor costs deploy AI tools to generate thousands of synthetic social media posts, news articles, and product reviews targeting higher-value advertising markets in wealthy nations. It’s a well-oiled form of synthetic arbitrage: the extraction of ad revenue from developed economies through mass production of AI slop in low-income regions. Will AI slop farming ultimately exhaust the soil of online spaces? Will people flee the slop, tolerate it, or even welcome it? Will we ultimately come to love slop?
IV. Wither Slop?
AI slop is not an aberration but an inevitable feature of how generative media works. Yet the current arrangement is inherently unstable. The metabolic rift between humans and AI threatens multiple forms of failure: model collapse, ecological collapse, and cognitive collapse. This approach to AI cannot sustain itself indefinitely. The question is not whether slop economies will destroy themselves, but when.
This instability creates space for what we might call “post-synthetic” cultural organization: forms of culture operating under different metabolic logics, whether by design or necessity. Such alternatives would demand the fundamental transformation of existing technical and economic arrangements—alternative energy systems, cooperative ownership models, and cultural protocols that resist algorithmic ingestion. This requires not a nostalgic return to pre-digital cultural forms, but practices adequate to our ecological and social moment.
The question that remains is whether we will choose to embrace slop or cultivate desire for something else entirely: cultural forms moving beyond synthetic reproduction, oriented toward flourishing rather than extraction; toward ecological regeneration rather than a hyper-metabolic state of exhaustion and breakdown.
1. Matthew Gault and Emanuel Maiberg, “The AI Slop Fight Between Iran and Israel,” 404 Media, June 18, 2025, ➝.
2. Ali Khamenei (@khamenei_ir), “‘Help from Allah and an imminent conquest’ (Holy Quran 61:13),” X, June 16, 2025, ➝.
3. Dana Nickel, “AI Slop Spreads in Israel-Iran War,” Politico, June 23, 2025, ➝.
4. Jean Baudrillard, Simulacra and Simulation, trans. Sheila Faria Glaser (Ann Arbor: University of Michigan Press, 1994).
5. The traditional Krebs cycle occurs in most forms of life that use oxygen to generate energy, whereby the body breaks down nutrients like glucose in a series of chemical reactions. During this process, carbon dioxide is released as waste and energy is captured in molecules that cells use to power their activities.
6. Vilém Flusser, Into the Universe of Technical Images (Minneapolis: University of Minnesota Press, 2011); Harun Farocki, Eye/Machine I–III, 2001–03. Video installation trilogy.
7. See Kate Crawford in conversation with Paola Antonelli, “Metabolic Images,” Aperture, Winter 2024, ➝.
8. Karl Marx, Capital: A Critique of Political Economy, vol. 1, trans. Paul Reitter (Princeton: Princeton University Press, 2024).
9. John Bellamy Foster, Marx’s Ecology: Materialism and Nature (New York: Monthly Review Press, 2000).
10. Jason W. Moore, “Environmental Crises and the Metabolic Rift in World-Historical Perspective,” Organization & Environment 13, no. 2 (June 2000): 123–57.
11. Sina Alemohammad et al., “Self-Consuming Generative Models Go MAD,” arXiv preprint, 2023, ➝; Ilia Shumailov et al., “AI Models Collapse When Trained on Recursively Generated Data,” Nature 631 (2024): 755–59.
12. Todd Spangler, “Why Apple’s iPad Ad Fell Flat: Company Failed to Understand It Conjured Fears of ‘Tech Kind of Destroying Humanity’,” Variety, May 16, 2024, ➝.
13. Pablo Villalobos et al., “Will We Run Out of Data? Limits of LLM Scaling Based on Human-Generated Data,” arXiv preprint, 2024, ➝.
14. Kyle Wiggers, “Elon Musk Agrees That We’ve Exhausted AI Training Data,” TechCrunch, January 8, 2025, ➝.
15. Jordan Meyer et al., “Public Domain 12M: A Highly Aesthetic Image-Text Dataset with Novel Governance Mechanisms,” arXiv preprint, 2024, ➝.
16. “AI Is Set to Drive Surging Electricity Demand from Data Centres While Offering the Potential to Transform How the Energy Sector Works,” International Energy Agency, April 10, 2025, ➝.
17. Ian King, “AI Computing Is on Pace to Consume More Energy Than India, Arm Says,” Bloomberg, April 17, 2024, ➝.
18. Bracey Harris et al., “Up Against Musk’s Colossus Supercomputer, a Memphis Neighborhood Fights for Clean Air,” NBC News, May 15, 2025, ➝.
19. See Stargate Project announcement: “Trump announces ‘Stargate’ AI infrastructure project with ‘colossal data centers’,” posted January 21, 2025, by NBC News, YouTube, ➝; James Walker, “Opening the Stargate: Tech Giants to Invest $500B in AI Data Center Infrastructure,” Data Center Knowledge, January 22, 2025, ➝.
20. See Tatiana Carayannis, Naima Kane, Marie-Therese Png, and Alondra Nelson, Summary Report: Workshop on the Geopolitics of Critical Minerals and the AI Supply Chain (Science, Technology, and Social Values Lab, Institute for Advanced Study, 2025), ➝.
21. Matt Klein, “The Problematic Fakery Of Lil Miquela Explained—An Exploration Of Virtual Influencers and Realness,” Forbes, November 17, 2020, ➝.
22. Dexter Thomas, “Inside the Economy of AI Spammers Getting Rich By Exploiting Disasters and Misery,” 404 Media, January 22, 2025, ➝.
