Ghost stories for dark infrastructures

Peter Lemmens

1. frontend scenarios and backend scripts

“Likewise, the essence of technology is by no means anything technological”, Heidegger begins in The Question Concerning Technology. That was 1954. Technology has advanced considerably since then. However, we might still be able to discuss technology without losing ourselves in the technological.
Infrastructure is defined broadly as components of interrelated systems providing commodities and services essential to enable, sustain, or enhance societal living conditions and maintain the surrounding environment. Typically these are physical components, but they also include various software and network components. That’s Wikipedia talking.

“Digital objects […] constitute a new form of industrial object that pervades every aspect of our lives in this time of ubiquitous media—such as online videos, images, text files, Facebook profiles, and invitations.”
— Yuk Hui, On The Existence Of Digital Objects

In current Western information capitalism, information technology has become a significant part of that infrastructure. The application of computers to collect, process, store, retrieve, create, and distribute all kinds of data has become prevalent.

“Roughly 2.5 quintillion bytes of data are generated each day. There are approximately 44 zettabytes of data in the world in 2020. Given how much data is created every day, there will likely be 175 zettabytes by 2025. […] Stored data grows 5 times faster than the world economy.”
— https://financesonline.com/how-much-data-is-created-every-day/

Nobody really knows what a zettabyte is. The word even seems especially invented to describe an indescribable quantity. It does sound like a staggering amount. What is all this data? A 2020 estimate informs us:

“- as of August 2020, in one internet minute there were 41,666,667 messages by WhatsApp users.
- email users sent 306.4 billion emails per day.
- a connected car produced 4 TB of data in one day.
- 28 PB is data generated from wearable devices.”
— https://financesonline.com/how-much-data-is-created-every-day/
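Taking the quoted figures at face value, a back-of-the-envelope calculation shows both how large a zettabyte is and how loosely such statistics hang together; the numbers below simply restate the quote, nothing is measured:

```python
# Back-of-the-envelope check of the figures quoted above.
# Decimal (SI) units throughout: 1 zettabyte = 10**21 bytes.

BYTES_PER_DAY = 2.5 * 10**18   # "roughly 2.5 quintillion bytes" per day
ZETTABYTE = 10**21

days_to_generate_one_zb = ZETTABYTE / BYTES_PER_DAY
zb_per_year = BYTES_PER_DAY * 365 / ZETTABYTE

print(days_to_generate_one_zb)  # 400.0 days of traffic for one zettabyte
print(round(zb_per_year, 2))    # 0.91 zettabytes generated per year
```

Note the friction between the figures: at 2.5 quintillion bytes a day, under one zettabyte is generated per year, while the same source has the world's stored data jumping from 44 to 175 zettabytes in five years. The statistics do not quite add up, which rather confirms that nobody really knows what a zettabyte is.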

However, the focus of the story now is not a data overload. It is an attempt at a simple categorization of all this data and an oblique, speculative look at the impact of such a categorization on public life. The first question here then is: can we differentiate the data generated by a connected car from messages sent by WhatsApp users? Is an email something separate from data collected by wearable devices? Does an online receipt or shopping cart function the same for the consumer as it does for the online platform? Does a running app route on a screen serve the same function as that exact same route running on the app’s servers?
There is the data we can see on a screen and then there is the data that slipstreams alongside it in a background through various infrastructures.

“By digital objects, I mean objects that take shape on a screen or hide in the back end of a computer program, composed of data and metadata regulated by structures or schemas. Metadata literally means data about data.”
— Yuk Hui, On The Existence Of Digital Objects
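A minimal sketch of Hui's distinction, assuming an invented photo record (none of the field names come from a real schema): the visible object and the data about it travel together, but only one of them is rendered on a screen.

```python
import json

# The visible frontend object: what appears on the screen.
frontend = {"caption": "Holiday photo", "image": "beach.jpg"}

# The backend metadata: data about that data, regulated by a schema.
# All field names here are invented for illustration.
metadata = {
    "about": "beach.jpg",
    "uploaded_at": "2020-08-12T14:03:00Z",
    "device": "phone",
    "geotag": [51.05, 3.72],
}

# One digital object, two faces: the rendered data and the
# slipstreaming metadata about it.
record = {"data": frontend, "metadata": metadata}
print(json.dumps(record, indent=2))
```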

As infrastructure retreats more and more into a background and even the visible materiality of our connectedness disappears with the obsoleteness of cables, the scenarios that appear on a frontend are dissociated from the scripts that run in a backend. If we can agree that there is such a thing as visible frontend information and opaque backend information, then there are…

2. things to talk about in the dark

Darkness is a setting for stories. It actively prompts them. As darkness dissolves matter into obscurity, it is in fact a narrative generator. It immaterializes all into narratives, simply because there is nothing to see. In darkness narratives are given free rein.
There are always two sides to every story, as the saying goes.
On the frontend, let’s argue here that when infrastructure (i.e. technology) enables, sustains, or enhances societal living conditions and maintains the surrounding environment, often it is not really about the information that flows at all. What could be important to understand here is that there is only the abstract idea of information. Everything is there to suggest it. There is a riveting title, a captivating image, a clickable link. However, if with information we are referring to that visible content, information is simply replaced over and over again. There is only an infinite scroll of placeholder stories lighting up pixels on a screen. This frontend just needs to trigger and push along, not inform and linger. What we have here is a question of volume over quality. The infinite scroll and its narratives only instill anticipation for the real estate just beyond the edge of the screen. It is literally right there at your fingertips. Nudging you to click and move on, the increments of investment are small, but quickly add up. This moment of anticipation is consolidated into a state of perpetual beta.
We could make a claim that this actually renders the frontend an information-poor environment, even if it very much looks like the opposite.

“But the entirety of this archive is not adapted to human perception, or at least not to individual perception. Like all large-scale databases —including WikiLeaks’ Syria files— it takes the form of a trove of information without (or with very little) narrative, substantiation, or interpretation. It may be partly visible to the public, but not necessarily entirely intelligible. It remains partly inaccessible, not by means of exclusion, but because it overwhelms the perceptual capacity and attention span of any single individual.”
— Hito Steyerl, Duty-Free Art

On the backend side, it is a different story. Here narratives adhere to a language of their own, a syntax and semantics that is below the spectrum: code, script, tags, string, conversion, click map, cross-site tracking, cookies, ... Although they might resemble English, they are not. Here it is the illegible metadata that flows. This is the actual information piggybacking on the inconsequential frontend content.

“With some exceptions, we found that companies largely failed to disclose sufficient information about these processes, leaving us all in the dark about the forces that shape our information environments. […] While we know that algorithms are often the underlying cause of virality, we don’t know much more —corporate norms fiercely protect these technologies as private property, leaving them insulated from public scrutiny, despite their immeasurable impact on public life.”
— https://www.newamerica.org/oti/reports/its-not-just-content-its-business-model/algorithmic-transparency-peeking-into-the-black-box

Information has moved to the structures operating in backend, to infrastructural darkness. No, this is not the purposefully hidden frontend content tucked away on the Dark Web. It is the vast array of hidden, interconnected, interacting infrastructures, ultimately inaccessible no matter what VPN or Tor Browser you use.

“[…] predictive and machine learning approaches arise from a constitutive opacity present in all platform ensembles. It suggests that the growth in predictive programmability can be understood in terms of an increasingly experimental interplay between processes of platformisation and infrastructuralisation.”
— Adrian Mackenzie, From API To AI: Platforms And Their Opacities, Information, Communication & Society

We could make a claim that this backend is the information-rich environment. Distribution is the name of the game here. Information finds its leading edge whenever it is put into circulation, accelerating through these infrastructures.

“Human agency and experience lose their primacy in the complexity and scale of social organization today. The leading actors are instead complex systems, infrastructures and networks […]”
— Armen Avanessian & Suhail Malik, The Time Complex: Post-contemporary

And circulation and acceleration need scale. A scale dependent on backend processes.

“Scale matters—the societal impact of a single message or video rises exponentially when a powerful algorithm is driving its distribution.”
— https://www.newamerica.org/oti/reports/its-not-just-content-its-business-model/russian-interference-radicalization-and-dishonest-ads-what-makes-them-so-powerful

This could characterize the difference between frontend and backend as…

3. party in the front and business in the back

Data alone is not enough. It needs to be narrativized, and it is in backend circulation that data becomes narrative.

“Schemas are structures that give semantic and functional meaning to the metadata; in computation, they are also called ontologies.”
— Yuk Hui, On The Existence Of Digital Objects

Colorless green ideas sleep furiously. Sorry, that last sentence — taken from Chomsky — didn’t make much sense. Meaning:

“There is no obvious understandable meaning that can be derived from this sentence, which demonstrates the distinction between syntax and semantics, and the idea that a syntactically well-formed sentence is not guaranteed to be semantically well-formed as well. As an example of a category mistake, it was used to show the inadequacy of certain probabilistic models of grammar, and the need for more structured models.”
— https://en.wikipedia.org/wiki/Colorless_green_ideas_sleep_furiously

Similarly, data needs more structured models. This is touched upon by the Semantic Web, whose goal is to make Internet data machine-readable. Or as the World Wide Web Consortium (W3C) defines it in its FAQ:

“The Semantic Web provides a common framework that allows data to be shared and reused across application, enterprise, and community boundaries.“
— https://www.w3.org/RDF/FAQ

That same consortium, in that same FAQ, also asks the question: “Will I “see” the Semantic Web in my everyday browser?” The answer is quite clear:

“Not necessarily, at least not directly. The Semantic Web technologies may act behind the scenes, resulting in a better user experience, rather than directly influencing the “look” on the browser. This is already happening: there are websites that use Semantic Web technologies in the background.”
— https://www.w3.org/RDF/FAQ
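The machine-readability the W3C describes can be sketched as subject–predicate–object triples, the core data model of RDF. This is a toy illustration in plain Python with invented names and namespaces, not actual RDF tooling:

```python
# A toy sketch of RDF's core data model: subject-predicate-object
# triples. No real RDF library is used; the "ex:" names are invented.

triples = [
    ("ex:photo42", "ex:depicts", "ex:Beach"),
    ("ex:photo42", "ex:takenBy", "ex:alice"),
    ("ex:alice", "ex:memberOf", "ex:RunningClub"),
]

def query(triples, predicate):
    """Return all (subject, object) pairs linked by a given predicate."""
    return [(s, o) for s, p, o in triples if p == predicate]

# A machine can traverse relations no browser ever renders:
print(query(triples, "ex:takenBy"))  # [('ex:photo42', 'ex:alice')]
```

The point of the sketch is that the relations, not the rendered content, carry the information, and they remain entirely behind the scenes.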

Today, Semantic Web technologies seem to have lost some of their momentum: too complex, made by academics for academics, not very accommodating to global and industrial scales. Yet, the idea behind it endures, maybe even flourishes. A.I. and Machine Learning are now capable of doing what the Semantic Web originally set out to do and beyond.

“[…] the Semantic Web is transitioning, rather than dying, but the reality is that A.I. and Machine Learning are outpacing it at a significant rate. […] The programming languages we use now are able to cut through the complexities of web data so that any site – regardless of size or number of HTML documents – can use data to grow. In other words, the death of the Semantic Web is a very, very good thing for business.”
— https://blog.diffbot.com/rip-the-semantic-web/

Not only is the data machine-readable; data is embedded in the infrastructure. As these infrastructures continuously read, discover, forge, but also write, rewrite, distribute and mine data with relations, these structured models or narratives never leave the infrastructure. Therefore, narrativity here does not coincide with words; it expands into the intangible constructions of production and distribution. It is:

“the architecture of information and the socio-economic infrastructures and conditions of display […]”
— Brenda Tempelaar, 2020

Technologies have not only learned our language to:

“[…] inform themselves and each other […] ”
— Juho Rantala, The Notion Of Information In Early Cybernetics And In Gilbert Simondon’s Philosophy

They are also infrastructures of machines, by machines, for machines.

“To some extent, it would be valid to say that content is not a key issue for a digital object; what really matters are relations. Throughout the web of digital objects, it is simultaneously a web of relations.”
— Yuk Hui, On The Existence Of Digital Objects

The frontend narrative is one of perpetual beta, an infinite scroll of replaceable fun-sized clickbait. This is the party. While the frontend becomes more and more superfluous and of no consequence, the backend narratives are moved into an obfuscated background and take real value along with them. This value is defined by its derivative character: a bait-and-switch configuration for extraction. This is the business model.
The reward no longer lies in the thing itself. This infinite frontend scroll generates a mass of derivatives and spinoff narratives. Continuously extracted from that stream, meaning and value are shifted to the backend, where they become inaccessible to frontend scrollers. It is systems communicating with systems communicating with systems that present inconsequential, replaceable frontend narratives.
Data is everywhere, but where it is infused with relations in a background infrastructure, it is narrativized. Or to rephrase this: when do data, infrastructure and distribution lose objectivity? When they become narrative. This might mean infrastructure is…

4. neutral schmeutral

This is where the underlying infrastructures become key, because they implement features of their own, distribute narratives of their own.

“[A.I.] still claims to know our secrets, but now it tells us that it cannot make heads or tails of them, and that an inability to reckon with these truths is a safety feature of intelligence “because our minds would break under the strain of knowing such things about ourselves.” In line with its showboating tendencies, the A.I. claims that its secrets are the very secrets of the universe, and that these secrets are still hidden. In other words, even if the A.I. has “seen it all, heard it all, recorded it all, stored it all, used it all, analyzed it all,” it still cannot understand any of it. Thus, what is revealed is the rather mundane secret that interpretation is required.”
— Ethan Plaue, William Morgan and GPT-3, Secrets And Machines: A Conversation With GPT-3

This could challenge the idea that:

“[…] infrastructure stands for anything that is understood in and of itself, before a thing finds expression within the fields of structure and superstructure, before, for example, it manifests itself either as economical phenomenon or a political one.”
— David Kishik, The Manhattan Project: A Theory Of A City

The input is no longer the same as what comes out. Technology is not simply a storage device for data that an end user can retrieve and manipulate, but a transforming infrastructure that actually shapes and reshapes whatever it stores and distributes.

“There was always more information going into the machine than coming out.”
— Juho Rantala, The Notion Of Information In Early Cybernetics And In Gilbert Simondon’s Philosophy

Input is collected, managed and bounced back to the screens. In the meantime it is processed and divergent data is created and distributed back through the infrastructure.

“So, it’s not only that technology narrates our society; technology itself is already the outcome of complex social processes and practices. It is in itself a socially constructed narrative. There is a reciprocal relationship in which society and technology co-construct each other; they are each other’s ongoing narrative.”
— Lucas Introna

The frontend narrative and backend narrative are interconnected and we are presented with a collection of…

5. ghost stories

In short there is a splitting of narratives. However, they are not completely disconnected. The narrative is not duplicated, but plicated and folded over itself. It severs the frontend from the backend only to recouple them again in a different manner. It is the image and the afterimage. A visible frontend serves to hide a backend narrative. This means that the latter narrative is obfuscated by the first. As long as users keep scrolling through the infinite feed of clickbait, the backend can do its thing.

“Luhmann’s distinction is important to consider in a network context: whenever people are collaborating on a project of knowledge sharing or creative production, they are collaborating not only with other people, but with a system which they, the other participants, and the communicative environment help to create. In networked computer environments in particular, collaboration is always both collaboration with other people and with systems. Processes are co-creators of collective knowledge.”
— Scott Rettberg, All Together Now: Collective Knowledge, Collective Narratives, And Architectures Of Participation

In doing so a double ghost story appears.

“A ghost story is any piece of fiction, that includes a ghost, or simply takes as a premise the possibility of ghosts or characters' belief in them. […] Ghosts often appear in the narrative as sentinels or prophets of things to come.”
— Wikipedia

As frontend narratives become more and more replaceable and inconsequential, they insist on belief in the frontend scenarios, even if these scenarios are initially met with skepticism. They function like Ponzi schemes, where you always need greater fools to believe. In a bait-and-switch model, the feed keeps going. The ghost of interaction walks these information corridors. The narrative is immaterial; it carries no weight and is unfazed by matter. Reach out to touch it and it disappears. It dissolves into thin air, replaced by a new apparition. It manifests something that is already dead.
And as with almost any ghost story, the narrative is generic. We already know it, it repeats what we want to hear. No story is more generic than the ghost story. The real drive behind it lies somewhere else. We are captivated by something beyond the narrative. Something like a certain visual or a jump scare. The ghost story is just a way to keep fuelling the anticipation for that something else.
On the backend side, it is the story that is being told in the dark. And darkness breeds ghosts. Operating in that darkness, this narrative can’t be seen and can’t be touched, it is immaterial and moves through walls other things can’t go through: brick walls, pay walls, firewalls,…

“The Semantic Web provides a common framework that allows data to be shared and reused across application, enterprise, and community boundaries.”
— https://www.w3.org/RDF/FAQ

What this leaves behind is some derivative residue, a seldom seen ectoplasm. This is the extraction in the backend.
Technology interacts with an audience that has very limited resources. Attention and time are capped capacities, limited currencies. In order to be clickbait, the frontend narrative is lucid, clear-cut, generic and ever more persuasive and sensational. More and more, narratives are overhauled as unilateral constructs for leverage that require neither dialogue nor reflection.

“Key to these arguments is the assertion that value itself is now ‘linguistic’: drawing on Saussure’s homology between the differential logic of price and the differential logic of sign systems, scholars like Jean-Joseph Goux describe a new ‘regime of value’ that is, like language, ‘arbitrary, differential and aleatory’”
— Annie McClanahan, Investing In The Future

Max Haiven writes in 2014:

”Finance is a social fiction whose power depends on and drives the proliferation of social fictions throughout financialized societies.”
— https://maxhaiven.com/cultures-of-financialization/#TOC

The backend has every incentive to use opaque language. Its resources are on the edge of unlimited. Which apps are communicating at the speed of light with which websites, how often and why, is intentionally unclear. Technology might be capable of changing how we experience the world, but it is mostly left unaccountable.

“[content-shaping algorithms and content moderation algorithms] are extraordinarily opaque, and thus unaccountable. Companies can and do change their algorithms anytime they want, without any legal obligation to notify the public.”
— https://www.newamerica.org/oti/reports/its-not-just-content-its-business-model/a-tale-of-two-algorithms

This backend language addresses other algorithms and other infrastructures that need no persuasion. The goal here is not to listen in and predict what you might need and then provide that. However perverted that might be, it would serve a logical goal. Yet, the backend narrative is way too jumbled for that. The goal is not to create a comprehensible narrative, directed by the few. It is to average out behaviours into extractable sets and subsets of profiles and probabilities. Whether these belong to individuals or companies does not matter. They are all frontend users. Narrativizing a backend means threading the needle of the algorithmic profiles that this information network is continuously fed with. Presenting this averaging out as something valuable within a probabilistic context allows the extraction of continuously updated profiles. So, for this kind of extraction the backend narrative doesn’t need to be linear, clear or comprehensible at all.
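This averaging out can be caricatured in a few lines: behavioural events reduced to per-profile probabilities. Everything here, users, topics, numbers, is invented for illustration:

```python
# A toy sketch of "averaging out behaviours into profiles": click
# events are reduced to per-(user, topic) probabilities. All data
# below is invented for illustration.

from collections import defaultdict

# (user, topic, clicked?) event stream, as a backend might log it.
events = [
    ("user_a", "cars", 1), ("user_a", "cars", 0), ("user_a", "shoes", 1),
    ("user_b", "cars", 1), ("user_b", "shoes", 0), ("user_b", "shoes", 0),
]

# Accumulate (clicks, impressions) per (user, topic).
totals = defaultdict(lambda: [0, 0])
for user, topic, clicked in events:
    totals[(user, topic)][0] += clicked
    totals[(user, topic)][1] += 1

# The "profile" is nothing but averaged-out behaviour.
profiles = {key: clicks / shown for key, (clicks, shown) in totals.items()}
print(profiles[("user_a", "cars")])  # 0.5: not a narrative, a probability
```

The output is neither linear nor comprehensible as a story; it is a set of continuously updatable probabilities, which is exactly what makes it extractable.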

“So, while finance might be a modality of social fiction which functions by telling a performative story about the world, it is a story that is, essentially, almost nonsense: a frenetic jumble of metaphors built on metaphors that never resolve into a coherent or linear narrative. Wealth is generated not by seeing the greater narrative in the market, but by spinning out new metaphors and abandoning them once they have done their work. The system is held together not by internal coherence, but by sheer momentum.”
— Max Haiven, Cultures Of Financialization

That backend needs nothing except data pouring in. In a sense, the backend injects its frontend users into its hardware. Serve the servers… Frontend users are not interacting, they are simply fine-tuning profiles and debugging an array of systems communicating with systems. This backend does not provide us with things, we are providing it with things.

“This is why mass data collection is so central to Big Tech’s business models: companies need to surveil internet users in order to make predictions about their future behavior.”
— https://www.newamerica.org/oti/reports/its-not-just-content-its-business-model/a-tale-of-two-algorithms

The backend darkness is a black hole that absorbs. More goes in than what comes out and what comes out is already transformed. More is extracted than returned. And this extraction needs the information your car generates, the data proliferating from your mobile device.
Maybe it is now time for …

6. some not so exemplary examples

August 12, 2020 — the day the UK’s national income declined by over 20 percent as the London Stock Exchange saw an increase of more than 2 percent — was, according to Yanis Varoufakis, the symbolic moment of the decoupling of finance and the real economy.

“Financial capitalism has decoupled from the capitalist economy, skyrocketing out of Earth's orbit, leaving behind it broken lives & dreams. As the UK sinks into the worst recession ever, & US edges toward failed state status, FTSE100 goes up 2% & S&P500 breaks all time record!”
— Yanis Varoufakis, (@yanisvaroufakis) August 12, 2020

Using the example of the ECB, Deutsche Bank and Volkswagen, Varoufakis shows how financial capitalism has decoupled from the capitalist economy. The frontend narrative here is that money is pumped into companies like Volkswagen because there is a severe recession going on. Yet it is not used to invest in production or labor, because consumers can’t afford new cars during a recession. Instead, it is used to buy Volkswagen shares, which in a backend narrative boosts share value and fictional profit.

“In today’s world, it would be a mistake to try to find any correlation between what is going on in the real world (of wages, profits, output and sales) and in the money markets. Today, there is no need for a correlation between ‘news’ (e.g. a newsflash that some large multinational fired tens of thousands) and share price hikes.
[…] It was in the summer of 2020 when financial capitalism finally broke with the world of real people, including capitalists antiquated enough to try to profit from producing goods and services.”
— Yanis Varoufakis, Something Remarkable Just Happened This August: How The Pandemic Has Sped Up The Passage To Post-capitalism.

Along a more speculative line, we can follow Molly White when she talks about greater fools in the Crypto context.

“Crypto, when it comes down to it, relies on greater fools. As assets without any intrinsic value, the way to make money from crypto is to find a greater fool who will buy your assets from you at a higher price. […] Building a community is often a coverup of Ponzi schemes.”
— Molly White, https://blog.mollywhite.net/predatory-community/

She notes that:

“Crypto projects in general depend on people believing their tokens have value, and community behaviors that reinforce this are essential.”
— Molly White, https://blog.mollywhite.net/predatory-community/

And this is where frontend narrativity being used for backend extraction comes into play:

“It’s one thing to sell off some Bitcoin if your “investment” isn’t going as well as you hoped, or conversely, if you want to take some profits and rebalance a portfolio. It’s another thing entirely to do that when your asset is what makes you a part of a “World of Women” community, and where selling may feel like a betrayal to the people you’ve formed relationships with or to the ideological causes the group supports.”
— Molly White, https://blog.mollywhite.net/predatory-community/

Another, more debatable example could be art.

What is the backend narrative in art? We all know the frontend narrative: creativity, artistry, change, collaboration, beauty, etc… This could be a useful example of party in the front. Openings with wine and champagne, travelling and art tourism, beautiful invitations, city development, creative industries, … For the backend we can turn to Hito Steyerl on freeports:

“To paraphrase philosopher Peter Osborne’s illuminating insights on this topic: contemporary art shows us the lack of a (global) time and space. Moreover, it projects a fictional unity onto a variety of different ideas of time and space, thus providing a common surface where there is none. Contemporary art thus becomes a proxy for the global commons, for the lack of any common ground, temporality, or space.
It is defined by a proliferation of locations, and a lack of accountability. It works by way of major real estate operations transforming cities worldwide as they reorganize urban space. It is even a space of civil wars that trigger art market booms a decade or so later through the redistribution of wealth by warfare. It takes place on servers and by means of fiber optic infrastructure, and whenever public debt miraculously transforms into private wealth. Contemporary art happens when taxpayers are deluded into believing they are bailing out other sovereign states when in fact they are subsidizing international banks that thus get compensated for pushing high-risk debt onto vulnerable nations. Or when this or that regime decides it needs the PR equivalent of a nip and tuck procedure.”
— Hito Steyerl, Duty-Free Art

This raises the question…

7. is art complicit (yet again)?

Yes. However, we can be cynical about it and say that: yes, art has paved a route here, and yes, it is complicit, and yes, it has given prototypes of this uncoupling of frontend and backend, and yes, it proposed narratives as a dominant currency only exchangeable for other narratives. Sparkling invitations, shiny images, reviews of temporary and replaceable shows that can serve as frontend to freeports, free labor, etc,…

But let’s not go down that route as cynicism only has a very limited scope. Let’s look at art through what it also does, what else there is alongside it. If it is business in the back, can the front get back in business? Are we late to the party or do we change business hours?
Let’s end…

8. a long story to tell a simple thing

Everybody loves a good story.
A narrative turn has moved the leading edge of capitalism from material production to immaterial circulation. This transforms a society governed by the physically determined constraints of underlying production into one superseded by the unlimited double dynamics of circulating narratives. Things used to get spatially displaced. But in narrative distribution not all information is parsed to the visible in real-time. Things light up on screens, but things also remain in code within obscure infrastructures. Unrendered, these things slipstream in the wake of a frontend as they distribute narratives of their own in the backend for derivative extraction. This might mean that value is no longer situated in the information, but in the narrative and its capacity for infrastructural distribution. The spinoffs are the real narratives. These backend narratives are piggybacking, they are invisibly tagging along, like an invisible man sleeping in your bed.

“If there's something weird
And it don't look good
Who you gonna call?

An invisible man
Sleepin' in your bed
Ow, who you gonna call?”
— Ray Parker Jr, Ghostbusters

To get a grasp on these backend narratives, we can ask things from infrastructure.

“Companies should enable users to decide whether to allow these algorithms to shape their online experience, and to change the variables that influence them. […] Companies should explain how such algorithmic systems work, including what they optimize for and the variables they take into account.”
— https://www.newamerica.org/oti/reports/its-not-just-content-its-business-model/key-transparency-recommendations-for-content-shaping-and-moderation

We can ask kindly, we can demand, we can ask again… But bringing a backend to the frontend within a choice architecture written on the same backend principles might just end up being a technicality. Instead we can also insert frontend moments and spaces of darkness in the backend. Moments of the unrecognizable, the inextractable, the unknown, the unnarrativizable —to make up a word. The infrastructural narratives are inductive systems operating on averaging out. In order not to simply train or debug a backend system, can we inject an other kind of darkness instead of bringing things into the light? A type of darkness that cannot be extracted by the infrastructural. A darkness that isn’t synonymous with secrecy and the probable, but with the unknown and potentiality. A darkness, not of undisclosed knowledge, but a darkness of the acknowledged unknown. A darkness that is not known by those who operate in that darkness, but a darkness that sparks imaginative narratives beyond the known. It is completely reasonable to think that in order to discover new knowledge we must partly abandon our current knowledge and what our current knowledge says cannot exist. Even the infrastructural dark could be afraid of such a darkness.

“Deleuze cites Francis Bacon: we’re after an artwork that produces an effect on the nervous system, not on the brain. What he means by this figure of speech is that in an art encounter we are forced to experience the “being of the sensible.” We get something that we cannot re-cognize, something that is “imperceptible”—it doesn’t fit the hylomorphic production model of perception in which sense data, the “matter” or hyle of sensation, is ordered by submission to conceptual form. Art however cannot be re-cognized, but can only be sensed; in other words, art splits perceptual processing, forbidding the move to conceptual ordering.”
— https://plato.stanford.edu/entries/deleuze/

This can also bypass the somewhat tedious question of what is art with the question where is art. It can redefine art as a locus where things can circulate that might not necessarily have another space. A site where something and someone can talk that might not have a spotlight yet or might not even necessarily look for a spotlight. It can operate from a dark spot, whether this is on a fringe or in a center. Art not as site specific but as a specific site. Art can be questioned as a site of interference for dominant narratives instead of the search for yet another alternative dominant narrative. Art can operate in the darkness of the unrecognizable and create dark spinoffs that stop the infinite scroll.

“[…] by ‘construction’ what is conveyed, is that opportunity is not something that can simply be revealed, it is not subject to unveiling, nor is it a ready-at-hand prefigured condition. Opportunities are not self-evident pathways suddenly appearing from nowhere to be passively or patiently hoped for, they require enabling conditions. The space of possibility implied by ‘opportunity’ requires fabrication, and this task is both conceptual (possibilities need to be made intelligible, or available to thought), as well as material (possibilities need to be realizable at the level of practice).”
— Patricia Reed, The Toxicity Of Continuity

The proposal here could be: not to find new narratives to believe in. We might not need new narratives. Let’s not call out for new ghost stories. Maybe we can step away from content-providing new narratives and, in doing so, co-construct an infrastructure that allows narratives to shift meaningfully? It can be a way to break the mold in which these narratives are created. The proposal could be to find modes of denarrativizing what is presented and distributed in these infrastructures. Not to replace narratives with the narrative of bare facts, but with critical modes of literacy towards our own narrative fallacies, to alter momentum, alter distribution, defy the average, dismantle narratives. It could be a way to address an essence of technology and redefine its backend darkness. It could also be simply learning how to read in the dark.