Spirits of the Dark Forest
“Retreating” into the “Dark Forest”? This popular narrative feels like giving up. This commentary piece challenges that defeatist take and the simplistic public/private model. Instead of hiding from platform decay, let’s explore tech-enabled, selectively permeable spaces — better ways to control sharing and foster quality connections, not just concede ground.
Okay, full disclosure: I wrote these thoughts down in April 2024 after listening to a Dark Forest Collective roundtable, feeling a bit triggered but never actually sent or published them anywhere. Consider this a slightly delayed, slightly self-indulgent message finally emerging from my own cozy bubble a year later… oops.
Core Thesis
Instead of retreating into isolated private spaces or accepting the degradation of the public web, we should focus on developing technology that enables selectively permeable communication. This requires moving beyond binary public/private models to create systems — potentially like person-centric recommendation algorithms — that give users granular control over sharing, optimizing for quality of engagement and serendipitous connection rather than platform metrics or static group boundaries.
Notes from the shadows
Some random remarks from someone lurking in the shadows:
I have to admit that, at first, I was a bit confused by the way Yancey used the Dark Forest metaphor for something that sounded rather cozy to me. When I read The Three-Body Problem, the metaphor conjured rather uneasy feelings. Then, listening in on the Dark Forest Collective roundtable, I realized I was still strangely taken aback by the sense of defeat that the notion of “retreating” to private spaces carries for me, and by the apparent acceptance in the way most people talked about it. It lingered after the call and something didn’t sit right with me, so I sat down and tried to sort it out for myself.
Condemnation Disclaimer first… Yes, I despise the slot machines that social media platforms have become. I hate how they’ve decayed my attention span.
Yet, when I find myself doomscrolling, my addicted brain can’t help but signal love for that very same thing. We’ve all come to accept the mute scream of cognitive dissonance coming from the subliminal. It’s almost too basic to even talk about it anymore.
So yes, from that perspective — I get it. I understand the desire to retreat, the frustration, the romantic, Luddite appeal of disconnecting from technology. But I also fucking hate myself for understanding it because it’s so wrong. Not because I don’t believe in the absolute necessity for private virtual spaces or because I don’t understand the appeal of romantic offline activities like writing postcards to friends. Rather, it’s because approaching the problems we face with this mindset is entirely backwards. It’s narcissistic nostalgia. It feels like conceding yet another battle.
To extrapolate the Dark Forest analogy a bit further:
Imagine a world where the platform bots ultimately prevail: the clear web continues its monetized descent into polarization, fragmentation, and pollution. No one feels compelled to “put themselves out there” anymore. Public opinion loses its “public” because only algorithmically generated and optimized opinion is left. The clear web as we know it becomes the de facto Dark Forest. All content is weaponized but impersonal, because nobody is left to participate. A perpetual shit engine where everyone has given up.
So we retreat to our gated communities and defend them as best we can. Tribalization maxxing. We protocolize and moderate and gatekeep so effectively that we end up in local minima, with opinion bearers below the Dunbar number and communities of questionable diversity. Communities continue to fracture into ever-smaller and more tightly filtered bubbles. Consensus reality collapses because no consensus is needed anymore. It’s peaceful, cozy, frictionless, devoid of any extrapersonal meaning, utterly boring, and detached from the complexity and absurdity of the real world.
I also get this. It’s a likely trajectory, and the socially anxious part of me even finds it appealing. But then again, the scarcity of alternatives to retreating makes me uneasy.
I think a big obstacle lies in the construct of a public/private binary, as also identified in the Moving Castles essay.
When it comes to platform capitalism and its ownership of recommendation algorithms, the current sentiment (and the one I read in most of the essays gathered in the book) is one of defeat, leading to evasion and abandonment. Yes, we try to game the algorithms to work in our favor, circumvent them, use camouflage or obfuscation to hide from them. But there haven’t been many efforts to gain far-reaching personal agency over “the algorithms™”.
With regard to information retrieval and consumption (a.k.a. controlling write access to our brains), efforts are being made to protect our eyeballs from the accelerating AI-driven shittification of content. It’s the most immediate pain we feel: the burning, red-eyed sensation of information overload and the decision paralysis that follows. So, quite naturally, the development of curatorial systems that help us filter (in any mixed modality of personal, collective, and AI-assisted) is what gets tackled first.
But the area that desperately needs improvement is the other direction of communication: establishing granular control over how, and with whom, our personal thoughts are shared. Agency over dissemination. Elective muteness instead of infinite reach.
We strive for connections with “like-minded” and “friendly” people.
Currently we determine the potential like-mindedness of another through a combination of metrics: publicly surfaceable artifacts of the online persona as well as social reputation and recommendation. But in an ultimate Dark Forest scenario where no one dares or cares to express themselves publicly any longer, there is no easily accessible provenance of discourse, no trail of public thought to determine like-mindedness. So we would inevitably have to fall back to social mechanisms for gatekeeping our little closed communities.
But hey, sometimes the lurkers come to have a voice if you trigger them just right: I just made the decision to leave my cozy bubble and connect with yours. I find it incredibly desirable to get specific input from someone I don’t know or have any connection to, if I can trust that the input is worthwhile. And incredibly sad to completely give up on this serendipitous opportunity for exchange by locking myself into statically encapsulated private spaces.
So one way forward is to focus more efforts on developing systems that enable us to create fluid personal/public boundaries that control “read access to our brains”. Perhaps I want to share a thought with a certain “type” of person because I believe their commentary will be valuable, but I don’t want to carelessly flag it as public and have the bots and haters ruin my day. Nor do I have any means to classify what exactly that desired “type” is without resorting to restrictive (and exclusive) labeling. At the same time, I don’t want to retreat to the echo chambers of my personal Cozy Webs.
What I want is quasi-private virtual communication that is selectively permeable. Channels that unfold and envelop certain people in one moment and then fold back into a tight-knit sphere for the next message. Channels that are high-dimensional and work without statically differentiating between in- and out-group. In other words: recommendation algorithms that are optimized not for the highest engagement for the sake of the platform, but for the highest quality of engagement for our own sake.
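To make the idea concrete, here is a minimal, entirely hypothetical sketch of such a channel: the audience is recomputed per message from a similarity score between the message and each person’s interest profile, so the sphere widens or tightens on its own. The `PermeableChannel` class, the toy vectors, and the threshold are all my illustrative assumptions, not an existing system.

```python
from dataclasses import dataclass

def cosine(a, b):
    # Similarity between two interest vectors; 0.0 if either is empty.
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

@dataclass
class PermeableChannel:
    profiles: dict          # person -> interest vector (assumed given by some model)
    threshold: float = 0.7  # permeability: higher = tighter sphere

    def audience_for(self, message_vec):
        # The channel "unfolds" around whoever resonates with THIS message
        # and folds back for the next one: no static in-/out-group.
        return {person for person, vec in self.profiles.items()
                if cosine(message_vec, vec) >= self.threshold}

channel = PermeableChannel({
    "alice": [0.9, 0.1, 0.0],   # mostly topic A
    "bob":   [0.0, 0.2, 0.9],   # mostly topic C
})
print(channel.audience_for([1.0, 0.0, 0.0]))  # a thought about topic A → {'alice'}
```

The point of the sketch is the shape of the mechanism, not the math: any high-dimensional profile and any quality-oriented score could stand in for the toy cosine similarity.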
In order to build this, I wholeheartedly believe that we NEED technology. I’m not a blind techno-optimist; I just don’t see another way. I disagree with the notion that “non-optimized, non-indexed” environments are a way forward. They are a step back and could eventually cripple us. I don’t want to manually adjust my “close friends” setting every time I post something, or have to spread a post across a few different Discord servers that align with my personal need for coziness on a given day. It’s just so much work, with too much residual uncertainty. So yeah, personally owned and controlled recommendation systems for privacy, designed with care and the right motives, could open up a lot of opportunities to fix what went wrong with the web. I believe working towards these kinds of person-centric, consent-driven solutions is the only way we can have an online public commons that doesn’t devolve. It’s just too late to go back to postcards.
I prefer little forest spirits — systems that compare other people’s information consumption and sharing patterns against my own to make informed decisions about whether to pass my thoughts along to that person or hide them in the darkest shadows. The alternative is to keep ceding ground until we’re all isolated in our dark forests, stewing in our own comfortable (dis)content.
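A little forest spirit of this kind could be as simple as an overlap check between observed patterns: it compares a stranger’s consumption and sharing traces against mine and decides to pass the thought along or keep it hidden. The pattern sets, the Jaccard overlap measure, and the cutoff below are illustrative assumptions only.

```python
def jaccard(a, b):
    # Overlap between two pattern sets (e.g. topics read, shared, boosted).
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def forest_spirit(my_patterns, their_patterns, min_overlap=0.3):
    # Pass the thought along, or hide it in the darkest shadows.
    return "share" if jaccard(my_patterns, their_patterns) >= min_overlap else "hide"

me = {"protocols", "recsys", "cozyweb", "sci-fi"}
print(forest_spirit(me, {"recsys", "cozyweb", "gardening"}))        # → share
print(forest_spirit(me, {"engagement-farming", "dropshipping"}))    # → hide
```

Crucially, the spirit acts on my behalf and under my control: the decision function, the patterns it may look at, and the cutoff would all be mine to set, rather than a platform’s.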