This was originally posted on my Cohost account; I've reposted it here for posterity.
LARPing a Gossip Protocol about Reality
I've edited and re-edited this and I don't feel super comfortable sharing it but I need it out of my head sooner rather than later. I need that brain space back. And besides, I want to move on to post things that aren't about other social network platforms on this social network platform. So. Here goes.
Twitter changed how I write, for certain. Twitter also changed how I think.
I spent more than a decade squeezing every stray thought into Twitter. Every evening, morning, and spare moment scrolling to see what folks were talking about. Minutes in the shower thinking about my position on the topic of the day, then how best to cram that into 140, then 280 characters. "Hold on a moment," I'd say to myself mid-anything-at-all-really, "Has anyone made the 'Node dot michael jackson' joke yet?" 1
For a decade we talked to each other in tabloid headlines. It was a place to make dumb jokes, then watch likes pour in. To make even dumber jokes, and then watch nothing at all happen. To collectively gawk at the news of the day. To stay up late at night in fear of a war with North Korea. Twitter was the place I went to feel social without having to actually be social.
Twitter made me realize that hundreds if not thousands of people would, given the same relative prompts, produce the same dumb joke. Myself included.
You know what was new to me in the early-to-mid 00's?
The idea that I could quantify the approval of my peers. Used to be, you'd write something, put it out on the internet, and you might never even know if anyone cared about what you made. Sure, you could stare at Google Analytics, but all that told you was that "they" saw it. And who are "they", anyway? I had no way to know. You might get a comment or two on a link aggregator site, but largely... nothing.
I graduated into the 2008 recession. At that same time, I lost my first programming job. I was unemployed for the first time in my life (for what would be, to date, the longest such stretch of my life.)
I was scared. I watched the savings I had put together to move to Portland dwindle perilously close to empty. This stretch of unemployment was a formative experience (even with the number of safety nets available to me!)
The ability to quantify approval came to serve a completely new purpose for me at this point, one apart from feeding an annoyingly deep-seated personal need for external validation. I began thinking that "the more people know and approve of my work online, the less likely I am to be unemployed in the future." This wasn't a unique thought, really; lots of other people who had similar context came to the same conclusion. But Twitter hadn't risen to popularity yet, so how was I to know?
I resolved to try to do a lot more of my work in public, and in particular, to make things for platforms where I could measure the return on my effort. The former was a conscious decision; the latter, less so.
At the time, it generally seemed like the things I wanted to do anyway were the sorts of things that were getting approval online. I wasn't concerned with the thought that I would end up tailoring my creative output to fit any particular platform. The internet was still pretty weird in the mid-to-late 00's! (It continues to be weird now, but, like, in the sense that you might see Arby's flirt with Steak-umm.)
When I joined Twitter in 2009-ish, it was the exact opposite of compelling: it felt like being able to group text people I could already group text. My digital social network perfectly overlapped my real, physical social network. It felt forced; a parody of a social network, good only for dodging phone carrier SMS fees.
More to the point, it felt like a fad. At one point, my boss at the two (2!) person local startup I had mercifully landed at introduced a twitter-alike microblog system to work -- so we could micro-post our current working status. Our desks literally faced each other in an office the size of a closet. Such was the power of the fad at the time: we dutifully microblogged status updates we could have literally just said out loud to each other, face to face. We didn't just work on the cutting edge -- we lived on the cutting edge. And it hurt.
We eventually ditched that microblog. There wasn't enough of a social network on that social network. And so it was with Twitter at the time.
The inflection point came shortly after I moved to Portland in 2012. I still remember: buoyed by a digital social network I had grown around Node.js via IRC, I started to weigh in on the then-nascent ECMAScript modules debate on Twitter. Not only that, I was weighing in on it from karaoke. Over a beer. And I was getting responses from the guy who designed JavaScript 2. And another guy, whose code I idolized. And he liked my tweets.
Reader. Let me tell you. Through a combination of this ill-conceived platform and the mundane ignorance of the privilege I benefit from as a straight white cis guy, I had gained entry to a conversation I had absolutely not paid the table stakes to participate in.
I was hooked.
Twitter was a free copy editor: the constraint of 140 characters encouraged pithiness, impact, and comedic pacing. I stop short of saying it made me a better, more thoughtful writer, but the medium certainly shaped the messages I wrote.
This helpful editing advice wasn't free, to be clear: Twitter got something out of it. The constraints made it easier for people to create content; they also put an upper limit on wordiness, which made it easier to feel okay with posting something short. The medium itself helped create an environment where content flowed infinitely; it helped us build our own Skinner box. More tweets, more news, more jokes, more opportunities for validation.
I (and others) often complained about conversations on Twitter losing "nuance." Why were we all so willing to sacrifice nuance so readily? Well, nuance doesn't count for much if you don't feel like you can be heard in the first place, so.
To be honest, it felt good to tweet through it, sometimes. It was cathartic. But it also frequently felt less like interacting with other people and more like feeding a machine: a machine hungry for attention, eyeballs, tweets, retweets, stories, feelings, hot takes.
I often found myself editing my thoughts down to tweet size so I could convey them on Twitter. The medium became the measure of my thoughts. I'd find myself thinking something insipid about the tactical branding of men's soap while shopping at Target (now with Mil-Spec (TM) paracord soap holder) and think, "I'd like to tweet about that later."
When a measure becomes a target, it ceases to be a good measure.
I don't actually want to tweet about that later. I don't think I want to tweet anymore, at all. I'd like some of my brain back, please.
Four things.
One: Twitter's best feature was its biggest liability. Anyone can talk to anyone at any time about anything. Anything you say has the potential to reach audiences both friendly and unfriendly at any point over time. You are close to everyone at the same time that you are speaking to no one in particular. You get to choose the sound bite, but you can never give it context. You are LARPing a gossip protocol about reality. (If that reads as jargon, there's a toy sketch after item four.) Most of the time this is fine. Some of the time this is fun.
Two: Some of the time, this is dangerous. There are people who know this and want to exploit split-brain situations on purpose. Or create them. Who looked at a thousand people making the same joke and thought "okay, how do we control the punchline?"
Three: worryingly, there's some evidence that nonlocal interactions, rather than echo chambers, increase polarization and violent partisanship. Being able to quantify validation doesn't just apply to artists and writers, it applies to people who crave violence against "the other". It gives them a mechanism to receive encouragement and a built-in, captive audience of targets to lash out at.
Four: ad-funded social network platforms optimize for engagement. Content that prompts anger draws shares and eyeballs faster than anything else. Thus, platforms are incentivized to optimize for as little moderation as possible while still being able to sell ads. Meanwhile, trolls optimize for creating as much anger as they can while dodging the (scant) moderation in place.
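For the uninitiated, here's a minimal sketch of what I mean by "gossip protocol" and "split brain" -- in TypeScript, with every name in it invented for illustration, not taken from any real library. Nodes relay whatever sound bites they've heard to random peers; partition the network, and each side converges on its own version of events:

```ts
// A toy gossip protocol, in the distributed-systems sense of the term.
// GossipNode, Rumor, and "story-1" are all made up for this sketch.

type Rumor = { id: string; soundBite: string }; // context never travels with it

class GossipNode {
  heard = new Map<string, Rumor>();
  peers: GossipNode[] = [];
  constructor(public name: string) {}

  post(rumor: Rumor) {
    this.heard.set(rumor.id, rumor);
  }

  // One round of gossip: relay everything we've heard to one random peer.
  // Our version overwrites theirs -- nobody checks with the original author.
  gossipOnce() {
    if (this.peers.length === 0) return;
    const peer = this.peers[Math.floor(Math.random() * this.peers.length)];
    for (const rumor of this.heard.values()) {
      peer.heard.set(rumor.id, rumor);
    }
  }
}

// Six nodes, partitioned down the middle: each half only gossips among itself.
const nodes = ["a", "b", "c", "d", "e", "f"].map((n) => new GossipNode(n));
const left = nodes.slice(0, 3);
const right = nodes.slice(3);
for (const n of left) n.peers = left.filter((p) => p !== n);
for (const n of right) n.peers = right.filter((p) => p !== n);

// Each half hears a different version of the same story...
left[0].post({ id: "story-1", soundBite: "it was a joke" });
right[0].post({ id: "story-1", soundBite: "they meant it" });

// ...and after a few rounds, each half has (almost certainly) converged.
for (let round = 0; round < 10; round++) {
  for (const n of nodes) n.gossipOnce();
}
console.log(left.map((n) => n.heard.get("story-1")?.soundBite));
// almost certainly: [ 'it was a joke', 'it was a joke', 'it was a joke' ]
console.log(right.map((n) => n.heard.get("story-1")?.soundBite));
// almost certainly: [ 'they meant it', 'they meant it', 'they meant it' ]
```

Neither half is lying, exactly. Each one just converged on the only version of the story it ever heard. That's the split brain.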
Twitter doesn't bear the nickname "hellsite" for nothing. Over time, the format and the incentives of the platform make it a factory for fascism.
And if you couldn't help but insert "nothing wrong with me" after reading each number in the prior paragraphs, a la Drowning Pool's "Bodies": condolences, we think alike.
And yet. Something's got to give, now.
Twitter was a digital third place. I am sad at the prospect of seeing it fade into the digital entropy that consumed IRC, Digg, Xanga, Livejournal, Starsiege: Tribes, and Tripod before it.
At its height, I could go to Twitter to ask questions, look for jobs 3, goof around, expand my worldview, find out about trends in my profession, find interesting artists, find new indie games, and keep up to date with world news (through perspectives completely different than mine!) And, yanno, read dril tweets.
But it really wasn't healthy, at least not for me. And I don't know if I want to rush to replace it with something similar.
Thanks to Jeff Lembeck and Krysten for reviewing this.
1. because ES modules were getting node support and they were going to change the file extension for those modules to ".mjs", get it, and oh my god what have I done with the last ten years of my free time
2. who would later be revealed to be a bit of a turd
3. isn't it strange how much better Twitter was at this than LinkedIn?