As an overweight 50-year-old man, when I decided to take up running late last year, I had to take steps to prepare. I bought a new set of well-cushioned trainers. I saw my doctor, vaguely hoping that she would warn me off my folly. (Alas, she was enthusiastic and supportive.) And, critically, I joined an online community, the Slow AF Run Club.
Slow AF is the brainchild of Martinus Evans, a marathon runner who began the sport out of spite. A doctor told Martinus that he had to reduce his bodyweight—then 163kg—or he would die. When Martinus said that he planned to take up running, his doctor laughed and told him that he would die if he tried. Martinus ignored the dismissive advice and completed a marathon about a year and a half later.
Eight marathons later, Martinus is a professional running coach, a podcaster (300 Pounds and Running) and the founder of an online community for people who run really slowly. And he’s an exemplar of one of the most interesting—and complicated—trends in social media: the rise of the small room.
Over the past two decades, social media has become a powerful and much discussed force. Networks like Facebook and LinkedIn keep us connected to school friends and former colleagues, and to their sometimes problematic political and scientific beliefs. Instagram and X give us glimpses into the lives of people that we admire, as well as the lives of those that we loathe. What these disparate platforms share is the need to serve audiences in the hundreds of millions or billions, in order to support a revenue model in which services are free and supported by advertising. They are, necessarily, big rooms that welcome everyone and offer few indicators of what the room should be used for.
Dissatisfaction with these large rooms is mounting. Legislators in both the US and Europe worry that platforms are too permissive of speech that harms young people, particularly young women. Studies link a rise in teen anxiety and depression to content that promotes an unhealthy body image, and to the phenomenon of social comparison, the endless contrast of our mundane existence with the sun-dappled highlights of our friends’ carefully curated Instagram images. And electoral bodies worry that the more than 80 elections taking place around the globe in 2024 will be influenced by mis- and disinformation disseminated across social media spaces with broad reach.
By contrast, small rooms like Slow AF offer a compelling alternative. Rather than opening their doors to everyone and seeking an audience of billions, small rooms fulfil a specific need for a specific group. There’s no need to join Slow AF if you’re a competitive runner—you’ll find hundreds of communities better suited to your needs. But serving a particular population means Martinus and colleagues can build a set of rules, norms and tools that suit their community. You cannot simply join Slow AF; you apply, writing a series of short essays to explain your goals as a runner and the challenges you face. You agree to a code of conduct that, among other things, prohibits discussion of bodyweight or weight loss. And you quickly acclimatise to a set of social norms in which people offer affirmation and encouragement to people they’ve never met before.
Very different conversations unfold in large versus small rooms. If I shared on X the news that I had completed my first uninterrupted one-mile run in 13 minutes, I would be likely to receive a mix of congratulations, scorn from the six-minute mile crowd, well-meaning warnings about the damage that I am doing to my knees as a large man running and screeds about the “dangers” of fat acceptance. On Slow AF, I would be more likely to receive encouragement and sincere congratulations on my milestone.
Of course, not all the information in these small rooms is trustworthy. Suspect medical advice, like the idea of using a drug called ivermectin to combat Covid-19, spread in small rooms before reaching the mainstream. Importantly, there is not always a clean line between big and small rooms; Facebook, as a whole, works as a big room, but it contains many, many small rooms, in the form of groups and chats where self-selecting communities discuss the issues that animate them most.
It’s hard to know how to govern small rooms such as a Facebook group focused on “holistic health”. Are these public forums, where it makes sense to flag and challenge misinformation, or private spaces where the likeminded should be able to gather in peace? While it’s easy to criticise platforms for allowing such misinformation to proliferate—as Joe Biden memorably did when he announced that Facebook and other companies were “killing people”—what if the case is not as clear as the problem of Covid misinformation, but concerns disputed topics like healthcare for trans youth? Should platforms weigh in with fact-checks on the potential dangers of hormone therapy or gender-affirming surgery in spaces where trans people are seeking support?
In 2019, Facebook signalled that it wanted to get out of adjudicating on such complex issues by promoting private spaces based on the design of WhatsApp, the company’s encrypted messaging system. By promising users that their messages were private, readable only by their recipients, the company countered public pressure to remove misinformation from these small rooms. It is technically impossible for Facebook to read and moderate private messages on WhatsApp or Facebook Messenger unless one of the recipients reports a message as abusive.
Thanks in part to that privacy and lack of moderation, WhatsApp functions today as one of the most popular platforms for friends and family to stay in touch. But it is also a dangerous vector for political and health misinformation. Researchers have identified WhatsApp as a popular channel for political propaganda spread by Bolsonaro supporters in Brazil. Because these spaces are encrypted, researchers and journalists gain visibility only when people in these small rooms leak information—the dissident Lula fan in a family channel of Bolsonaro supporters, for example.
A Brazilian factchecking organisation, Aos Fatos, has made electoral disinformation in small social spaces, like WhatsApp groups and Telegram channels, its key battleground. Its weapon of choice is Fátima, a chatbot designed to debunk disinformation conversationally. It combines a human-vetted database with GPT-4’s generative artificial intelligence to offer conversational-sounding factual responses that users can post in a family WhatsApp thread when they believe disinformation is spreading.
The understandable fear about small online spaces is that they may allow disinformation and extremism to grow in the dark, without the oversight of the big tech companies. We know that some small networks permitted extremists to organise and plan the 6 January 2021 riots in Washington DC. Efforts like Aos Fatos show that there may be ways to mitigate the harmful effects of small rooms online while embracing the positive: a social network that can operate with different rules for different communities with different needs.