
Tech has an innate problem with bullshitters. But we don’t need to let them win

Chasing network effects, tech entrepreneurs have even more incentive to talk up their product than their counterparts in other industries. How do we tell fact from fiction?
January 25, 2023

“One of the most salient features of our culture is that there is so much bullshit.”

Thus begins a brilliant essay by philosopher Harry G Frankfurt, who turned his copious intellect towards the topic of bullshit in 1986, and later published the ideas as a short book, On Bullshit. For Frankfurt, bullshit is more complex than falsehood. The liar knows the things he is saying are untrue, while the bullshitter, bluffing persuasively, “does not care whether the things he says describe reality correctly. He just picks them out, or makes them up, to suit his purpose.”

Nathan J Robinson, writing in Current Affairs, postulates that we live in the “Age of The Bullshitter”, noting the power of figures like Donald Trump, Elon Musk and former crypto billionaire and accused fraudster Sam Bankman-Fried. That many contemporary examples of bullshit come from the world of tech is not incidental. There are two aspects of technoculture that make it a fertile field for the bullshitter.

The first is the myth of the solitary genius. Since Thomas Edison, or perhaps even James Watt, it has been fashionable for technology companies to be led by one. You can tell he’s a genius (it’s almost always a man: women like Theranos’s Elizabeth Holmes who play this part often carefully emulate male role models) because he cares little for convention (wearing the same hooded sweatshirt to every meeting), eschews traditional education, and speaks confidently about an impossible range of topics. Bill Gates was the paradigmatic genius for the computer revolution, and his pivot into global health only helped to reinforce the myth of technological omnicompetence. But Musk, whose purported expertise spans electric cars, tunnel boring, space exploration and social media platforms, is the best contemporary example of the genius bullshitter.

Frankfurt anticipates this: “Bullshit is unavoidable whenever circumstances require someone to talk without knowing what he is talking about.” He sees the implications for speech by politicians, who are routinely asked to opine on subjects they’ve never been briefed on and then produce sentences designed to be persuasive but devoid of substance. It’s natural for the listener to have a distaste for this.

Unfortunately, it’s much harder for us to call bullshit in the realm of technology, where an intelligent statement about machine learning might be indistinguishable to most audiences from marketing drivel. Musk was sufficiently persuasive about self-driving cars—despite Tesla’s setbacks in the field—to make a comparatively small carmaker worth more than the entire legacy auto industry. As it’s become clear that Musk is bluffing his way through his management of Twitter, the markets are sensing that his real genius may be in the production of drivel, and Tesla’s share price is being punished.

Crypto promoters are attempting to talk a new reality into being

The second affinity between bullshit and the tech industry is structural: some technologies are bullshit until they are not. Consider the Metaverse proposed by Mark Zuckerberg, an immersive virtual reality space intended to replace the flat, screen-based internet. So long as the Metaverse serves only a few million gamers, it’s not all that interesting. Should it reach hundreds of millions of users, it would become critically important.

Many technical systems experience network effects: they become markedly more useful when embraced by millions or billions of users. Email, used by a few thousand academics and scientists, was an interesting curiosity; used as the default mode of communication for business, it became indispensable. Little changed technically about email between 1980 and 2000 other than its scale, utility and importance—which is to say, everything.
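One rough way to put a number on this effect is Metcalfe's law, which values a network in proportion to the number of possible connections between its users rather than the number of users themselves. The short sketch below is purely illustrative (the user counts are invented for the sake of the example), but it shows why a technology can feel inconsequential at one scale and indispensable at another.

```python
# Illustrative sketch of network effects via Metcalfe's law:
# a network's rough "value" scales with the number of possible
# pairwise connections, n * (n - 1) / 2, not with n itself.

def possible_connections(users: int) -> int:
    """Number of distinct pairs of users who could reach each other."""
    return users * (users - 1) // 2

# Hypothetical user counts: email as an academic curiosity vs. email
# as the default mode of business communication.
for users in (2_000, 200_000_000):
    print(f"{users:,} users -> {possible_connections(users):,} possible connections")
```

Because the number of possible connections grows roughly with the square of the number of users, getting a technology past that threshold is precisely what its promoters are working so hard to do.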

Unfortunately, the need for a given technology to achieve network effects means its promoters have to be extremely persuasive. Cryptocurrency is a prime example. The value of any digital token is what markets will pay for it. When crypto promoters talk about Bitcoin as an answer to the shortcomings of currencies managed by central banks, they are not “making things up, to suit their purpose”, as Frankfurt writes, so much as attempting to talk a new reality into being. “Fake it till you make it” is taken literally in some corners of the tech industry.

But cryptocurrency reveals the complexities of a bullshit-rich environment for the consumer. Within the crypto space are true believers who think the banking system would be better if replaced with computer code. There are opportunists, who see money pouring in and want their piece of the action—consider the celebrities from Cristiano Ronaldo to Matt Damon who have endorsed crypto projects. And there are conmen, who see the interest in crypto as a chance to defraud investors with Ponzi schemes updated to take advantage of the new technological landscape. The well-meaning, fake-it-till-you-make-it bullshit of the true believers provides this group with plausible cover.

This year marks a fascinating moment in the history of bullshit. In late 2022, machine learning start-up OpenAI released a tool called ChatGPT. Trained on 300bn words on the web, ChatGPT can produce text that is difficult to distinguish from text written by humans. This leads to some cool tricks: you can ask the model to explain Liz Truss’s political career in the form of a Shakespearean sonnet. This facility with form and content makes ChatGPT an extremely efficient bullshitter, capable of producing persuasive but meaningless text.

In addition to freaking out columnists—who assumed that we had a monopoly on speaking confidently about subjects we don’t fully understand—ChatGPT is a nightmare for teachers, who must deal with a new tool for academic dishonesty, now that students can use the platform to generate plausible-looking essays despite having no understanding of a subject. ChatGPT is sufficiently good at copying writing styles that its academic essays come complete with footnotes. Unfortunately, many of the works documented in these footnotes have been “imagined” by the system, a phenomenon the industry calls “hallucination”. In other words, ChatGPT knows that academic articles tend to be filled with citations and simply imagines plausible references that could support its own work. (A prediction for 2023: someone clever will ask ChatGPT to start writing the academic papers it’s been citing, and begin submitting them to journals and conferences.)

Perhaps the technology’s neatest trick is that it can generate computer code based on the millions of examples on sites like Stack Overflow, where programmers share tips, tricks and example code. After the tool’s release, people began putting their coding questions to ChatGPT and posting the responses on Stack Overflow itself. In late 2022, the site placed a temporary ban on answers generated by ChatGPT, explaining that while the responses “typically look like they might be good and the answers are very easy to produce”, they actually “have a high rate of being incorrect”.

Beyond the bullshit produced by ChatGPT, there are exciting possibilities. Once such systems can consistently generate running computer code, we may see the rise of so-called “no code” tools, where an AI writes functional computer code from a simple text prompt, radically changing how software is created. Writing a sonnet about Liz Truss is a cool trick; coding a shopping cart system for my online store is an industry-changing development. AI may turn out to be better at producing code than at writing academic papers, since determining whether a program runs without errors is far less subjective than judging whether an essay is good.
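To make the “no code” idea concrete, here is a minimal sketch, assuming the OpenAI completions API as it existed around the time of writing (the pre-1.0 openai Python library), of how a text prompt might be turned into candidate code. The model name, the prompt and the add_to_cart example are illustrative assumptions rather than a description of any particular product, and the generated code would still need to be run, tested and reviewed by a human.

```python
# A minimal sketch of prompt-to-code generation, assuming the OpenAI
# completions API from the pre-1.0 "openai" Python library (early 2023).
# The model name and prompt are illustrative; real "no code" tooling
# would add validation, automated tests and human review of the output.
import os

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

prompt = (
    "Write a Python function add_to_cart(cart, item, quantity) that adds "
    "an item to a shopping-cart dictionary and returns the updated cart."
)

response = openai.Completion.create(
    model="text-davinci-003",
    prompt=prompt,
    max_tokens=300,
    temperature=0,  # keep the output as deterministic as possible for code
)

generated_code = response.choices[0].text
print(generated_code)  # still needs to be executed and checked before use
```

Checking that such output runs without errors is the easy, mechanical part; whether the function actually does what the shop owner wanted remains a human judgment.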

If we are in the age of the bullshitter, how do we evaluate the potential of new technologies? In 2023, I am less interested in technologists who position themselves as “disrupting” fields through the power of new ideas, and more interested in communities that have engaged with an issue over the long term. The surest cure for bullshit is exposing it to the scrutiny of people who truly, deeply understand a problem through years of lived experience. Perhaps we can ask the solitary geniuses to step back and make some room.