Every parent has noticed that their child’s entertainment, play and learning have become increasingly digitalised. Some may not realise that their environment has become increasingly commercialised too. But children are being exploited for commercial gain in a way not seen since the 19th century, argues social research agency Revealing Reality. And government is doing little about it.
Ninety-six per cent of 13- to 17-year-olds and 55 per cent of children aged between three and 12 use social media, with TikTok, YouTube and Instagram among the most popular apps. All are primarily designed to engage users in order to drive revenue; entertainment is secondary. They prioritise short-form content, appealing to short attention spans (and in doing so, exacerbating them), while children’s data is used to create a tailored “feedback loop”, showing them videos and pictures based on their preferences. The result of this concerted effort to capture kids’ attention? Children told Revealing Reality researchers that they regret how much time they spend online. Twelve-year-old Suzy, for example, voiced frustration at being pulled “down a rabbit hole” for hours at a time by content she is just “drawn to”.
This data harvesting also allows companies to tailor marketing, with “terrifying” amounts shown to children via social media, says Damon De Ionno, co-owner of Revealing Reality, “particularly cosmetics for girls”. The scale of advertising is compounded by its form: while regular adverts exist online, children tend to recognise these and consider them an interruption to their viewing. When marketing comes via an influencer, however, it’s more likely to be absorbed “uncritically”: the advert is the engaging content, and comes from a figure often perceived as a “friend”. Look at Zoella, says De Ionno. She was like “a big sister, talking to you about boyfriends and makeup.” This is the “parasocial” relationship, which has a particular pull for young people.
The Online Safety Act’s primary role is to stop children viewing harmful content, but it does nothing to stop tech being designed to keep children engaged, used as both consumer and product, in this arguably very harmful way.
Children’s rights advocate Baroness Beeban Kidron, who was involved with the passage of the act, has written that she “wholeheartedly” endorses this criticism and has argued that the focus of debates should not be on content but on “the power of algorithms that shape our experiences online.” This power could be used constructively, she said, to create a less toxic environment, instead of being “fixed on ranking, nudging, promoting and amplifying anything to keep our attention, whatever the societal cost.”
Noise and action are what currently get amplified to keep young people’s attention—perhaps explaining why MrBeast, who films himself in deprived communities supposedly “CURING PEOPLE’S BLINDNESS!!!”, for example, has 452m YouTube subscribers and is a billionaire. This kind of material sucks children in. “It goes back to our monkey-brain,” says Beckii Flint, co-founder of social marketing agency Pepper. Kids are instinctively drawn to loud, extreme pranks, and content creators feed the algorithm, unfettered by any quality control standards. “We can’t have that happen at the expense of children,” Flint says.
Many children are also behind the camera. There are no official statistics, but the number of children involved in the influencer industry is growing, according to Francis Rees, a law lecturer and coordinator of the Child Influencer Project at the University of Essex. And while in 2022 the parliamentary Select Committee on Digital, Culture, Media and Sport recommended rules to protect them, nothing has been implemented, nor are children covered by the Online Safety Act, which Rees says treats them as “end users” only. Flint was a viral YouTuber in her teens and has since launched the Responsible Kidfluence Code, which asks everyone using children for online commercial content to uphold certain standards. Most parents involved in such activity are acting in good faith, she says, but not enough is being done to protect children.
Kidron argues that parents need more help, too. We “cannot be expected to bear the entire burden of keeping [our] children safe online,” she says. The idea that parents are solely responsible is both an inversion of roles (supermarkets don’t ask us to check their food is safe, for example) and impossible, De Ionno tells me. Take gaming and social media platform Roblox, which is marketed for children but has been found to be a “disturbing” place more than once, as adult users are able to contact children directly, while children have seen sexually suggestive and violent content, even with parental controls turned on. Parents ask De Ionno if they can block the site’s inappropriate games, he says, perhaps unaware that there are six million games to vet.
Game designs are often led by a money-making model. In the case of Roblox, that means keeping its reported 151.5m daily users—around 40 per cent of whom are under 13—as engaged as possible, so that they are willing to buy “Robux”, the in-game currency, to pay for special abilities and other add-ons. One 14-year-old interviewee, Zak, told Revealing Reality that he spent most of his weekly £20 pocket money this way. “You have some of the biggest budgets and finest minds working out how to capture and keep people’s attention and they’re very good at it,” says De Ionno. “By comparison, political and regulator responses are incredibly slow.”
Government lacks curiosity about this aspect of online harm, he says. “They’re still struggling to work out how to stop young people seeing porn,” and appear some way from asking: “How much does TikTok or Roblox, or any of these companies make from a child, per hour?”
Most of us want to believe that tech is okay. Kids are expanding their knowledge via Wikipedia! They’re learning skills to equip them for the digitalised future! But the evidence of its benefit is limited, at best. There is little evidence even for the benefits of digitalised learning, which has seeped into schools via platforms like Google Classroom. While more than 500,000 apps are labelled “educational”, there is no threshold to meet to be described as such. Widespread adoption of such apps often isn’t about learning, playing or socialising, but about ease, says De Ionno. “We’re suckers for it as a species, but it’s not serving us well.”
None of us have chosen this state of affairs, says the report: we’ve “sleepwalked” into it. But unless government acts, technology companies will continue to profit while our kids lose their childhoods, with the ramifications lasting well into adulthood.