Moulding young minds

Digital culture does not ruin children’s brains. In fact, it may help them learn better
February 24, 2010

A literate brain is different, structurally, to an illiterate one. How these differences arise is almost impossible to trace during childhood, when the brain is changing for all manner of reasons. But experiments comparing literate and illiterate adults show that literacy is linked to the size of the angular gyrus, an area of the brain associated with language, as well as to different and more intense patterns of mental activity elsewhere.

We have long accepted literacy as a fundamental building-block of civilisation. Today, however, neurologists face related questions which are deeply troubling to many observers: if literacy changes our brains, what will a digitally literate brain (one shaped by interactions with digital media such as computers and videogames) look like—and what could this mean for the way we learn?

The evidence is thin, especially on the question of whether a childhood “screen culture” is developmentally damaging. Yet tantalising neurological research is beginning to emerge that uses interactive media to give us a more precise understanding of the workings of the brain and, in particular, the mechanisms underpinning memory, learning and motivation.

The NeuroEducational research network, headed by neuropsychologist Paul Howard-Jones at Bristol University, is at the forefront of this work. Using a mix of brain imaging and mathematical modelling, Howard-Jones hopes to unpick how technology and learning interact. His aim is to build models of how the brain learns during digital interactions—be they playing computer games or simply using a computer for homework—and, by doing so, create new ways of crafting digital media that actively enhance the workings of the brain.

One experiment uses a computer-based quiz in which participants answer multiple-choice questions and, if they get them right, earn a variable number of points. When they get a question wrong, the correct answer is revealed, encouraging them to learn it. The point is to find out the circumstances in which participants are most likely to remember that answer, and get the question right the next time around.
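
The structure of such a trial is easy to sketch. The fragment below is illustrative only: the question, the point values on offer and the idea of re-queuing missed questions are assumptions made for the example, not details of the actual experiment.

```python
import random

# A hypothetical question; the real experiment's content isn't specified.
QUESTION = {
    "text": "Which planet is closest to the sun?",
    "options": ["Venus", "Mercury", "Mars"],
    "answer": "Mercury",
}

def run_trial(question, respond):
    """One quiz trial: a variable number of points for a correct answer;
    the correct answer is revealed after a mistake."""
    points_on_offer = random.choice([1, 2, 3])  # variable reward magnitude
    choice = respond(question["text"], question["options"])
    if choice == question["answer"]:
        return points_on_offer, None       # reward earned, nothing revealed
    return 0, question["answer"]           # no reward; answer shown to learn

# Missed questions go back into the queue, so the experimenter can test
# whether the revealed answer is remembered the next time around.
```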

The experiment is being used to test different mathematical models of reward mechanisms in the brain: mechanisms related to levels of dopamine in the striatum, a part of the forebrain buried beneath the cortex. Dopamine is a neurotransmitter associated with reward-seeking behaviours—and Howard-Jones has been able to apply a computational model that successfully predicts improved rates of learning in subjects taking this test.
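
Models of this kind typically treat dopamine as carrying a “reward prediction error”: the gap between the reward expected and the reward actually received. As a minimal sketch, the Rescorla-Wagner-style update rule and the learning rate below are textbook assumptions chosen for illustration, not Howard-Jones’s published model.

```python
def update_expectation(expected, received, alpha=0.1):
    """One reward-prediction-error update (a Rescorla-Wagner-style rule).

    The prediction error (received - expected) is the quantity that
    striatal dopamine is thought to track: a large positive surprise
    drives the strongest learning.
    """
    prediction_error = received - expected
    return expected + alpha * prediction_error

# A question worth 3 points when only 1 was expected yields a big
# prediction error, so the expectation (and, on this account, the
# memorability of the moment) shifts sharply.
expectation = update_expectation(expected=1.0, received=3.0)  # 1.2
```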

This work provides the beginnings of an empirical model of how we learn on a neural level, one potentially able to predict, as Howard-Jones puts it, “on a moment-by-moment basis… when people are engaged and learning.” A more precise neurological understanding of reward and memory is important for many reasons. For instance, it allows researchers to calibrate precisely the probability of receiving rewards in order to maximise dopamine release. (In primates, this release has been shown to peak at a 50 per cent chance of success.) Using such results to optimise learning experiences at a neurological level could transform our ability to predict, and sustain, attention and recall.
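
One way to see why a 50 per cent chance might be special: for a fixed prize, the statistical uncertainty of the outcome, measured as the variance of a Bernoulli trial, p(1 - p), is greatest when p = 0.5. The snippet below simply makes that arithmetic visible; it illustrates the statistical point, and is not itself a model of dopamine.

```python
# Uncertainty of a fixed reward delivered with probability p:
# the Bernoulli variance p * (1 - p) peaks at p = 0.5.
for p in (0.1, 0.25, 0.5, 0.75, 0.9):
    print(f"p = {p:.2f}  uncertainty = {p * (1 - p):.4f}")
# The 50 per cent condition (0.25) is the point of maximum suspense.
```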

A related development lies in the potential of new learning technologies to overcome our understandable aversion to risk and failure. Digital environments are especially good at introducing uncertainty, and the prospect of failure, in a way that, Howard-Jones argues, “doesn’t have a negative impact on self-esteem.” A game or quiz, when properly designed, creates a type of risk that people are at ease with: getting answers wrong doesn’t become a barrier to effort.

Perhaps the most exciting results of all lie in an area known as “working memory”: the ability to hold short-term information in mind, such as a phone number, or the name of someone you have just met. Experiments in this field by a Swedish professor of neuroscience, Torkel Klingberg, have shown that computerised training of people with attention deficit hyperactivity disorder (ADHD) can significantly improve reasoning abilities and attention span. The system developed by Klingberg, known as Cogmed, offers an “adaptive experience” tailored to each user, designed to extend working memory over time via a series of exercises that push the limits of their abilities. Other research, led by Susanne M Jaeggi at the University of Michigan, suggests that the training of working memory in adults can boost IQ. It’s early days, but the potential of the field is vast.
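
What “adaptive” means here is, at heart, a staircase: make the task slightly harder after each success and slightly easier after each failure, so the trainee is always working at the edge of their span. The sketch below assumes a simple one-up, one-down rule with arbitrary floor and ceiling values; Cogmed’s actual adaptation algorithm is not public.

```python
def next_span(current_span, was_correct, floor=2, ceiling=12):
    """Adjust the length of the sequence to be remembered (the 'span')
    after each trial: one up on success, one down on failure, keeping
    the trainee near the limit of their working memory."""
    if was_correct:
        return min(current_span + 1, ceiling)
    return max(current_span - 1, floor)

# A run of results nudges the span up and down around the user's limit.
span = 3
for correct in (True, True, False, True):
    span = next_span(span, correct)
print(span)  # 3 -> 4 -> 5 -> 4 -> 5
```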

An awareness of this potential is, of course, what underpins the most commonly articulated fears about the dangers of screen culture in childhood: that it can be isolating, excessively compelling and damaging to human relationships. There is some truth behind this, not least in the fact that it is dangerously easy to spend many hours in front of a screen every day. Self-evidently, children should not spend all day using computers, just as they shouldn’t spend all day playing sport. But there is far less evidence of social harm being done by digital media to children, let alone to their brains, than is often assumed. “I don’t think we have any evidence to suggest that technology is infantilising anyone,” Howard-Jones notes. “The evidence is that teenagers who use social networking sites to sustain existing relationships are better connected socially in real terms than those who don’t.”

As yet we know little about the long-term neurological impact of technology. But the prospect of using neurology to understand how our brains respond to information and how we learn, especially as children, is fast becoming a reality. Given that the defining quality of all digital media is its interactivity, making these interactions precisely responsive to the nature of our minds is one of the 21st century’s most exciting frontiers. And it’s one that promises not simply to chain us to our desks, but also to teach us new ways of engaging with each other and the world.