Nature, nurture and liberal values

Biology determines our behaviour more than it suits many to acknowledge. But people—and politics and morality—cannot be described just by neural impulses
January 25, 2012
Beyond Human Nature by Jesse Prinz (Allen Lane, £22)
Incognito by David Eagleman (Canongate, £20)
You and Me: the Neuroscience of Identity by Susan Greenfield (Notting Hill Editions, £10)

Human beings are diverse and live in diverse ways. Should we accept that we are diverse by nature, having followed separate evolutionary paths? Or should we suppose that we share our biological inheritance, but develop differently according to environment and culture? Over recent years scientific research has reshaped this familiar “nature-nurture” debate, which remains central to our understanding of human nature and morality.

For much of the 20th century social scientists held that human life is a single biological phenomenon, which flows through the channels made by culture, so as to acquire separate and often mutually inaccessible forms. Each society passes on the culture that defines it, much as it passes on its language. And the most important aspects of culture—religion, rites of passage and law—both unify the people who adhere to them and divide those people from everyone else. Such was implied by what John Tooby and Leda Cosmides called the “standard social science model,” made fundamental to anthropology by Franz Boas and to sociology by Émile Durkheim.

More recently evolutionary psychologists have begun to question that approach. Although you can explain the culture of a tribe as an inherited possession, they suggest, this does not explain how culture came to be in the first place. What is it that endows culture with its stability and function? In response to that question the view gained ground that culture does not provide the ultimate explanation of any significant human trait, not even the trait of cultural diversity. It is not simply that there are extraordinary constants among cultures: gender roles, incest taboos, festivals, warfare, religious beliefs, moral scruples, aesthetic interests. Culture is also a part of human nature: it is our way of being. We do not live in herds or packs; our hierarchies are not based merely on strength or sexual dominance. We relate to one another through language, morality and law; we sing, dance and worship together, and spend as much time in festivals and storytelling as in seeking our food. Our hierarchies involve offices, responsibilities, gift-giving and ceremonial recognition. Our meals are shared, and food for us is not merely nourishment but an occasion for hospitality, affection and dressing up. All these things are comprehended in the idea of culture—and culture, so understood, is observed in all and only human communities. Why is this?

The answer given by evolutionary psychologists is that culture is an adaptation, which exists because it conferred a reproductive advantage on our hunter-gatherer ancestors. According to this view many of the diverse customs that the standard social science model attributes to nurture are local variations of attributes acquired 70 or more millennia ago, during the Pleistocene age, and now (like other evolutionary adaptations) “hard-wired in the brain.” But if this is so, cultural characteristics may not be as plastic as the social scientists suggest. There are features of the human condition, such as gender roles, that people have believed to be cultural and therefore changeable. But if culture is an aspect of nature, “cultural” does not mean “changeable.” Maybe these controversial features of human culture are part of the genetic endowment of human kind.

This new way of thinking gained support from the evolutionary theory of morality. Defenders of nurture suppose morality to be an acquired characteristic, passed on by customs, laws and punishments in which a society asserts its rights over its members. However, with the development of genetics, a new perspective opens. “Altruism” begins to look like a genetic “strategy,” which confers a reproductive advantage on the genes that produce it. In the competition for scarce resources, the genetically altruistic are able to call others to their aid, through networks of co-operation that are withheld from the genetically selfish, who are thereby eliminated from the game.

If this is so, it is argued, then morality is not an acquired but an inherited characteristic. Any competitor species that failed to develop innate moral feelings would by now have died out. And what is true of morality might be true of many other human characteristics that have previously been attributed to nurture: language, art, music, religion, warfare, the local variants of which are far less significant than their common structure.

I don’t say that view of morality is right, though it has been defended by a wide variety of thinkers, from the biologist John Maynard Smith (its original proponent) via the political scientist Robert Axelrod to such popularisers as Matt Ridley and Richard Dawkins. But even if morality is a partly acquired characteristic that varies from place to place and time to time, it might still rest on innate foundations, which govern its principal contours.

Noam Chomsky’s speculative linguistics has proved enormously important in this debate, since language is at the root of culture in all its manifestations: it is a paradigm case of a social activity that entirely changes the relationships, capacities, knowledge and the world of those who engage in it. Yet there could be no explanation of language that regarded it merely as a socially transmitted trait, with no deeper roots in biology. The rapid acquisition of language by children, at the same rate in every part of the globe, and on the same paucity of information from the surroundings, suggests that there is an innate universal grammar, to which each child attaches the fragmentary words and phrases that strike his ear, to generate new and intelligible utterances of his own. What Steven Pinker has called the “language instinct” is implanted by evolution, which endows each child with mental competences that are common to our species.

If we follow the evolutionary biologists, therefore, we may find ourselves pushed towards accepting that traits often attributed to culture may be part of our genetic inheritance, and therefore not as changeable as many might have hoped: gender differences, intelligence, belligerence, and so on through all the characteristics that people have wished, for whatever reason, to rescue from destiny and refashion as choice. But to speculate freely about such matters is dangerous. The once respectable subject of eugenics was so discredited by Nazism that "don't enter" is now written across its door. The distinguished biologist James Watson, co-discoverer of the double-helix structure of DNA, was run out of the academy in 2007 for having publicly suggested (admittedly in less than scientific language) that sub-Saharan Africans are genetically disposed to have lower IQs than westerners, while the economist Larry Summers suffered a similar fate for suggesting that, at the top end of ability, women's brains are less suited than men's to the study of the hard sciences. In America it is widely assumed that socially significant differences between ethnic groups and sexes are the result of social factors, and in particular of "discrimination" directed against the groups that seem to do less well. This assumption is not the conclusion of a reasoned social science but the foundation of an optimistic worldview, to disturb which is to threaten the whole community that has been built on it. On the other hand, as Galileo in comparable circumstances didn't quite say, it ain't necessarily so.

***

We find ourselves, therefore, in the middle of another tense debate, in which it is not religion, but liberal values, which seem to be challenged by the theory of evolution. It is against this background that the philosopher Jesse Prinz has entered the fray, with a big book arguing that there is “little reason to think that biology has a major impact in accounting for human differences.” He patiently examines the arguments given for attributing this or that trait to genetic inheritance, and tries to show either that the research is methodologically flawed, or that the conclusion is not supported by it. I say “patiently,” though I should also add that, when it comes to discussing IQ and sexual differentiation, Prinz intemperately dismisses those like Charles Murray, Richard Herrnstein and Larry Summers who have not been persuaded by the liberal consensus.

Prinz believes that our cognitive powers are awakened only when they have experience on which to get to work. Infants learn to divide the world into kinds by extrapolating from what they feel, hear and see. There are no innate classifications, and no roles or relationships that are not in some sense and to some measure socially constructed. Prinz attacks Chomsky’s claim that there is a universal grammar and dismisses the theory held by Jerry Fodor and others that our mental processes are conducted in a shared “language of thought.” Silent thinking, for Prinz, involves the use of images, which have their source in individual experience, while language is picked up by a spontaneous statistical analysis from which a child derives the rules of grammar. Prinz even goes a little way towards resuscitating the notorious Sapir-Whorf hypothesis, according to which the structure of a person’s language determines the contours of his world. “Language,” he writes, “is an invention, not an instinct… If language teaches us about who we are, the lesson is that we are fundamentally flexible.” Prinz goes on to argue that gender difference is to a great extent acquired, that the distinction between individualists and collectivists is cultural rather than biological, and that emotions are socially constructed from raw material that is innate only because it belongs to basic bodily processes and gut reactions.

All that is argued boldly and with much support from the literature of experimental psychology. But I could not help feeling that it falls short of its target. In The Blank Slate (2002) Steven Pinker assembled the evidence for the conclusion that our fundamental capacities are implanted by evolution and malleable only in those matters in which malleability would confer a reproductive advantage. His argument was meticulous and serious, and the weight of scientific evidence impossible to deny. In this or that particular the science might be faulted or revised, but the broad case is surely compelling. Consider, for example, the division of roles everywhere to be observed between men and women. There is a powerful reason to think that this is rooted in a deeper division of biological labour, selected in the harsh conditions that threatened our ancestors with extinction. For human beings are altricial, giving birth to helpless large-brained offspring, who can look after themselves only after ten years of nurture and nowadays not even then. This prolonged helplessness is a huge evolutionary advantage; but it is purchased at an equally huge biological cost. A species whose young are as vulnerable as human children needs both organised defence and serious home building if it is to reproduce itself. And on those granite foundations has been built the romantic castle of sexual difference.

But there is another reason for being dissatisfied with Prinz’s approach. When the idea of cultural diversity first took root in the German Enlightenment it was associated with the study of the myths, customs and artworks of antiquity, with the exploration of the religions of the east and with visits to the tribal cultures of Africa and America. A kind of imperial reverence for those things animated the minds of those who studied them, and it was with a hint of regret that the early anthropologists recorded the rapid collapse of local cultures under the withering eye of their researches. Prinz belongs to another mindset—one that can be observed in some of the disciples of Boas. He does not have much sympathy for any culture other than the one in which he is immersed—the liberal egalitarian culture of the American academy, which holds that sexual roles are socially constructed, that sexual morality is exhausted by the requirement of consent, and that all “disadvantage” is down to environmental factors which we can collaborate to overcome. He would perhaps deny that this is a culture, rather than a set of rationally held beliefs. But the whole tendency of his argument is to suggest that we can and should live in the way that he lives, not endowing our differences with the status of natural barriers or God-given paths, but opening ourselves to a kind of “soft diversity,” in which human possibilities flourish in a condition of mutual acceptance.

It may be that this is the direction in which we are moving. But for all he says to the contrary it could be that there are obstacles to progress that are fixed in our nature and not to be changed by social adjustment. We are familiar with the feminist charge that women come out worse in maths tests because of unconscious discrimination, stereotyping and other factors that allegedly sap their confidence—an argument that, in the eyes of its proponents, was further proved by Larry Summers's foolhardy attempt to question it. But does anyone believe that men are ten times as likely to end up in prison as women because of unconscious discrimination or stereotyping? Of course not. We recognise that men are by nature more aggressive and more inclined to settle disputes by violence. And no educated person is likely to dispute the fact that this difference between men and women is genetic. The real question is how far this kind of genetic influence extends. Susan Greenfield refers to recent brain-imaging research by Ryota Kanai and others at UCL which purportedly suggests that students with conservative political attitudes tend to have larger amygdalae, while among those of liberal persuasion it is the anterior cingulate cortex that stands out. Could this be the proof of WS Gilbert's proposition, that "every boy and every gal / That's born into the world alive / Is either a little Liberal / Or else a little Conservative"?

Those speculations bring us to another and far more serious obstacle to the humane understanding of our condition than the one that troubles Prinz. Advances in neuroscience are beginning to suggest that, while the brain is malleable and adaptable, it comes with its own inherent restraints, and with connections that have been wired without our knowledge and certainly without our consent. Hence processes in the brain can affect our decision-making without our being able to counter them. When in 1966 Charles Whitman, a man of previously good character, killed 13 people and wounded 32 more, shooting from the top of the University of Texas tower in Austin, Texas, he had already indicated that he felt something was not quite right in his head. After he was shot dead by the police, an autopsy revealed a small tumour pressing on the amygdala, which neuroscience regards as the seat of the gut reactions through which we protect our space. So was Whitman to blame for what he did? And if not, does this provide me, after decades of reproach for my conservative opinions, with the "amygdala excuse," just like Whitman?

Taking off from the Whitman case, David Eagleman argues that we should revise our sense of legal and moral responsibility, so as to recognise that most of what we do and feel arises from processes over which we have no control. The brain moves incognito beneath our conscious deliberations, like a great ocean liner on whose deck we walk up and down, imagining that we move it with our feet. Offering his own version of the Freudian story, in the luminous prose for which he is rightly esteemed, Eagleman argues that most of what we do is influenced more by unconscious than by conscious processes, and that concepts like responsibility and freedom cannot survive the advances of neuroscience intact. Whether it is nature or nurture that wired up the brain, the wiring is for the most part none of our doing, and nothing for which we can be praised or blamed.

Eagleman is too subtle a thinker, and too responsible a person, to draw quite that conclusion. He wants to revise our concept of responsibility so that his kind of responsibility is still contained in it. My brief response, however, is to suggest that he has misdescribed the problem. The picture that he gives, of the fragile “I” riding the elephant of grey matter while pretending to be in charge of it, misrepresents the nature of self-reference. The word “I” does not refer to some conscious “part” of the person, the rest of which is a passive and hidden “it.” The “I” is one term of the I-You relation, which is a relation of accountability in which the whole person is involved. To use the first-person pronoun is to present myself for judgement. It is to take responsibility for a host of changes in the world, and in particular for those for which you can reasonably call me to account by asking “why?” This question is the foundation of a co-operative enterprise, in which we elicit from each other the reasons, meanings and choices that make us intelligible. Understanding the logic of the question “why?” is a task that has been addressed by several recent philosophers—Elizabeth Anscombe, Stephen Darwall, Sebastian Rödl and others. It is the question that underlies the concept of responsibility in the common law. And philosophers have done much to show that the dialogue through which we establish and broker our responsibilities is well founded and not necessarily vulnerable to disruption by our newfound knowledge of the brain.

This point suggests how Prinz might have put philosophy to work on behalf of his conclusions. The real question raised by evolutionary biology and neuroscience is not whether those sciences can be refuted, but whether we can accept what they have to say, while still holding on to the beliefs that morality demands of us. From Kant and Hegel to Wittgenstein and Husserl there have been attempts to give a philosophy of the human condition that stands apart from biological science without opposing it. But those attempts are either not noticed or given short shrift in Prinz’s argument which, by attempting to fight the biological sciences on their own ground, is condemned to a losing wicket.

We are human beings, certainly. But we are also persons. Human beings form a biological kind, and it is for science to describe that kind. Probably it will do so in the way that the evolutionary psychologists propose. But persons do not form a biological kind, or any other sort of natural kind. The concept of the person is shaped in another way, not by our attempt to explain things but by our attempt to understand, to interact, to hold to account, to relate. The “why?” of personal understanding is not the “why?” of scientific inference. And it is answered by conceptualising the world under the aspect of freedom and choice. People do what they do because of events in their brains. But when the brain is normal they also act for reasons, knowing what they are doing, and making themselves answerable for it.

This does not mean that we should ignore what goes on in the brain. In her lively monograph Susan Greenfield emphasises that our brains are plastic and can be influenced in ways that pose a risk to our moral development. Prinz’s defence of nurture against nature may look like a defence of human freedom. But nurture can as easily destroy freedom as enhance it. We can bring up children on passive and addictive entertainments that stultify their engagement with the real world and rewire the neural networks on which their moral development depends. The short-term pursuit of gratification can drive out the long-term sense of responsible agency. Moreover, if children learn to store their memory in computers and their social life in portable gadgets, then gradually both memory and friendship will wither, to linger on only as futile ghosts haunting the digital archives.

I sympathise with those worries. But they do not change the position that a philosopher should adopt. Greenfield's argument suggests that there is a kind of human development that prepares us, at the neurological level, for the exercise of responsible choice. If we bring up our children correctly, not spoiling them or rewiring their brains through roomfuls of digital gadgetry, the sense of responsibility will emerge. They will enter fully into the world of I and You, become free agents and moral beings, and learn to live as they should, not as animals, but as persons.

Allow children to interact with real people, therefore, and the grammar of first-person accountability will emerge of its own accord. Undeniably, once it is there, the I-to-you relation adds a reproductive advantage, just as do mathematical competence, scientific knowledge and (perhaps) musical talent. But the theory of adaptation tells us as little about the meaning of “I” as it tells us about the validity of mathematics, the nature of scientific method or the value of music. To describe human traits as adaptations is not to say how we understand them. Even if we accept the claims of evolutionary psychology, therefore, the mystery of the human condition remains. This mystery is captured in a single question: how can one and the same thing be explained as an animal, and understood as a person?
