Artists and musicians beware—AI has got much further than you might think
by Philip Ball / March 4, 2019
In 2012, Iamus released a CD of classical music performed by the London Symphony Orchestra. Music critic Tom Service was somewhat dismissive, calling Iamus’s composition Hello World! “so unmemorable, and the way it’s elaborated so workaday, that the piece leaves no distinctive impression.”
But it wasn’t a bad debut really—when you consider that Iamus is not a composer but a computer algorithm, developed by researchers at the University of Málaga in Spain.
Marcus du Sautoy, mathematician and Simonyi Professor for the Public Understanding of Science at Oxford, doesn’t talk about Iamus in his new book The Creativity Code, but the question he poses about such efforts amounts to this: can we call Iamus a composer? When Iamus’s compositions were played to musically informed listeners, they could not tell them apart from music in a similar (modernist) style composed by humans. In a classic 1950 paper, Alan Turing asked: “Can machines think?” If we were unable, in a remote conversation, to distinguish the responses of a person from those of a computer, then we might have to acknowledge that they could. This AI system, then, passes a musical version of the Turing Test.
Du Sautoy’s test is different but no less challenging: can machines be genuinely creative? The interest, just as it was for Turing, lies not so much in finding a definitive answer but in examining what the question itself might mean. No one believes that computer algorithms like Iamus, or any of today’s artificial intelligence systems, really do “think.” They don’t have any consciousness at all. Typically they employ machine learning, in which the algorithm is trained by letting it search for regularities and patterns in a body of “training” data—classical compositions, say—and then apply those rules to generate new “ideas” in the same style.
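To make that learn-then-generate loop concrete, here is a deliberately minimal sketch. It is emphatically not how Iamus works (its system is far more sophisticated); it is a toy first-order Markov chain in which the only “rule” extracted from the training melodies is which note tends to follow which. The note names and training corpus here are invented for illustration.

```python
import random

def train(corpus):
    """Count, for each note, which notes follow it in the training melodies."""
    transitions = {}
    for melody in corpus:
        for a, b in zip(melody, melody[1:]):
            transitions.setdefault(a, []).append(b)
    return transitions

def generate(transitions, start, length, seed=0):
    """Walk the learned transition table to produce a new melody in the
    'style' of the corpus."""
    rng = random.Random(seed)
    melody = [start]
    for _ in range(length - 1):
        options = transitions.get(melody[-1])
        if not options:  # dead end: no learned continuation from this note
            break
        melody.append(rng.choice(options))
    return melody

# A made-up corpus of three short melodies.
corpus = [
    ["C", "E", "G", "E", "C"],
    ["C", "E", "G", "C", "E"],
    ["G", "E", "C", "E", "G"],
]
model = train(corpus)
print(generate(model, "C", 8))
```

The generated melody is new, yet every step in it is a pattern observed in the training data, which is precisely the sense in which such systems produce “ideas” in a learned style.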
But this is not so different from how our own cognition works. We learn our native language by experience and inference, not by formal instruction about its rules (although some of that may refine our usage). It’s the same with music, literature and art: we study examples of what others have done, then generalise what we learn and draw on the “rules” in order to produce our own versions.
With machine learning as with human learning, sticking too…