Could a computer one day write The Phantom of the Opera? A team of British scientists and composers certainly hopes so.
The upcoming West End musical Beyond the Fence has many elements in common with classic musicals: a war-torn setting, a love story and cute kids in the cast. But its creation has been anything but ordinary.
The book and score for the show were written and composed by a human duo, Benjamin Till and Nathan Taylor. But, in a first-of-its-kind experiment, many of the show’s dramatic and musical elements were also developed using computers and digital data.
Dr. Cat Gale, a producer at Wingspan Productions, the company behind Beyond the Fence, told the Observer that she hoped the show, and its method of creation, would help illustrate big concepts.
“We wanted to explore questions around creativity and art in ways that had not been done before, and figure out where we got to and where we’re going,” Dr. Gale said.
The partnership between man and machine began with an analysis of musical theater successes of the past. With help from a group of Cambridge University scientists, the creative team used big data to analyze everything from cast size to the presence of love and death in certain musicals, to give their show the best chance at success.
Once the initial analysis was complete, the team turned to the What-If Machine (WHIM) at Goldsmiths, University of London to develop the plot of the show. The WHIM is a software system designed to “invent, evaluate and present fictional ideas with real cultural value,” according to the program’s website.
The creators decided that the What-If premise with the most promise was “What if a wounded soldier had to learn how to understand a child in order to find true love?”
Next, the team used PropperWryter, software developed by Professor Pablo Gervás of Complutense University of Madrid, to generate plot structures. They decided to set the story at Greenham Common, the site of a women’s peace camp established in Berkshire, England, in 1981 to protest nuclear weapons. The main character is one of the women living at the camp, who gets help from an American airman when she risks having her child taken from her.
As for the score, Messrs. Till and Taylor got help from Dr. Nick Collins of Durham University, whose computer composition system, Android Lloyd Webber, was trained via machine learning on analyses of musical theater compositions.
“The system had a large corpus of music, and went into the mathematics of musical theater to find the best melodies,” Dr. Gale explained. “Ben and Nathan turned the material into a show which will be big on the stage.”
The humans have now taken over, rehearsing the piece in advance of its run at London’s Arts Theatre from February 22 to March 5, 2016.
Dr. Gale said that the audience would be able to engage with the show on a human level, even though it had been partially created by computers. She illustrated her point with an analogy to Britain’s most famous music group.
“If you have a Lennon and McCartney song, it’s a Lennon and McCartney song,” Dr. Gale said. “It’s not one or the other. This show is the same—it’s been created by computers and curated by humans, but everyone can take something from it.”
According to Dr. Gale, the time is ripe to discuss how theater and technology can work together rather than be separate entities.
“It feels like a very relevant time to be asking the question about why we have such a cultural distinction between the arts and the sciences,” Dr. Gale said. “What’s to be gained from creating that divide? If technology can enhance or expand our creative horizons, why is that such a bad thing?”
If you can’t make it to London this winter, don’t fret—the entire experiment is being chronicled on the upcoming TV series Computer Says Show.