Robot band Compressorhead has already proven itself more than capable of rocking festival crowds, but its players merely pound out pre-programmed cover songs. Maybe it's time to add a marimba-playing bot to the lineup, one that uses deep learning to create its own compositions and then bash them out on the wooden blocks. Researchers at the Georgia Institute of Technology have developed just such a robot, and its name is Shimon.
Shimon honed its craft on a diet of complete songs from pop, classical and jazz artists and composers, including Beethoven, the Beatles, Lady Gaga and Miles Davis. Over 2 million music fragments (such as motifs, riffs and licks) were then added to the stew before the bot had enough reference material to improvise some mallet bashing over pre-composed chord progressions.
The research team now reports that if Shimon is given a little help at the start, it can take what's provided and compose something of its own making, something the researchers say can't be predicted.
"Once Shimon learns the four measures we provide, it creates its own sequence of concepts and composes its own piece," said Ph.D student Mason Bretan, who has been working with the robot for the last 7 years. "Shimon's compositions represent how music sounds and looks when a robot uses deep neural networks to learn everything it knows about music from millions of human-made segments."
In addition to not knowing where Shimon will take the composition, Bretan says that he's been unable to determine which songs in the database are being used as a reference, though certain influences and techniques can be recognized.
The robot's playing skillset appears to have come along in leaps and bounds too, moving from monophonic improvisations to being able to belt out harmonies and chords. And it seems to be taking a more human approach to its musicianship.
"When we play or listen to music, we don't think about the next note and only that next note," said Bretan. "An artist has a bigger idea of what he or she is trying to achieve within the next few measures or later in the piece. Shimon is now coming up with higher-level musical semantics. Rather than thinking note by note, it has a larger idea of what it wants to play as a whole."
The Georgia Tech team documented the robot's first two compositions on video. They're both 30 seconds in length and can be seen below. What do you think? Should today's great (human) composers be worried? Let us know in the comments.
Source: Georgia Tech