Many musicians cut their teeth playing cover versions of favorite or popular artists, but how would long-gone players of yesteryear handle tunes by modern artists like Lady Gaga or Coldplay? How might Beethoven or Bach interpret a Justin Bieber or Miley Cyrus track? Impossible to know. Research out of Birmingham City University in the UK might soon shed some light on such intriguing questions, thanks to a computer algorithm that can analyze the playing styles of legends to determine how they would attack a new piece of music.
Players may learn their trade by emulating the styles and sounds of famous musicians before them, but few (if any) become straight carbon copies. The Rolling Stones started by belting out the blues of Muddy Waters, Fred McDowell and Willie Dixon, but the band's covers are most definitely Stones through and through. Jimi Hendrix's version of Killing Floor is very different from Howlin' Wolf's original. And Nirvana's MTV Unplugged rendition of Bowie's The Man Who Sold the World is all Kurt Cobain & co.
The system being developed at Birmingham City University looks for subtle differences in playing styles, then imitates the essence of a particular musician when applied to a new piece of music. Though the artificial intelligence algorithms will continue to learn from hundreds more pieces of music fed into the system by the research team, led by Senior Lecturer in Sound Engineering Islah Ali-MacLachlan, it has so far been restricted to a diet of traditional Irish flute tunes.
"The initial proving work has been done on traditional Irish flute as I have a great interest in this instrument as well as many types of popular music," Ali MacLachlan told New Atlas. "We were looking to develop tools that would let us accurately detect note onsets, timbre and dynamics as a way to identify player styles and have published several papers in this area. Using bidirectional neural networks, we are able to take each note in the context of what comes before and after it and we would like to try this on some famous guitarists."
The system has so far analyzed music spanning more than 50 years and amassed a database of over 15,000 notes and sounds. The researchers report that the AI engine is already capable of 86 percent replication accuracy and can nail a musician's particular style around 75 percent of the time.
"As the engine relies on a trained neural network, the concept is transferable between instruments and music genres," Ali-MacLachlan said. So with more music pieces loaded in and some fine tuning, it has the potential to determine exactly how a variety of contemporary or modern music pieces might have been performed by long deceased music legends.
We can't wait to hear what the engine reckons Mozart would make of Roll Over Beethoven or how Fats Domino might play Shape of You. What tracks would you like reworked, and by whom?
Source: Birmingham City University