
MUSICA project aims at getting computers and humans to create freeform jazz together

Scientists hope to advance robot/human communication by having computers and people collaborate on jazz music

Researchers from the University of Illinois are developing a computer system capable of communicating with humans through the medium of jazz, playing improvised pieces in real time. The project forms part of DARPA's Communicating with Computers (CwC) program, approaching the development of robot communication skills from a very different direction.

The overriding goal of DARPA's CwC program is to build computer systems that think and communicate more like humans. Early experiments in the program focused on linguistics and judgement, tasking computers to come up with every other sentence in a story (with the other half told by a human), or make viability judgements based on large pools of data. The new project, known as the Musical Improvising Collaborative Agent (MUSICA), looks to music for inspiration.

The idea of getting computer systems to compose music isn't actually new – researchers have been creating algorithms to allow computers to create compositions for decades. But the University of Illinois researchers, led by Professor Benjamin Grosser, are aiming to take things a little further, turning robots into performers.

To build the musical computer system, the team is working to create a database of jazz solos, computationally analyzing them using image schema – a method of understanding the world through spatial concepts. In a musical sense, this might involve someone playing "inside" a piece – with their notes fitting comfortably within the song's chord structure – or "outside" – pushing the boundaries of what sounds comfortable.
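The "inside/outside" distinction can be sketched in a few lines of code. The following is a hypothetical illustration using standard music theory (pitch classes and the chord tones of a C major 7 chord), not a reflection of how MUSICA's actual analysis works:

```python
# Hypothetical sketch: labeling an improvised note as "inside" or
# "outside" a chord, in the spirit of the spatial image-schema
# analysis described above. The chord and pitch-class math are
# standard music theory; the function names are illustrative only.

PITCH_CLASSES = {"C": 0, "C#": 1, "D": 2, "D#": 3, "E": 4, "F": 5,
                 "F#": 6, "G": 7, "G#": 8, "A": 9, "A#": 10, "B": 11}

# Chord tones of a C major 7 chord: C, E, G, B
CMAJ7 = {0, 4, 7, 11}

def classify(note, chord_tones=CMAJ7):
    """Return 'inside' if the note is a chord tone, 'outside' otherwise."""
    return "inside" if PITCH_CLASSES[note] in chord_tones else "outside"

print(classify("E"))   # a chord tone of Cmaj7 -> "inside"
print(classify("F#"))  # not in the chord     -> "outside"
```

A real system would of course work over scales and harmonic context rather than a single static chord, but the core idea of mapping notes into a spatial in/out relationship is the same.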

The second part of the equation is a system that allows the computer to analyze what someone is playing in real time, understanding key aspects such as beat, pitch, harmony and rhythm. The real-time information will be fed into the already assimilated knowledge on jazz solos, with the system then performing "call and answer" style interactions with a human player.
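The response step of a "call and answer" exchange can be illustrated with a toy example. The sketch below assumes the beat- and pitch-tracking stage has already produced a phrase as MIDI note numbers, and answers it with a simple variation (a transposition); it is a hypothetical illustration, not MUSICA's actual response logic:

```python
# Hypothetical "call and answer" sketch: take a phrase the human just
# played (as MIDI note numbers) and answer with a simple variation,
# here a transposition up a perfect fourth (5 semitones). A real
# system would draw on its analyzed solo database instead.

def answer_phrase(call, interval=5):
    """Echo the call transposed by `interval` semitones, clamped to MIDI range."""
    return [min(127, max(0, n + interval)) for n in call]

human_call = [60, 62, 64, 67]        # C, D, E, G
robot_answer = answer_phrase(human_call)
print(robot_answer)                  # [65, 67, 69, 72]
```

The interesting research problem is everything this sketch leaves out: choosing an answer that is musically responsive to the call's rhythm and harmonic context, in real time.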

MUSICA is an ambitious project, and that's something the researchers are aware of.

"The ultimate goal is that a human performer should perceive what it hears from the system as musical communication," says Grosser. "I don't expect our jazz robot to be a Miles Davis. Maybe it can be a high schooler, if we can really nail it."

If you're interested in having a casual jamming session with your PC, you'll likely have quite a wait on your hands. MUSICA is conceived as a five-year project, and it's still in the early stages.

Source: University of Illinois via IEEE Spectrum
