Jazz-Playing Robots Will Explore Human-Computer Relations

(Image: a jazz musician. Credit: Geoff Goldswain | Shutterstock.com)

Jazz-playing computers and robots could soon yield clues about how to help people collaborate with machines, researchers say.

The new project, called MUSICA (short for Musical Improvising Collaborative Agent), aims to develop a musical device that can improvise a jazz solo in response to human partners, just as real jazz musicians improvise alongside one another.

MUSICA is part of a new program from the Defense Advanced Research Projects Agency (DARPA), the branch of the U.S. military responsible for developing new technologies. The project is designed to explore new ways that people can interact with computers and robots.

"There is definitely a desire for more natural kinds of communications with computational systems as they grow in their ability to be intelligent," Ben Grosser, an assistant professor of new media at the University of Illinois at Urbana-Champaign, told Live Science. "A lot of us are familiar with various methods of interacting with computers, such as text-based and touch-based interfaces, but language-based interfaces such as Siri or Google Now are extremely limited in their capabilities."

Grosser and his colleague Kelland Thomas, an associate professor of music at the University of Arizona, are developing MUSICA to explore how people can communicate with one another without language. "That could make interactions between humans and machines a lot deeper," said Grosser, who himself is a jazz trumpeter. "When it comes to jazz, you feel the music as much as you hear and think about it — you react instinctively to things that are going on."

To develop a machine capable of playing improvisational jazz, the researchers will create a database of jazz solos from a variety of musicians and have computers analyze the recordings to figure out the various processes that come into play when a musician improvises. The researchers will then develop a performance system to analyze the components of human jazz performances, including the beat, pitch, harmony and rhythm. The system will also consider what it has learned about jazz solos to communicate and respond musically in real time.
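The analysis step described above — extracting features such as pitch and rhythm from recorded solos — can be illustrated with a minimal sketch. The note encoding and feature names here are invented for illustration and are not part of MUSICA:

```python
# Minimal sketch of solo analysis: extract simple pitch and rhythm
# features from a melody encoded as (midi_pitch, duration_in_beats)
# pairs. The encoding and the features are illustrative only.

def analyze_line(notes):
    """Return basic pitch/rhythm features for a melodic line."""
    pitches = [p for p, _ in notes]
    durations = [d for _, d in notes]
    # Successive pitch differences capture the melodic contour.
    intervals = [b - a for a, b in zip(pitches, pitches[1:])]
    return {
        "pitch_range": max(pitches) - min(pitches),
        "intervals": intervals,
        "total_beats": sum(durations),
        "note_density": len(notes) / sum(durations),  # notes per beat
    }

# A two-bar phrase in 4/4: four quarter notes, then four eighth notes.
phrase = [(60, 1.0), (62, 1.0), (64, 1.0), (67, 1.0),
          (65, 0.5), (64, 0.5), (62, 0.5), (60, 0.5)]
features = analyze_line(phrase)
```

A real system would work from audio rather than symbolic notes, but features like these are the kind of raw material an improvising agent could learn from.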

"Our goal is, by next summer, to present a 'call and answer' system to DARPA, where I can play a line of music, and the system will analyze that line and give an answer as close to real time as possible," Grosser said.
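In its simplest form, a call-and-answer loop of the kind Grosser describes analyzes the incoming line and replies with a transformed version of it. This toy sketch (the note encoding and the transformation are invented for illustration, not MUSICA's method) answers a call by inverting its melodic contour while keeping the rhythm:

```python
# Toy "call and answer": answer a melodic line by mirroring its
# intervals around the first note, keeping the rhythm intact.
# The (midi_pitch, duration_in_beats) encoding is illustrative only.

def answer(call):
    """Return a response that descends where the call ascended."""
    root = call[0][0]
    return [(2 * root - pitch, beats) for pitch, beats in call]

call = [(60, 1.0), (64, 0.5), (67, 0.5), (72, 2.0)]  # rising arpeggio
response = answer(call)
```

A musically convincing agent would of course generate novel material informed by learned solo data, not a fixed mirror, but the shape of the loop — listen, analyze, respond — is the same.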

The researchers admit the project may seem unusual.

"Let's face it — trying to develop a system that can play jazz is a crazy idea," Grosser said. "It's not going to be Miles Davis. I think if we can make this thing play like a high schooler, we'll really have done our job."

Ultimately, Grosser hoped this research could shed light on the nature of the creative process. "By finding the limits of computational creativity, we can get a different understanding of human creativity, on our own creative processes," Grosser said.

Original article on Live Science.

Charles Q. Choi
Live Science Contributor
Charles Q. Choi is a contributing writer for Live Science and Space.com. He covers all things human origins and astronomy as well as physics, animals and general science topics. Charles has a Master of Arts degree from the University of Missouri-Columbia, School of Journalism and a Bachelor of Arts degree from the University of South Florida. Charles has visited every continent on Earth, drinking rancid yak butter tea in Lhasa, snorkeling with sea lions in the Galapagos and even climbing an iceberg in Antarctica.