University of Rochester

Emotion Detector Soon to Scour the Internet for Music

July 14, 2003

Some days on that drive home from work you feel like listening to something uplifting and lively, while on other days you need a dose of musical angst to get you through the commute. Knowing which pieces of music fit your mood at any given moment is easy--at least for you. For a computer trying to cater to your mood swings, however, it's a nightmare.

Computer scientist Mitsu Ogihara at the University of Rochester has applied for a patent on software that can not only tell whether a piece of music is jazz, classical, or rock, but also what its emotional tone is. Imagine telling your stereo that you'd like some quiet mood music and enjoying an evening of songs from almost any genre, all of which stay faithful to your mood. Ogihara is hoping that someday people won't just dial up radio stations--they'll dial up their emotions.

"We're looking to close that last gap between computers and music," says Ogihara. "Computers play music, download it, record it, manipulate it, but they're terrible at having a feel for it."

Ogihara's software works differently from most that have previously been designed to detect the genre of a piece of music. Whereas earlier programs have tried to listen to drum beats, melodies, or complexity, Ogihara's takes a brute-force approach. His software scans about 30 seconds of a piece of music and looks strictly at the statistics it gathers from the sound's waveform. The program looks at how often similar spikes happen, at the relationships between certain parts of the waveform, and at other characteristics to build a statistical profile of that stretch of music.
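The article does not spell out which waveform statistics the software gathers, but the general idea can be sketched in a few lines of Python. The specific features below (RMS loudness, zero-crossing rate, and the spacing of amplitude spikes) are illustrative assumptions, not Ogihara's actual feature set:

```python
import math

def waveform_features(samples, sample_rate):
    """Build a simple statistical profile of an audio clip.

    Hypothetical illustration of the brute-force approach: gather raw
    statistics straight from the waveform rather than trying to model
    melody or rhythm.
    """
    n = len(samples)
    # Overall loudness: root-mean-square amplitude.
    rms = math.sqrt(sum(s * s for s in samples) / n)
    # Zero-crossing rate: how often the waveform changes sign,
    # a crude proxy for how "bright" the sound is.
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    zcr_hz = crossings / (n / sample_rate)  # crossings per second
    # "Spike" statistics: local maxima louder than the RMS level,
    # and the average spacing between them.
    peaks = [
        i for i in range(1, n - 1)
        if samples[i] > samples[i - 1]
        and samples[i] > samples[i + 1]
        and samples[i] > rms
    ]
    gaps = [b - a for a, b in zip(peaks, peaks[1:])]
    mean_gap = sum(gaps) / len(gaps) if gaps else 0.0
    return {"rms": rms, "zcr_hz": zcr_hz,
            "peak_count": len(peaks), "mean_peak_gap": mean_gap}

# Demo on a synthetic 440 Hz tone sampled at 8 kHz for one second.
rate = 8000
tone = [math.sin(2 * math.pi * 440 * t / rate) for t in range(rate)]
profile = waveform_features(tone, rate)
```

In a real system, vectors like this one, computed over a 30-second excerpt, would be fed to a statistical classifier trained on labeled examples of each genre or mood.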

As it turns out, the emotional content of music correlates closely with these statistics, so Ogihara's software, while still in the early stages of development, is able to correctly pin down a piece's emotional evocation 64 percent of the time and to categorize its genre with an unprecedented 78 percent accuracy. A recent study of humans' ability to sort music into genres found that people fared worse, succeeding only 71 percent of the time.

"We're still looking at some ways to improve the accuracy," says Ogihara. "We'd like to push the precision to 90 percent before we think about bringing this to industry."

He also sees more down-to-earth applications. The Library of Congress is interested in his work in part as a way to categorize music by genre more efficiently, a task that takes a tremendous amount of time and energy compared to categorizing a book, which generally carries all its relevant cataloging information right on the sleeve. Organizing a personal music collection should also become much less tiresome, since a computer could scan an entire MP3 collection and sort each song into its appropriate genre automatically.

Tao Li and Stephen Li, students of Ogihara at the University of Rochester, also contributed to the research.