Computer reads brain activity to ID the song a patient is listening to
Researchers from the D'Or Institute for Research and Education have used machine learning to train a computer to identify which song a participant is listening to by analyzing their brain activity. The study, published in Scientific Reports, aims to advance brain decoding toward future communication with patients who are unable to speak.
A total of six volunteers listened to 40 pieces of classical, rock, pop and jazz music while undergoing magnetic resonance imaging (MRI). The scans captured the neural fingerprint each song left in a participant's brain while a computer simultaneously learned the specific activity patterns occurring during each piece. The computer also took tonality, dynamics, rhythm and timbre into account to improve identification.
Researchers found the computer identified the correct song 85 percent of the time. For the second part of the experiment, they prompted the computer to pick the song from a choice of 10. In this case, the computer identified the correct piece in 74 percent of decisions.
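The article does not spell out the decoding pipeline, but the setup it describes, learning the brain pattern evoked by each song and then picking the matching song from a set of candidates, can be illustrated with a small sketch. The Python example below uses synthetic data and a simple ridge-regression encoding model; the array shapes, feature set and identify helper are illustrative assumptions, not the study's actual method.

```python
# Minimal sketch of song identification from brain responses.
# Synthetic data and the ridge-regression encoding approach are
# illustrative assumptions, not the authors' published pipeline.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

n_songs, n_features, n_voxels = 40, 20, 500
# Acoustic descriptors per song (e.g., tonality, dynamics, rhythm, timbre) -- synthetic here.
features = rng.normal(size=(n_songs, n_features))
# Simulated brain response patterns evoked by each song.
true_weights = rng.normal(size=(n_features, n_voxels))
responses = features @ true_weights + 0.5 * rng.normal(size=(n_songs, n_voxels))

def identify(held_out, n_candidates=10):
    # Leave-one-song-out identification: train an encoding model on the other
    # songs, predict responses for a candidate set that includes the held-out
    # song, and pick the candidate whose predicted pattern best matches the
    # measured brain response.
    train = np.delete(np.arange(n_songs), held_out)
    model = Ridge(alpha=1.0).fit(features[train], responses[train])
    candidates = rng.choice(train, size=n_candidates - 1, replace=False)
    candidates = np.append(candidates, held_out)
    predicted = model.predict(features[candidates])
    corr = [np.corrcoef(p, responses[held_out])[0, 1] for p in predicted]
    return candidates[int(np.argmax(corr))] == held_out

accuracy = np.mean([identify(s) for s in range(n_songs)])
print(f"identification accuracy among 10 candidates: {accuracy:.2f}")
```

In a sketch like this, chance performance with 10 candidates would be 10 percent, which is what makes identification rates such as those reported in the study far above chance.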
Researchers hope this brain decoding technology could one day provide alternative ways to understand neural functioning and to interact with artificial intelligence.
"Machines will be able to translate our musical thoughts into songs,” said Sebastian Hoefle, researcher from D'Or Institute and PhD student from Federal University of Rio de Janeiro, Brazil.