What’s cookin’, daddy-o?
Jamming jazz musicians process music much like they do spoken language, a new study shows.
Researchers in the US found that when hip cats get their groove on for a spot of improvised boogie-woogie, they use the parts of their brains typically associated with interpreting verbal language and syntax.
The team at Johns Hopkins University’s School of Medicine in Baltimore tracked brain activity as two jazz musicians engaged in the act of ‘trading fours’, or improvised instrumental exchanges.
As they engaged in this musical to-and-fro, the players created a kind of improvised musical conversation that activated brain areas used to interpret the structure of phrases and sentences.
The Johns Hopkins team said their research suggested the brain uses its syntactic areas to process communication in general – whether it’s language or music.
They recruited 11 highly proficient male jazz musicians aged 25 to 56 and used functional MRI (fMRI) to monitor their brain activity as they jammed with each other.
The improvisation activated areas of the brain linked to syntactic processing for language, called the inferior frontal gyrus and posterior superior temporal gyrus.
On the flipside, the musical exchanges deactivated brain structures involved in semantic processing, or the meaning of language, called the angular gyrus and supramarginal gyrus.
“When two jazz musicians seem lost in thought while trading fours, they aren’t simply waiting for their turn to play,” said Dr Charles Limb, an associate professor in the Department of Otolaryngology at Johns Hopkins and a jazz musician himself.
“Instead, they are using the syntactic areas of their brain to process what they are hearing so they can respond by playing a new series of notes that hasn’t previously been composed or practiced.”
The relationship between language and music is complex, and previous studies had only been able to show how the brain processes auditory communication in the context of spoken language.
“But looking at jazz lets us investigate the neurological basis of interactive, musical communication as it occurs outside of spoken language,” Dr Limb said.
“We’ve shown in this study that there is a fundamental difference between how meaning is processed by the brain for music and language. Specifically, it’s syntactic and not semantic processing that is key to this type of musical communication. Meanwhile, conventional notions of semantics may not apply to musical processing by the brain.”
The research is published in the journal PLOS ONE.