Is Music a Language and Can It Teach Us How to Speak? We Asked a Neuroscientist

Turns out that jazz kid you hated on at school might have been onto something.

This article originally appeared on Noisey Sweden.

Remember that slightly lonely guy you used to be semi-friends with? The one that jammed with the high school jazz band every other Friday and always voluntarily wore a spotted bow tie? When he was going on and on about "expressions," "phrases," and how jazz improvisation was all about understanding "the language," you probably just rolled your eyes and beat him up.

But, as is ever the way, it turns out the nerds were on to something. We all vaguely understand the general concept of written instrumental music, from Chopin to Miles Davis, with its own rules, symbols, and colossal scope for error—just like any other language. But how is it that hearing a piece of jazz, classical, minimalist, or pop music can pull off the same rousing, emotionally charged effect on pretty much anyone, regardless of whether they can read music or not?

Only a small percentage of people can read musical notation nowadays, but almost everyone understands the message a piece of music is trying to convey. Why is it that we can grasp its tones, moods, and intentions without understanding its language? Is music some kind of universal system of communication, endowed with special powers?

Neuroscientists have been wondering how the brain processes music, and what consequences this might have for questions like those above. They investigate whether the brain processes melody and rhythm in the same way it does words and sentences. There's also a debate as to which comes first in our development: our natural disposition for language, or an inherent sensitivity to music. And can we therefore use music to help people with language impairments?

Dr. Dominique Vuvan, a researcher based at the International Laboratory for Brain, Music, and Sound Research (BRAMS) in Montreal, recently worked on a review paper entitled "Neural Overlap in Processing Music and Speech," in which her laboratory assesses the evidence for "sharing" between the processing of music and language in the brain. I called her up last week to find out more about the magical effect of music on the brain, and why jazz is particularly special.

Noisey: Hi Dominique! So, tell me in terms I would understand, what does it mean to talk about an overlap between the way the brain processes music and language?
Dominique: We might have a language test that lights up a certain part of the temporal lobe, and then we might have a music test that lights up that same part of the brain. So you have what looks like the same activations between the language test and the music test. It's these instances that make people say there is overlap between how we process music and language.

The problem is that sometimes when people say "overlap," what they're really trying to say is "sharing." The conclusion they're trying to draw is that music and language depend on the same neural resources in order to be processed in the brain, but that might not strictly be true.

Right.
A lot of this work comes from around ten or fifteen years ago, when the psychologist Steven Pinker came to a music cognition conference and gave a keynote. He referred to music as "auditory cheesecake" and was basically arguing that music is not functional for anything; it's not useful and it's not important; it's just piggy-backing on top of our capacity for language. Sentiments like that have fuelled research into the overlap between music and language over the last decade.

So he said language came before music, or "auditory cheesecake," in evolutionary terms. Is that what you think, that we've evolved with the natural disposition for language and that music is just a happy but slightly useless byproduct of this?
I don't think we know enough to be able to make a distinction between the two. But that's one of the overwhelming questions driving this research. You can look at it from Pinker's perspective, which is that language is the most important and most functional ability. But you can also look at it the other way around. If you think about what is involved in processing language, you can see that language actually has extra processes. For example, music doesn't have semantics the way language does, so language requires an additional level of processing. From a certain viewpoint, then, we can say that music is actually more primitive than language. Which means some people can say that language is actually piggy-backing on what we might have had before for processing music and sound.

If we can use music to communicate emotion and feeling, then isn't music a language in some limited sense?
You can certainly say that it's a language in a metaphorical sense, in terms of communication and in terms of emotion. There's a lot of research looking at the overlap between music and language in terms of the ways emotion is represented in the two, and the evidence is conflicting: sometimes there are shared resources and shared processing, and sometimes there aren't.

There was another study (from Johns Hopkins University) which found that during jazz improvisation, the areas of the brain that process the grammatical structure of sentences were active, while those that process the meaning of language were shut down. Now, I know music doesn't have verbs and nouns, that kind of thing, but in terms of comprehension, is deciphering music sometimes the same as deciphering the written word?
So, there are some really interesting results in that study. Improvisation seems to have activated the syntax-processing areas on both sides of the brain, in what's called the inferior frontal gyrus (IFG). What you usually see is that language studies activate the left IFG (the "language syntax processor") and music studies activate the one in the right hemisphere (the "music syntax processor"). Here we see both the right and left IFG activated, suggesting that improvisation calls on something more than the usual music perception tasks do. This "more" could very well have something to do with grammatical processing.

So in short, we definitely have some sort of overlap between how we process music and language, but more research is needed into where it occurs and how extensive it is. Could future research start to help people with literacy and language learning?
Basically, the general idea is that if we can find situations in which there is actual neural sharing between music and language, and therefore cognitive sharing, we can use one domain to help the other. It's difficult to pin down what exactly those points of sharing really are, but the research is sort of headed in that direction.

Go on.
So you can imagine, for instance, that if you had somebody who has a stroke that wipes out their language faculties, you might be able to use some sort of musical therapy in order to help them. And actually, this type of therapy exists: the most common one is called Melodic Intonation Therapy, where you rehabilitate stroke patients by singing sentences to them in order to help them use musical processing to sort of scaffold their recovery of language.

Wow, that would be genuinely incredible! Thanks for speaking with me Dominique.

Huw Oliver is on Twitter.