How the brain distinguishes language from music

Some seemingly simple questions turn out to open onto much larger ones. Christina Vanden Bosch der Nederlanden, of the University of Toronto, made just such an observation. A cellist from an early age who went on to become a neuroscientist, she long wondered how the brain tells music and language apart. “We know that children as young as 4 can easily distinguish between music and language,” she recalls. “While it seems fairly obvious, there was little to no data asking children to make these kinds of distinctions.” She addresses that gap in a study presented at the annual meeting of the Cognitive Neuroscience Society (CNS) in San Francisco.

Opposite results in infants and adults

This work stems from an experiment with 4-month-old infants, who heard both spoken sentences and songs, delivered either in the slightly sing-song voice adults use with young children or in a monotone. Meanwhile, the researchers recorded the infants' brain activity with electroencephalography (EEG). They found that infants track spoken sentences more closely than sung ones, whereas the opposite holds for adults, who track words better when they are sung. The researchers also note that pitch and rhythm shape this brain activity. In their view, the instability of pitch is an important acoustic feature for directing attention in infants: according to Christina Vanden Bosch der Nederlanden, pitch stability helps a listener identify a song, and conversely, instability signals to the infant that they are hearing someone speaking rather than singing.

Asking people how music and language differ

In an online experiment, the scientist and her colleagues asked children and adults to describe, in qualitative terms, how music and language differ. “Both children and adults identified features such as tempo, pitch, and rhythm as important for distinguishing between speech and song,” she says. “This gave me a large dataset that says a lot about how people think music and language differ acoustically, and also about how the functional roles of music and language differ in our daily lives.”

Future clinical applications?

Understanding the relationship between music and language can “help explore fundamental questions of human cognition, such as why humans need music and language, and how humans communicate and interact with each other through these forms,” believes Andrew Chang, another participant in the CNS meeting. These findings also pave the way for new clinical trials that could examine music as an alternative form of verbal communication for people with aphasia, who have lost all or part of their ability to speak.
