The brain “tunes in” to signs just as it does to words, but on a different frequency
A study led by the BCBL reveals that we have a universal mechanism for processing language, regardless of whether it reaches us through hearing or sight
When we hear someone speak, our brain synchronises with the rhythm of the speaker’s voice in order to understand the message. Now, new research led by the Basque Centre on Cognition, Brain and Language (BCBL) has discovered that this mechanism, known as cortical tracking, is not exclusive to hearing: the brain also ‘tunes in’ visually to sign language movements, although it does so at a different frequency.
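For readers who want a concrete sense of what this synchronisation looks like in analysis terms, the short Python sketch below is a purely hypothetical illustration, not the study's code: cortical tracking is commonly quantified as coherence between the amplitude envelope of speech and a neural recording, and every parameter here (the sampling rate, the 4 Hz rhythm, the 100 ms delay) is invented for the example.

```python
import numpy as np
from scipy.signal import coherence

# All values below are illustrative assumptions, not figures from the study.
fs = 250                      # sampling rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)  # one minute of simulated data

# A nonnegative 4 Hz oscillation standing in for the syllabic speech envelope.
envelope = 0.5 * (1 + np.sin(2 * np.pi * 4 * t))

# Simulated neural signal: a delayed, noisy copy of the envelope,
# mimicking brain activity that follows the speaker's rhythm.
rng = np.random.default_rng(0)
lag = int(0.1 * fs)           # assumed 100 ms neural delay
neural = np.roll(envelope, lag) + 0.5 * rng.standard_normal(t.size)

# High coherence near the stimulus rhythm is the signature of "tracking".
freqs, coh = coherence(envelope, neural, fs=fs, nperseg=4 * fs)
print(f"Peak speech-brain coherence at ~{freqs[np.argmax(coh)]:.2f} Hz")
```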
The study, published in the prestigious scientific journal Proceedings of the National Academy of Sciences (PNAS), provides solid evidence that the human brain uses universal mechanisms to process language, regardless of whether the information arrives through the auditory channel (speech) or the visual channel (signs).
‘Our results confirm that synchronisation between the brain and language is a fundamental characteristic of linguistic processing and that it goes beyond the auditory domain,’ explains Chiara Rivolta, a researcher at BCBL and lead author of the article.
To carry out this work, the research team first had to overcome a measurement challenge: unlike speech, whose rhythm is set by syllables, sign language conveys information through the simultaneous movement of the hands, torso and head, making its rhythm far harder to quantify.
Using motion capture systems (similar to those used in video games and cinema) to measure these movements and magnetoencephalography (MEG) to record brain activity, the team compared two groups of hearing participants: one consisting of experts in Spanish Sign Language (LSE) and the other of people who did not know this language.
The results revealed a clear difference from spoken language. While the brain synchronises with speech at the faster rates set by syllables, in sign language neural activity adjusts to a slower frequency range, known as the delta band (0.5–2.5 Hz).
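As a rough illustration of what ‘tuning in at the delta band’ could mean analytically, the hypothetical Python sketch below (again, not the authors’ actual pipeline) band-passes a simulated movement-speed signal to the 0.5–2.5 Hz range reported in the study; the frame rate and the trajectory are invented for the example.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 120  # motion-capture frame rate in Hz (assumed, not from the study)

def delta_band(signal, fs, low=0.5, high=2.5, order=4):
    """Band-pass a 1-D signal to the delta range reported in the study."""
    b, a = butter(order, [low, high], btype="band", fs=fs)
    return filtfilt(b, a, signal)

# Invented example: reduce 3-D hand positions to an overall movement-speed
# signal, then keep only its slow delta-band rhythm, the component a
# tracking analysis would compare against brain activity (e.g. from MEG).
rng = np.random.default_rng(0)
positions = np.cumsum(rng.standard_normal((fs * 30, 3)), axis=0)  # fake 30 s trajectory
speed = np.linalg.norm(np.diff(positions, axis=0), axis=1) * fs   # frame-to-frame speed
slow_rhythm = delta_band(speed, fs)
print(slow_rhythm[:5])
```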
To visualise this finding, Rivolta suggests imagining the brain as a radio. ‘For spoken language, the tuner looks for fast rhythms. However, for sign language, the brain uses a slower tuner that locks onto the broader rhythms of body movements,’ she explains.