Babies are ready to learn their native language before they are born

Babies learn more quickly the language they heard most frequently in the womb, according to a study that measured the brain waves of 33 newborns and shows that linguistic experience shapes the brain even before birth.

When adults try to learn a language, we are often amazed at how easily babies pronounce their first words during their first year of life. New research helps explain this ability: it found that even before birth, babies’ brains prepare to acquire the language to which they have been most frequently exposed in the womb.

The findings provide “the most compelling evidence to date that language experience already shapes the functional organization of the infant brain, even before birth,” the researchers told the American Association for the Advancement of Science (AAAS).

Most newborns are considered “universal listeners,” meaning they are capable of learning any human language, but by the time they turn one year old, children’s brains specialize in the sounds of their native language. Although this first year is key for language development, research suggests that prenatal experience can also help lay the foundation for speech and auditory perception.

Between five and seven months of pregnancy, a fetus can begin to hear sounds outside the womb, and it has been shown that just a few days after birth babies prefer their mother’s voice and native language. Additionally, newborns can recognize rhythms and melodies they heard in the womb, and prenatal exposure to music can help them develop musical skills, but it was not clear whether the same could be said for language.

In the new study, Benedetta Mariani, a student at the University of Padua’s Padua Neuroscience Center, and her colleagues found that sleeping newborns who had recently been exposed to their mother’s native language showed brain signals associated with long-term speech and language learning. Their findings have been published in Science Advances.

Babies identify the language they heard in the womb

The researchers recruited 33 native French-speaking pregnant women from the maternity ward of the Robert Debré Hospital in Paris and used a technique called electroencephalography (EEG) to monitor their babies’ brain waves between one and five days after birth. “In adults, we know that a series of neuronal oscillations, or brain waves, play a role in the understanding of speech and language,” explained co-author Judit Gervain, professor in the department of social and developmental psychology at the University of Padua and principal investigator at the Integrative Neuroscience and Cognition Center of the CNRS and Université Paris Cité. “Waves oscillating at different frequencies align with the rhythms of different speech units, such as syllables or individual speech sounds.”

“The effect of linguistic experience before birth turns out to be a determining factor in language processing and acquisition already during the first days after birth”

The researchers used EEG to determine whether this brain architecture, present in adults with far more linguistic experience, already exists to some degree in the newborn brain and, if so, whether the rhythms newborns’ brains produce align with the rhythms of the language they heard most frequently in the womb.

While the babies slept, the researchers played French, Spanish and English versions of the children’s fairy tale Goldilocks and the Three Bears in various orders. Each series began and ended with three minutes of silence, during which the researchers recorded the brain waves of the babies, who had previously been fitted with caps containing 10 active electrodes placed over brain regions associated with auditory and speech perception in children.

The electrodes measured electrophysiological activity as frequency signals, helping the researchers determine whether listening to these languages activated brain waves associated with processing different elements of speech, such as theta oscillations (4 to 8 Hertz), which are related to hearing syllables, or gamma oscillations (30 to 60 Hertz), which are related to the individual units of sound known as phonemes.
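The idea of separating a signal’s power into theta and gamma bands can be illustrated with a short sketch. This is not the study’s actual analysis pipeline; it is a minimal example, assuming a synthetic signal with a dominant 6 Hz (theta-range) rhythm and a typical 250 Hz sampling rate, that computes power in each band from the FFT spectrum:

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Sum of FFT power-spectrum values within a frequency band [low, high) Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs < high)
    return psd[mask].sum()

fs = 250  # Hz, an illustrative EEG sampling rate
t = np.arange(0, 10, 1.0 / fs)
rng = np.random.default_rng(42)
# Synthetic "EEG": a dominant 6 Hz rhythm (theta range) plus weak noise
signal = np.sin(2 * np.pi * 6 * t) + 0.2 * rng.standard_normal(len(t))

theta = band_power(signal, fs, 4, 8)    # syllable-rate band
gamma = band_power(signal, fs, 30, 60)  # phoneme-rate band
print(theta > gamma)  # True: the theta band dominates this signal
```

A real EEG analysis would use filtering and artifact rejection far beyond this, but the sketch shows how activity at syllable-like and phoneme-like rates can be quantified separately.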

The EEG signals were processed using a method that measures the degree of “memory” (long-range correlations) they contain, Mariani explained. “In our case, this measurement showed evidence of language learning, that is, lasting changes in brain dynamics after exposure to language, specifically to the language heard before birth,” Mariani said.
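One standard way to quantify this kind of “memory” in a signal is detrended fluctuation analysis (DFA), whose scaling exponent distinguishes uncorrelated noise (alpha near 0.5) from signals with persistent long-range correlations (alpha above 0.5). The article does not specify the exact method the authors used, so the following is only an illustrative sketch of the general technique:

```python
import numpy as np

def dfa(signal, scales):
    """Detrended fluctuation analysis: estimate the scaling exponent alpha,
    a measure of long-range correlations ("memory") in a time series."""
    # Integrate the mean-centered signal to build the profile
    profile = np.cumsum(signal - np.mean(signal))
    fluctuations = []
    for s in scales:
        n_windows = len(profile) // s
        # Split the profile into non-overlapping windows of length s
        windows = profile[: n_windows * s].reshape(n_windows, s)
        x = np.arange(s)
        rms = []
        for w in windows:
            # Remove the local linear trend within each window
            coeffs = np.polyfit(x, w, 1)
            detrended = w - np.polyval(coeffs, x)
            rms.append(np.sqrt(np.mean(detrended ** 2)))
        fluctuations.append(np.mean(rms))
    # alpha is the slope of log F(s) versus log s
    alpha = np.polyfit(np.log(scales), np.log(fluctuations), 1)[0]
    return alpha

rng = np.random.default_rng(0)
white_noise = rng.standard_normal(8000)
alpha = dfa(white_noise, scales=[16, 32, 64, 128, 256])
print(f"DFA exponent for white noise: {alpha:.2f}")  # close to 0.5
```

On memoryless white noise the exponent stays near 0.5; a signal whose dynamics were lastingly altered by prior exposure would show a different scaling.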

Raquel Fernández Fuertes, director of the Language Acquisition Laboratory of the University of Valladolid (UVALAL), who did not participate in the research, explained in statements to SMC Spain that in this study: “The babies successfully identify the language to which they have been exposed prenatally and distinguish it from unknown languages, regardless of whether those languages are prosodically similar or more distant. Furthermore, in this same initial state babies are sensitive to larger prosodic units (i.e., syllables, the theta band) and not to smaller ones (i.e., phonemes, the gamma band), since the larger units are the ones they are exposed to in the language heard prenatally. The effect of linguistic experience before birth turns out to be a determining factor in language processing and acquisition already during the first days after birth.

“This study leaves the door open to other questions that would help complete what we know about how the brain develops and processes language, for example, analyzing tonal languages, in which intonation patterns imply a change in meaning that does not occur in non-tonal languages such as the three considered in this study. Furthermore, it remains to be investigated whether the facilitation effects described here for language also extend to other domains (for example, music),” concludes the expert.
