Language has existed for perhaps two million years, but it is only in the last 150 years that we have begun to understand what happens when we listen and speak.
Language is uniquely human. But have you ever stopped to think what happens in your brain when someone speaks to you? How you process words, and how you form a response?
Language processing is a vastly complicated affair that, in truth, scientists are light years from fully understanding. First, let’s look at the discovery of the two epicenters of language processing.
Broca and Wernicke
In 1861, the French physician Paul Broca found that patients with speech deficits had damage to the left hemisphere of the brain. The idea that language is processed in the left hemisphere was born, and the region he discovered was named Broca’s area, thought for years to be solely responsible for speech production.
A decade later, neurologist Carl Wernicke discovered an area toward the rear of the left hemisphere, now called Wernicke’s area, which he linked with processing the words we hear. Broca’s area and Wernicke’s area are connected by a bundle of nerve fibers called the arcuate fasciculus.
From these discoveries came the classic model of language processing: Sounds are processed by the auditory cortex, then sent to Wernicke’s area to be understood, travel along the arcuate fasciculus to Broca’s area and on to the motor cortex, finally resulting in speech. And all of this happens in the left hemisphere. Straightforward.
Too straightforward. This model is a valid platform from which to explore newer findings, and the areas involved are indeed vital for language processing. But rather than a step-by-step relay, it is now known that many areas of the brain are activated simultaneously during language processing. Plus, although each area has its own specialty, they all pitch in and help out with different language processes.
As no other animal has a speech system like ours, scientists were limited to studying the effects of disease and damage on language production for decades. Since then, and largely in the last 20 years, an avalanche of technological advancements has added color and new lines to the crude Broca-Wernicke sketch.
Here are a few recent examples of technology helping us decipher how the brain processes language. In 2008, researchers used diffusion tensor imaging (DTI), a non-invasive imaging technique, and found that the arcuate fasciculus in humans is much larger and projects further than it does in monkeys, leading them to hypothesize that we may have language while other creatures don’t because our brains are simply better connected.
In 2009, San Diego researchers used intracranial electrophysiology (ICE), in which electrodes are placed directly on the brain, to show that Broca’s area is involved in both hearing and producing speech, not just production, as had been previously thought.
In 2011, New York University researchers used magnetoencephalography (MEG), in which sensors placed around the head measure the magnetic fields produced by the brain’s electrical activity. They found that while people process simple two-word phrases, neither Broca’s nor Wernicke’s area is involved. Rather, the left anterior temporal lobe (LATL) and the ventromedial prefrontal cortex (vmPFC) showed increased activity.
And that is just a minute slice of the tireless research currently being undertaken. Despite all the uncertainties and changing theories, there have been some well-established discoveries that tell us how different people process language in different ways.
Left Hemisphere Dominance?
Numerous functional magnetic resonance imaging (fMRI) studies have shown that the left hemisphere is dominant for language processing in around 96% of people, not 100% as Broca originally thought. Then again, Broca based his theory on the examination of just eight patients, a sample size that would be unheard of today.
In the remaining 4%, the right hemisphere is dominant. There is no known reason why one particular hemisphere dominates language processing. Interestingly, a substantial number of that 4% are left-handed. However, some 70% of left-handers process language in the left hemisphere. So while there is a link, handedness does not seem to dictate which hemisphere dominates.
One Hemisphere Only?
A second long-held belief about how we process language is that it takes place in one hemisphere only. However, a review of 23 neuroimaging studies by the University of Leipzig in 2008 concluded that this is not the case.
It is now accepted that while the dominant hemisphere is responsible for the nuts and bolts of language processing, the “minor” hemisphere helps interpret tone, nuance, metaphors and so on. So, if you suffered an injury to your minor hemisphere and I said, “I’m going to kill two birds with one stone,” you might think I was setting off to commit avian murder.
Male vs. Female
We may have long suspected it, but science has shown that men and women are on different wavelengths. A 2000 study by the Indiana University School of Medicine is just one of a raft of fMRI studies showing that men and women listen differently. The study found that men use the dominant hemisphere (usually the left) to listen, while women use both hemispheres.
In addition, a 1997 study by the University of Sydney found that Broca’s area and Wernicke’s area, those two powerhouses of language, are significantly bigger in women than in men, by up to 20%. These differences may partly explain findings that women tend to have better developed language skills than men.
Bilingual vs. Monolingual
Scientists used to think that learning a second language could only be done at the expense of the first, as if there were a finite number of neurons in the brain assigned to language. Now, thanks to research with fMRI, this belief has been debunked.