How our brain produces language and thought, according to neuroscientists

For thousands of years, philosophers have debated the purpose of language. Plato believed it was essential to thinking. Thought “is the silent interior conversation of the soul with itself,” he wrote.

Many modern scholars have advanced similar views. Beginning in the 1960s, Noam Chomsky, a linguist at MIT, argued that we use language for reasoning and other forms of thought. “If there is a great deficit of language, there will be a great deficit of thought,” he wrote.

As an undergraduate, Evelina Fedorenko took a class with Dr. Chomsky and heard him describe his theory. “I really liked the idea,” she recalled. But she was puzzled by the lack of evidence. “A lot of what he was saying was just stated as fact — the truth,” she said.

Dr. Fedorenko went on to become a cognitive neuroscientist at MIT, using brain scanning to investigate how the brain produces language. And after 15 years, her research has led her to a startling conclusion: We don’t need language to think.

“When you start evaluating it, you just don’t find support for this role of language in thinking,” she said.

When Dr. Fedorenko began this work in 2009, studies had found that the same brain regions required for language were also active when people reasoned or performed arithmetic.

But Dr. Fedorenko and other researchers discovered that this overlap was a mirage. Part of the problem with the early results was that the scanners were relatively crude. The scientists made the most of their fuzzy scans by combining the results from all their volunteers, creating an overall average of brain activity.

In her research, Dr. Fedorenko used more powerful scanners and performed more tests on each volunteer. These steps allowed her and her colleagues to collect enough data from each person to create a detailed picture of an individual brain.

The scientists then conducted studies to identify the brain circuits involved in language tasks, such as retrieving words from memory and following grammar rules. In a typical experiment, volunteers read gibberish, followed by real sentences. The scientists discovered certain brain regions that became active only when the volunteers processed actual language.

Each volunteer had a language network—a set of regions that are activated during language tasks. “It’s very stable,” said Dr. Fedorenko. “If I scan you today, and 10 or 15 years from now, it will be in the same place.”

The researchers then scanned the same people as they performed different types of thinking, such as solving a puzzle. “Other regions in the brain are working really hard when you’re doing all these forms of thinking,” she said. But the language networks remained silent. “It became clear that none of these things seem to involve language circuits,” she said.

In a paper published Wednesday in Nature, Dr. Fedorenko and her colleagues argued that studies of people with brain injuries point to the same conclusion.

Strokes and other forms of brain damage can wipe out the language network, leaving people struggling to process words and grammar, a condition known as aphasia. But scientists have found that people can still do algebra and play chess even with aphasia. In experiments, people with aphasia can look at two numbers—123 and 321, say—and realize that, using the same pattern, 456 must be followed by 654.

If language is not essential to thinking, then what is language for? Communication, Dr. Fedorenko and her colleagues argue. Dr. Chomsky and other scholars have rejected this idea, noting the ambiguity of words and the difficulty of expressing our intuitions out loud. “The system is not well designed in many functional respects,” Dr. Chomsky wrote.

But major studies have suggested that languages are optimized to transfer information clearly and efficiently.

In one study, researchers found that frequently used words are shorter, making languages easier to learn and speeding up the flow of information. In another study, researchers investigating 37 languages found that grammar rules place words close together so that their combined meaning is easier to understand.

Kyle Mahowald, a linguist at the University of Texas at Austin who was not involved in the new work, said the separation of thought and language may help explain why artificial intelligence systems like ChatGPT are so good at some tasks and so bad at others.

Computer scientists train these programs on large amounts of text, from which the programs extract rules for how words go together. Dr. Mahowald suspects that these programs are beginning to mimic the language network in the human brain — but that they fail to reason.

“It is possible to have very fluent grammatical text that may or may not have a coherent underlying thought,” said Dr. Mahowald.

But Guy Dove, a philosopher at the University of Louisville, thought that Dr. Fedorenko and her colleagues were going too far in banishing language from thought—especially complex thought. “When we’re thinking about democracy, we can try talking about democracy,” he said. “You don’t need language to have thoughts, but it can be an improvement.”
