How Deeply Human Is Language?: Chomsky, the Brain, and the AI Fantasy
A leading neurolinguist explains linguistic theory and large language models—the top contenders for understanding human language—and evaluates them in the context of the brain.
Contemporary linguistics, founded and inspired by Noam Chomsky, seeks to understand the hallmark of our humanity—language. Linguists develop powerful tools to discover how knowledge of language is acquired and how the brain puts it to use. AI experts, using vastly different methods, create remarkable neural networks—large language models (LLMs) such as ChatGPT—said to learn and use language like us.
Chomsky called LLMs “a false promise.” AI leader Geoffrey Hinton has declared that “neural nets are much better at processing language than anything ever produced by the Chomsky School of Linguistics.”
Who is right, and how can we tell? Do we learn everything from scratch, or could some knowledge be innate? Is our brain one big network, or is it built out of modules, language being one of them?
In How Deeply Human Is Language?, Yosef Grodzinsky explains both approaches and confronts them with the reality as it emerges from the engineering, the linguistic, and the neurological record. He walks readers through vastly different methods, tools, and findings from all these fields. Aiming to find a common path forward, he describes the conflict, but also locates points of potential contact, and sketches a joint research program that may unite these communities in a common effort to understand knowledge and learning in the brain.
Product Details
| ISBN-13: | 9780262052009 |
|---|---|
| Publisher: | MIT Press |
| Publication date: | 04/21/2026 |
| Pages: | 192 |
| Product dimensions: | 6.00(w) x 9.00(h) x (d) |