The brain is the most complex computational machine known to science, even though its components (neurons) are slow and unreliable compared to a laptop computer. In this richly illustrated book, Shannon's mathematical theory of information is used to explore the metabolic efficiency of neurons, with special reference to visual perception. Evidence from a diverse range of research papers is used to show how information theory defines absolute limits on neural efficiency: limits that ultimately determine the neuroanatomical microstructure of the eye and brain. Written in an informal style, with a comprehensive glossary, tutorial appendices, explainer boxes, and a list of annotated further readings, this book is an ideal introduction to cutting-edge research in neural information theory.
Principles of Neural Information Theory: Computational Neuroscience and Metabolic Efficiency
Paperback
Product Details
| | |
|---|---|
| ISBN-13: | 9780993367922 |
| Publisher: | Tutorial Introductions |
| Publication date: | 05/15/2018 |
| Series: | Tutorial Introductions |
| Pages: | 214 |
| Product dimensions: | 6.00(w) x 9.00(h) x 0.45(d) |