Principles of Neural Information Theory: Computational Neuroscience and Metabolic Efficiency

by James V Stone

Paperback

$34.95 

Overview

The brain is the most complex computational machine known to science, even though its components (neurons) are slow and unreliable compared to a laptop computer. In this richly illustrated book, Shannon's mathematical theory of information is used to explore the metabolic efficiency of neurons, with special reference to visual perception. Evidence from a diverse range of research papers is used to show how information theory defines absolute limits on neural efficiency, limits which ultimately determine the neuroanatomical microstructure of the eye and brain. Written in an informal style, with a comprehensive glossary, tutorial appendices, explainer boxes, and a list of annotated Further Readings, this book is an ideal introduction to cutting-edge research in neural information theory.
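
For a taste of the kind of absolute limit involved, Shannon's result for the Gaussian channel (the subject of Section 2.7) states that a channel with signal power S and additive Gaussian noise power N can convey at most C = 0.5 log2(1 + S/N) bits per sample, however cleverly the signal is encoded. The minimal Python sketch below evaluates that bound; the function name and the numbers are illustrative only and are not taken from the book.

    import math

    def gaussian_channel_capacity(signal_power, noise_power):
        """Shannon capacity of a discrete-time Gaussian channel,
        in bits per sample: C = 0.5 * log2(1 + S/N). No encoding
        scheme can exceed this rate without incurring errors."""
        return 0.5 * math.log2(1.0 + signal_power / noise_power)

    # Illustrative numbers only: at a signal-to-noise ratio of 10,
    # the ceiling is 0.5 * log2(11), i.e. about 1.73 bits per sample.
    print(gaussian_channel_capacity(signal_power=10.0, noise_power=1.0))

Because capacity grows only logarithmically with signal power, doubling the energy spent on a signal buys far less than double the information, which hints at the efficiency-rate trade-offs explored in Chapter 4.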


Product Details

ISBN-13: 9780993367922
Publisher: Tutorial Introductions
Publication date: 05/15/2018
Series: Tutorial Introductions
Pages: 214
Product dimensions: 6.00 (w) x 9.00 (h) x 0.45 (d) inches

About the Author

Dr James Stone is an Honorary Reader in Vision and Computational Neuroscience at the University of Sheffield, England.

Table of Contents

1. In the Light of Evolution

1.1. Introduction

1.2. All That We See

1.3. In the Light of Evolution

1.4. In Search of General Principles

1.5. Information Theory and Biology

1.6. An Overview of Chapters

2. Information Theory

2.1. Introduction

2.2. Finding a Route, Bit by Bit

2.3. Information and Entropy

2.4. Maximum Entropy Distributions

2.5. Channel Capacity

2.6. Mutual Information

2.7. The Gaussian Channel

2.8. Fourier Analysis

2.9. Summary

3. Measuring Neural Information

3.1. Introduction

3.2. The Neuron

3.3. Why Spikes?

3.4. Neural Information

3.5. Gaussian Firing Rates

3.6. Information About What?

3.7. Does Timing Precision Matter?

3.8. Rate Codes and Timing Codes

3.9. Summary

4. Pricing Neural Information

4.1. Introduction

4.2. The Efficiency-Rate Trade-Off

4.3. Paying With Spikes

4.4. Paying With Hardware

4.5. Paying With Power

4.6. Optimal Axon Diameter

4.7. Optimal Distribution of Axon Diameters

4.8. Axon Diameter and Spike Speed

4.9. Optimal Mean Firing Rate

4.10. Optimal Distribution of Firing Rates

4.11. Optimal Synaptic Conductance

4.12. Summary

5. Encoding Colour

5.1. Introduction

5.2. The Eye

5.3. How Aftereffects Occur

5.4. The Problem With Colour

5.5. A Neural Encoding Strategy

5.6. Encoding Colour

5.7. Why Aftereffects Occur

5.8. Measuring Mutual Information

5.9. Maximising Mutual Information

5.10. Principal Component Analysis

5.11. PCA and Mutual Information

5.12. Evidence for Efficiency

5.13. Summary

6. Encoding Time

6.1. Introduction

6.2. Linear Models

6.3. Neurons and Wine Glasses

6.4. The LNP Model

6.5. Estimating LNP Parameters

6.6. The Predictive Coding Model

6.7. Estimating Predictive Coding Parameters

6.8. Evidence for Predictive Coding

6.9. Summary

7. Encoding Space

7.1. Introduction

7.2. Spatial Frequency

7.3. Do Ganglion Cells Decorrelate Images?

7.4. Optimal Receptive Fields: Overview

7.5. Receptive Fields and Information

7.6. Measuring Mutual Information

7.7. Maximising Mutual Information

7.8. van Hateren’s Model

7.9. Predictive Coding of Images

7.10. Evidence for Predictive Coding

7.11. Is Receptive Field Spacing Optimal?

7.12. Summary

8. Encoding Visual Contrast

8.1. Introduction

8.2. The Compound Eye

8.3. Not Wasting Capacity

8.4. Measuring the Eye’s Response

8.5. Maximum Entropy Encoding

8.6. Evidence for Maximum Entropy Coding

8.7. Summary

9. The Neural Rubicon

9.1. Introduction

9.2. The Darwinian Cost of Efficiency

9.3. Crossing the Neural Rubicon

Further Reading

Appendices

A. Glossary

B. Mathematical Symbols

C. Correlation and Independence

D. A Vector Matrix Tutorial

E. Neural Information Methods

F. Key Equations

References

Index
