Memory and the Computational Brain: Why Cognitive Science will Transform Neuroscience / Edition 1

by C. R. Gallistel, Adam Philip King

ISBN-10: 1405122870

ISBN-13: 9781405122870

Pub. Date: 05/11/2009

Publisher: Wiley

Overview

Memory and the Computational Brain spans the fields of cognitive science, computer science, psychology, ethology, neuroscience, and molecular biology to suggest new perspectives on how we think about learning mechanisms in the brain.

Gallistel and King propose that the architecture of the brain is structured precisely for learning and for memory, and that the concept of an addressable read/write memory mechanism should be integrated into the foundations of neuroscience. They argue that neuroscience can and should benefit from recent advances in cognitive science and from the development of information theory over recent decades. Based on three lectures given by Randy Gallistel in the prestigious Blackwell/Maryland Lectures in Language and Cognition, the text has been significantly revised and expanded with numerous interdisciplinary examples and models, and it reflects recent research, making it essential reading both for students and for those working in the field.

Product Details

ISBN-13: 9781405122870
Publisher: Wiley
Publication date: 05/11/2009
Series: Blackwell/Maryland Lectures in Language and Cognition Series, #3
Pages: 336
Product dimensions: 6.70(w) x 9.80(h) x 1.00(d)

Table of Contents

Preface viii

1 Information 1

Shannon's Theory of Communication 2

Measuring Information 7

Efficient Coding 16

Information and the Brain 20

Digital and Analog Signals 24

Appendix: The Information Content of Rare Versus Common Events and Signals 25

2 Bayesian Updating 27

Bayes' Theorem and Our Intuitions about Evidence 30

Using Bayes' Rule 32

Summary 41

3 Functions 43

Functions of One Argument 43

Composition and Decomposition of Functions 46

Functions of More than One Argument 48

The Limits to Functional Decomposition 49

Functions Can Map to Multi-Part Outputs 49

Mapping to Multiple-Element Outputs Does Not Increase Expressive Power 50

Defining Particular Functions 51

Summary: Physical/Neurobiological Implications of Facts about Functions 53

4 Representations 55

Some Simple Examples 56

Notation 59

The Algebraic Representation of Geometry 64

5 Symbols 72

Physical Properties of Good Symbols 72

Symbol Taxonomy 79

Summary 82

6 Procedures 85

Algorithms 85

Procedures, Computation, and Symbols 87

Coding and Procedures 89

Two Senses of Knowing 100

A Geometric Example 101

7 Computation 104

Formalizing Procedures 105

The Turing Machine 107

Turing Machine for the Successor Function 110

Turing Machines for f_is_even 111

Turing Machines for f_+ 115

Minimal Memory Structure 121

General Purpose Computer 122

Summary 124

8 Architectures 126

One-Dimensional Look-Up Tables (If-Then Implementation) 128

Adding State Memory: Finite-State Machines 131

Adding Register Memory 137

Summary 144

9 Data Structures 149

Finding Information in Memory 151

An Illustrative Example 160

Procedures and the Coding of Data Structures 165

The Structure of the Read-Only Biological Memory 167

10 Computing with Neurons 170

Transducers and Conductors 171

Synapses and the Logic Gates 172

The Slowness of It All 173

The Time-Scale Problem 174

Synaptic Plasticity 175

Recurrent Loops in Which Activity Reverberates 183

11 The Nature of Learning 187

Learning As Rewiring 187

Synaptic Plasticity and the Associative Theory of Learning 189

Why Associations are Not Symbols 191

Distributed Coding 192

Learning As the Extraction and Preservation of Useful Information 196

Updating an Estimate of One's Location 198

12 Learning Time and Space 207

Computational Accessibility 207

Learning the Time of Day 208

Learning Durations 211

Episodic Memory 213

13 The Modularity of Learning 218

Example 1: Path Integration 219

Example 2: Learning the Solar Ephemeris 220

Example 3: "Associative" Learning 226

Summary 241

14 Dead Reckoning in a Neural Network 242

Reverberating Circuits as Read/Write Memory Mechanisms 245

Implementing Combinatorial Operations by Table-Look-Up 250

The Full Model 251

The Ontogeny of the Connections? 252

How Realistic Is the Model? 254

Lessons to Be Drawn 258

Summary 265

15 Neural Models of Interval Timing 266

Timing an Interval on First Encounter 266

Dworkin's Paradox 268

Neurally Inspired Models 269

The Deeper Problems 276

16 The Molecular Basis of Memory 278

The Need to Separate Theory of Memory from Theory of Learning 278

The Coding Question 279

A Cautionary Tale 281

Why Not Synaptic Conductance? 282

A Molecular or Sub-Molecular Mechanism? 283

Bringing the Data to the Computational Machinery 283

Is It Universal? 286

References 288

Glossary 299

Index 312
