Connectionism: A Hands-on Approach / Edition 1

by Michael R. W. Dawson

Paperback

ISBN-10: 1405128070
ISBN-13: 9781405128070
Pub. Date: 04/28/2005
Publisher: Wiley
Product Details

Pages: 208
Product dimensions: 6.70(w) x 9.70(h) x 0.60(d)

About the Author

Michael R. W. Dawson is a member of the Department of Psychology and the Biological Computation Project at the University of Alberta, Canada. He is the author of Understanding Cognitive Science (Blackwell, 1998) and Minds and Machines (Blackwell, 2004).

Table of Contents

1. Hands-on Connectionism.

1.1 Connectionism In Principle And In Practice.

1.2 The Organization Of This Book.

2. The Distributed Associative Memory.

2.1 The Paired Associates Task.

2.2 The Standard Pattern Associator.

2.3 Exploring The Distributed Associative Memory.

3. The James Program.

3.1 Introduction.

3.2 Installing The Program.

3.3 Teaching A Distributed Memory.

3.4 Testing What The Memory Has Learned.

3.5 Using The Program.

4. Introducing Hebb Learning.

4.1 Overview Of The Exercises.

4.2 Hebb Learning Of Basis Vectors.

4.3 Hebb Learning Of Orthonormal, Non-Basis Vectors.

5. Limitations Of Hebb Learning.

5.1 Introduction.

5.2 The Effect Of Repetition.

5.3 The Effect Of Correlation.

6. Introducing The Delta Rule.

6.1 Introduction.

6.2 The Delta Rule.

6.3 The Delta Rule And The Effect Of Repetition.

6.4 The Delta Rule And The Effect Of Correlation.

7. Distributed Networks And Human Memory.

7.1 Background On The Paired Associate Paradigm.

7.2 The Effect Of Similarity On The Distributed Associative Memory.

8. Limitations Of Delta Rule Learning.

8.1 Introduction.

8.2 The Delta Rule And Linear Dependency.

9. The Perceptron.

9.1 Introduction.

9.2 The Limits Of Distributed Associative Memories, And Beyond.

9.3 Properties Of The Perceptron.

9.4 What Comes Next.

10. The Rosenblatt Program.

10.1 Introduction.

10.2 Installing The Program.

10.3 Training A Perceptron.

10.4 Testing What The Memory Has Learned.

11. Perceptrons And Logic Gates.

11.1 Introduction.

11.2 Boolean Algebra.

11.3 Perceptrons And Two-Valued Algebra.

12. Performing More Logic With Perceptrons.

12.1 Two-Valued Algebra And Pattern Spaces.

12.2 Perceptrons And Linear Separability.

12.3 Appendix Concerning The DawsonJots Font.

13. Value Units And Linear Nonseparability.

13.1 Linear Separability And Its Implications.

13.2 Value Units And The Exclusive-Or Relation.

13.3 Value Units And Connectedness.

14. Network By Problem Type Interactions.

14.1 All Networks Were Not Created Equally.

14.2 Value Units And The Two-Valued Algebra.

15. Perceptrons And Generalization.

15.1 Background.

15.2 Generalization And Savings For The 9-Majority Problem.

16. Animal Learning Theory And Perceptrons.

16.1 Discrimination Learning.

16.2 Linearly Separable Versions Of Patterning.

17. The Multilayer Perceptron.

17.1 Creating Sequences Of Logical Operations.

17.2 Multilayer Perceptrons And The Credit Assignment Problem.

17.3 The Implications Of The Generalized Delta Rule.

18. The Rumelhart Program.

18.1 Introduction.

18.2 Installing The Program.

18.3 Training A Multilayer Perceptron.

18.4 Testing What The Network Has Learned.

19. Beyond The Perceptron’s Limits.

19.1 Introduction.

19.2 The Generalized Delta Rule And Exclusive-Or.

20. Symmetry As A Second Case Study.

20.1 Background.

20.2 Solving Symmetry Problems With Multilayer Perceptrons.

21. How Many Hidden Units?

21.1 Background.

21.2 How Many Hidden Value Units Are Required For 5-Bit Parity?

22. Scaling Up With The Parity Problem.

22.1 Overview Of The Exercises.

22.2 Background.

22.3 Exploring The Parity Problem.

23. Selectionism And Parity.

23.1 Background.

23.2 From Connectionism To Selectionism.

24. Interpreting A Small Network.

24.1 Background.

24.2 A Small Network.

24.3 Interpreting This Small Network.

25. Interpreting Networks Of Value Units.

25.1 Background.

25.2 Banding In The First Monks Problem.

25.3 Definite Features In The First Monks Problem.

26. Interpreting Distributed Representations.

26.1 Background.

26.2 Interpreting A 5-Parity Network.

27. Creating Your Own Training Sets.

27.1 Background.

27.2 Designing And Building A Training Set.

References.
