Neural Networks : A Comprehensive Foundation / Edition 2

by Simon Haykin
ISBN-10: 0132733501

ISBN-13: 9780132733502

Pub. Date: 07/06/1998

Publisher: Prentice Hall

Overview

NEW TO THIS EDITION

  • NEW—New chapters now cover such areas as:
    • Support vector machines.
    • Reinforcement learning/neurodynamic programming.
    • Dynamically driven recurrent networks.
    • NEW—End-of-chapter problems revised, improved, and expanded in number.

    FEATURES

    • Extensive, state-of-the-art coverage exposes readers to the many facets of neural networks and helps them appreciate the technology's capabilities and potential applications.
    • Detailed analysis of back-propagation learning and multi-layer perceptrons.
    • Explores the intricacies of the learning process—an essential component for understanding neural networks.
    • Considers recurrent networks, such as Hopfield networks, Boltzmann machines, and mean-field theory machines, as well as modular networks, temporal processing, and neurodynamics.
    • Integrates computer experiments throughout, giving the opportunity to see how neural networks are designed and perform in practice.
    • Reinforces key concepts with chapter objectives, problems, worked examples, a bibliography, photographs, illustrations, and a thorough glossary.
    • Includes a detailed and extensive bibliography for easy reference.
    • Computer-oriented experiments, distributed throughout the book, use MATLAB SE Version 5.
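The book's computer experiments use MATLAB, but the flavor of one such experiment, back-propagation training of a small multilayer perceptron (the subject of Chapter 4), can be sketched in Python with NumPy. This is an illustrative sketch, not code from the book; the XOR task, 2-2-1 network size, initialization, and learning rate are all arbitrary choices made here.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_xor(epochs=5000, lr=0.5, seed=0):
    """Back-propagation for a 2-2-1 sigmoid network on the XOR task.

    Returns the mean-squared error before and after training.
    (Hypothetical example; not from the book.)
    """
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)
    W1 = rng.normal(0.0, 1.0, (2, 2)); b1 = np.zeros(2)   # input -> hidden
    W2 = rng.normal(0.0, 1.0, (2, 1)); b2 = np.zeros(1)   # hidden -> output
    losses = []
    for _ in range(epochs):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)        # hidden activations, shape (4, 2)
        out = sigmoid(h @ W2 + b2)      # network outputs, shape (4, 1)
        losses.append(float(np.mean((out - y) ** 2)))
        # Backward pass: local gradients (deltas) via the chain rule.
        d_out = (out - y) * out * (1.0 - out)
        d_h = (d_out @ W2.T) * h * (1.0 - h)
        # Gradient-descent weight updates.
        W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)
    return losses[0], losses[-1]

initial_loss, final_loss = train_xor()
```

The backward pass is the standard delta-rule computation that the book develops for multilayer perceptrons: each layer's error signal is the downstream error propagated back through the weights and scaled by the local sigmoid derivative.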

Product Details

ISBN-13: 9780132733502
Publisher: Prentice Hall
Publication date: 07/06/1998
Edition description: REV
Pages: 842
Product dimensions: 6.92(w) x 9.36(h) x 1.61(d)

Table of Contents

1. Introduction.
2. Learning Processes.
3. Single-Layer Perceptrons.
4. Multilayer Perceptrons.
5. Radial-Basis Function Networks.
6. Support Vector Machines.
7. Committee Machines.
8. Principal Components Analysis.
9. Self-Organizing Maps.
10. Information-Theoretic Models.
11. Stochastic Machines and Their Approximates Rooted in Statistical Mechanics.
12. Neurodynamic Programming.
13. Temporal Processing Using Feedforward Networks.
14. Neurodynamics.
15. Dynamically Driven Recurrent Networks.
Epilogue.
Bibliography.
Index.
