Graphical Models: Foundations of Neural Computation

by Michael I. Jordan

ISBN-10: 0262600420

ISBN-13: 9780262600422

Pub. Date: 10/12/2001

Publisher: MIT Press


Overview

Graphical models use graphs to represent and manipulate joint probability distributions. They have their roots in artificial intelligence, statistics, and neural networks. The clean mathematical formalism of the graphical models framework makes it possible to understand a wide variety of network-based approaches to computation, and in particular to understand many neural network algorithms and architectures as instances of a broader probabilistic methodology. It also makes it possible to identify novel features of neural network algorithms and architectures and to extend them to more general graphical models. This book exemplifies the interplay between the general formal framework of graphical models and the exploration of new algorithms and architectures. The selections range from foundational papers of historical importance to results at the cutting edge of research.

Contributors: H. Attias, C. M. Bishop, B. J. Frey, Z. Ghahramani, D. Heckerman, G. E. Hinton, R. Hofmann, R. A. Jacobs, Michael I. Jordan, H. J. Kappen, A. Krogh, R. Neal, S. K. Riis, F. B. Rodríguez, L. K. Saul, Terrence J. Sejnowski, P. Smyth, M. E. Tipping, V. Tresp, Y. Weiss.
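To make the overview's central idea concrete — that a graph encodes a joint probability distribution as a product of local conditional probabilities — here is a minimal sketch (not drawn from the book; the chain structure and all probability values are illustrative assumptions) for the three-node directed model A → B → C, where the graph implies p(a, b, c) = p(a) p(b | a) p(c | b):

```python
# Illustrative conditional probability tables for the chain A -> B -> C.
# All numbers here are made up for the example.
p_a = {0: 0.6, 1: 0.4}                    # p(A)
p_b_given_a = {0: {0: 0.7, 1: 0.3},       # p(B | A)
               1: {0: 0.2, 1: 0.8}}
p_c_given_b = {0: {0: 0.9, 1: 0.1},       # p(C | B)
               1: {0: 0.5, 1: 0.5}}

def joint(a, b, c):
    """Joint probability read off the graph: p(a) * p(b|a) * p(c|b)."""
    return p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]

# Manipulating the distribution via the factorization: marginalize out
# A and B to obtain p(C), and check that the joint sums to one.
p_c = {c: sum(joint(a, b, c) for a in (0, 1) for b in (0, 1))
       for c in (0, 1)}
total = sum(joint(a, b, c)
            for a in (0, 1) for b in (0, 1) for c in (0, 1))
```

Exploiting the factorization like this, rather than storing the full 2×2×2 joint table, is what makes the graph-based algorithms surveyed in the book (belief propagation, variational learning, EM) tractable on larger models.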


Product Details

ISBN-13: 9780262600422
Publisher: MIT Press
Publication date: 10/12/2001
Series: Computational Neuroscience Series
Edition description: New Edition
Pages: 435
Product dimensions: 6.00(w) x 9.00(h) x 1.00(d)
Age Range: 18 Years

Table of Contents

Series Foreword vii
Sources ix
Introduction xi
1. Probabilistic Independence Networks for Hidden Markov Probability Models 1
2. Learning and Relearning in Boltzmann Machines 45
3. Learning in Boltzmann Trees 77
4. Deterministic Boltzmann Learning Performs Steepest Descent in Weight-Space 89
5. Attractor Dynamics in Feedforward Neural Networks 97
6. Efficient Learning in Boltzmann Machines Using Linear Response Theory 121
7. Asymmetric Parallel Boltzmann Machines Are Belief Networks 141
8. Variational Learning in Nonlinear Gaussian Belief Networks 145
9. Mixtures of Probabilistic Principal Component Analyzers 167
10. Independent Factor Analysis 207
11. Hierarchical Mixtures of Experts and the EM Algorithm 257
12. Hidden Neural Networks 291
13. Variational Learning for Switching State-Space Models 315
14. Nonlinear Time-Series Prediction with Missing and Noisy Data 349
15. Correctness of Local Probability Propagation in Graphical Models with Loops 367
Index 409
