Learning in Graphical Models

Overview

Graphical models, a marriage between probability theory and graph theory, provide a natural tool for dealing with two problems that occur throughout applied mathematics and engineering -- uncertainty and complexity. In particular, they play an increasingly important role in the design and analysis of machine learning algorithms. Fundamental to the idea of a graphical model is the notion of modularity: a complex system is built by combining simpler parts. Probability theory serves as the glue whereby the parts are combined, ensuring that the system as a whole is consistent and providing ways to interface models to data. Graph theory provides both an intuitively appealing interface by which humans can model highly interacting sets of variables and a data structure that lends itself naturally to the design of efficient general-purpose algorithms.
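
The modularity idea can be made concrete with a small example. The following Python sketch is written for this summary, not taken from the book: it builds the classic rain/sprinkler/wet-grass network from three local conditional probability tables, forms the joint distribution as their product, and answers a query by brute-force enumeration. All probabilities are invented for illustration.

# A minimal sketch (not from the book) of the modularity idea above:
# a joint distribution over three binary variables is assembled from
# small local pieces (conditional probability tables), and a query is
# answered by multiplying those pieces and summing out the variables
# we do not care about. All numbers are made up for illustration.
from itertools import product

p_rain = {True: 0.2, False: 0.8}                      # P(Rain)
p_sprinkler = {                                        # P(Sprinkler | Rain)
    True:  {True: 0.01, False: 0.99},
    False: {True: 0.40, False: 0.60},
}
p_wet = {                                              # P(Wet | Rain, Sprinkler)
    (True, True):   {True: 0.99, False: 0.01},
    (True, False):  {True: 0.80, False: 0.20},
    (False, True):  {True: 0.90, False: 0.10},
    (False, False): {True: 0.00, False: 1.00},
}

def joint(rain, sprinkler, wet):
    """Joint probability as the product of the three local factors."""
    return p_rain[rain] * p_sprinkler[rain][sprinkler] * p_wet[(rain, sprinkler)][wet]

# Query: P(Rain = True | Wet = True), by brute-force enumeration.
numerator = sum(joint(True, s, True) for s in (True, False))
evidence = sum(joint(r, s, True) for r, s in product((True, False), repeat=2))
print(f"P(rain | wet grass) = {numerator / evidence:.3f}")

Enumeration like this scales exponentially in the number of variables; the tutorial chapters in the book develop the efficient alternatives (junction-tree, variational, and Monte Carlo inference) that make larger models practical.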

This book presents an in-depth exploration of issues related to learning within the graphical model formalism. Four chapters are tutorial chapters -- Robert Cowell on Inference for Bayesian Networks, David MacKay on Monte Carlo Methods, Michael I. Jordan et al. on Variational Methods, and David Heckerman on Learning with Bayesian Networks. The remaining chapters cover a wide range of topics of current research interest.

The MIT Press

Editorial Reviews

Booknews
Collects 23 contributions on this subject at the confluence of probability theory and graph theory. Topics include: probabilistic inference using exact, variational, and Monte Carlo algorithms; Markov properties and chain graphs; stochastic dependence; Bayesian methods for parameter and structure learning; and tests for conditional independence in graphical Gaussian models. The volume arose from the September 1996 proceedings of the International School on Neural Nets in Erice, Italy. Annotation © Book News, Inc., Portland, OR.

Meet the Author

Michael I. Jordan is Professor of Computer Science and of Statistics at the University of California, Berkeley, and recipient of the ACM/AAAI Allen Newell Award.

Table of Contents

Preface 1
Introduction to Inference for Bayesian Networks 9
Advanced Inference in Bayesian Networks 27
Inference in Bayesian Networks using Nested Junction Trees 51
Bucket Elimination: A Unifying Framework for Probabilistic Inference 75
An Introduction to Variational Methods for Graphical Models 105
Improving the Mean Field Approximation via the Use of Mixture Distributions 163
Introduction to Monte Carlo Methods 175
Suppressing Random Walks in Markov Chain Monte Carlo using Ordered Overrelaxation 205
Chain Graphs and Symmetric Associations 231
The Multiinformation Function as a Tool for Measuring Stochastic Dependence 261
A Tutorial on Learning with Bayesian Networks 301
A View of the EM Algorithm that Justifies Incremental, Sparse, and Other Variants 355
Latent Variable Models 371
Stochastic Algorithms for Exploratory Data Analysis: Data Clustering and Data Visualization 405
Learning Bayesian Networks with Local Structure 421
Asymptotic Model Selection for Directed Networks with Hidden Variables 461
A Hierarchical Community of Experts 479
An Information-Theoretic Analysis of Hard and Soft Assignment Methods for Clustering 495
Learning Hybrid Bayesian Networks from Data 521
A Mean Field Learning Algorithm for Unsupervised Neural Networks 541
Edge Exclusion Tests for Graphical Gaussian Models 555
Hepatitis B: A Case Study in MCMC 575
Prediction with Gaussian Processes: From Linear Regression to Linear Prediction and Beyond 599
Subject Index 623