Mathematical Foundations of Learning Machines

Overview

Research in Neural Networks brings together contributions from computer science, electrical engineering, physics, statistics, cognitive science and neuroscience. By providing a clear exposition of the mathematical ideas that unify this field, Mathematical Foundations of Learning Machines offers the basis of a rigorous and integrated theory of Neural Networks.

This seminal book is a recognized classic among Neural Network researchers due to Nilsson's presentation of intuitive geometric and statistical theories. Recent developments in Neural Networks and Artificial Intelligence underscore the importance of a strong theoretical basis for research in these areas. Many of the issues raised in this book still stand as challenges to current efforts, giving renewed relevance to proper formulation, analysis, experimentation, and reformulation.

Included in this volume are discussions of learning rates, nonparametric training methods, nonlinear network models, and related issues from computation, control theory and statistics. The emphasis on deterministic methods for solving classification problems is an excellent starting point for readers interested in the probabilistic studies associated with Neural Networks research. Anyone interested in the foundations of Neural Networks and learning in parallel distributed processing systems will find this a valuable book.

Editorial Reviews

Booknews
Neural networks research is unified by contributions from computer science, electrical engineering, physics, statistics, cognitive science and neuroscience. Author Nilsson is recognized for his presentation of intuitive geometric and statistical theories. Annotation © Book News, Inc., Portland, OR (booknews.com)

Product Details

  • ISBN-13: 9781558601239
  • Publisher: Elsevier Science & Technology Books
  • Publication date: 3/7/1991
  • Pages: 156
  • Product dimensions: 6.04 (w) x 9.00 (h) x 0.43 (d)

Table of Contents

The Mathematical Foundations of Learning Machines
by Nils J. Nilsson
Introduction
Preface
1 Trainable Pattern Classifiers
1.1 Machine classification of data
1.2 The basic model
1.3 The problem of what to measure
1.4 Decision surfaces in pattern space
1.5 Discriminant functions
1.6 The selection of discriminant functions
1.7 Training methods
1.8 Summary of book by chapters
1.9 Bibliographical and historical remarks
References

2 Some Important Discriminant Functions: Their Properties and Their Implementations
2.1 Families of discriminant functions
2.2 Linear discriminant functions
2.3 Minimum-distance classifiers
2.4 The decision surfaces of linear machines
2.5 Linear classifications of patterns
2.6 The threshold logic unit (TLU)
2.7 Piecewise linear discriminant functions
2.8 Quadric discriminant functions
2.9 Quadric decision surfaces
2.10 Implementation of quadric discriminant functions
2.11 Φ functions
2.12 The utility of Φ functions for classifying patterns
2.13 The number of linear dichotomies of N points in d dimensions
2.14 The effects of constraints
2.15 The number of Φ function dichotomies
2.16 Machine capacity
2.17 Bibliographical and historical remarks
References
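
Section 2.6's threshold logic unit (TLU) thresholds a linear discriminant function of the pattern: it responds 1 when the weighted sum clears the threshold and 0 otherwise. A minimal sketch in Python, with weights and patterns invented for illustration rather than taken from the text:

    # Threshold logic unit (TLU): responds 1 when the linear discriminant
    # w . x + b is non-negative, otherwise responds 0.  Weights, bias and
    # patterns below are illustrative only.
    def tlu(weights, bias, pattern):
        s = sum(w * x for w, x in zip(weights, pattern)) + bias
        return 1 if s >= 0 else 0

    # A 2-input TLU whose decision surface is the line x1 + x2 = 1.5.
    print(tlu([1.0, 1.0], -1.5, [1, 1]))   # 1 (positive side of the surface)
    print(tlu([1.0, 1.0], -1.5, [1, 0]))   # 0 (negative side of the surface)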

3 Parametric Training Methods
3.1 Probabilistic pattern sets
3.2 Discriminant functions based on decision theory
3.3 Likelihoods
3.4 A special loss function
3.5 An example
3.6 The bivariate normal probability-density function
3.7 The multivariate normal patterns
3.8 The optimum classifier for normal patterns
3.9 Some special cases involving identical covariance matrices
3.10 Training with normal pattern sets
3.11 Learning the mean vector of normal patterns
3.12 Bibliographical and historical remarks
References
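
Chapter 3 derives classifiers from assumed probability models; in the special case of normal pattern sets with identical, spherical covariance matrices and equal priors, the optimum classifier reduces to assigning each pattern to the category whose estimated mean vector is nearest. A minimal sketch of that special case, with made-up training data:

    # Minimum-distance-to-means classifier.  For normal pattern sets with
    # identical spherical covariances and equal priors, assigning a pattern
    # to the category with the nearest sample mean is the optimum rule.
    # The training samples below are invented for illustration.
    def estimate_means(samples_by_category):
        means = {}
        for category, samples in samples_by_category.items():
            dim = len(samples[0])
            means[category] = [sum(s[i] for s in samples) / len(samples)
                               for i in range(dim)]
        return means

    def classify(means, pattern):
        def sq_dist(mean):
            return sum((p - m) ** 2 for p, m in zip(pattern, mean))
        return min(means, key=lambda c: sq_dist(means[c]))

    training = {
        "A": [[0.0, 0.1], [0.2, -0.1], [-0.1, 0.0]],
        "B": [[2.0, 2.1], [1.8, 1.9], [2.2, 2.0]],
    }
    means = estimate_means(training)
    print(classify(means, [0.1, 0.2]))   # "A"
    print(classify(means, [1.9, 2.2]))   # "B"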

4 Some Nonparametric Training Methods for Φ Machines
4.1 Nonparametric training of a TLU
4.2 Weight space
4.3 TLU training procedures
4.4 A numerical example of error-correction training
4.5 An error-correction training procedure for R > 2
4.6 Applications to Φ machines
4.7 Bibliographical and historical remarks
References
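
Section 4.1's nonparametric training of a TLU proceeds by error correction: the weight vector is adjusted only when a training pattern is misclassified. A minimal sketch of the classical fixed-increment version of such a procedure, using augmented patterns and an invented, linearly separable training set:

    # Fixed-increment error-correction training of a TLU.  Each pattern is
    # augmented with a constant 1 so the threshold is trained as a weight.
    # Weights are corrected only on misclassified patterns; for a linearly
    # separable set the procedure terminates.  Data below are invented.
    def train_tlu(patterns, labels, increment=1.0, max_epochs=100):
        w = [0.0] * (len(patterns[0]) + 1)        # +1 augmented component
        for _ in range(max_epochs):
            errors = 0
            for x, label in zip(patterns, labels):
                ax = list(x) + [1.0]              # augmented pattern
                response = 1 if sum(wi * xi for wi, xi in zip(w, ax)) >= 0 else 0
                if response != label:             # correct only on errors
                    sign = 1.0 if label == 1 else -1.0
                    w = [wi + increment * sign * xi for wi, xi in zip(w, ax)]
                    errors += 1
            if errors == 0:                       # every pattern now correct
                break
        return w

    patterns = [[0, 0], [0, 1], [1, 0], [1, 1]]
    labels = [0, 0, 0, 1]                         # a linearly separable task
    print(train_tlu(patterns, labels))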

5 Training Theorems
5.1 The fundamental training theorem
5.2 Notation
5.3 Proof 1
5.4 Proof 2
5.5 A training theorem for R-category linear machines
5.6 A related training theorem for the case R = 2
5.7 Bibliographical and historical remarks
References
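
The fundamental training theorem of Section 5.1 is the classical convergence result for error-correction training of a TLU on a linearly separable training set. In a standard modern statement (the notation here is conventional and need not match the book's):

    % If every augmented pattern satisfies \|x_k\| \le R, patterns of the
    % second category are taken with reversed sign, and some unit solution
    % vector w^* achieves margin w^* \cdot x_k \ge \gamma > 0 for all k,
    % then the fixed-increment procedure started from the zero vector makes
    % at most
    \[
        M \;\le\; \frac{R^{2}}{\gamma^{2}}
    \]
    % weight corrections before it classifies every pattern correctly.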

6 Layered Machines
6.1 Layered networks of TLUs
6.2 Committee machines
6.3 A training procedure for committee machines
6.4 An example
6.5 Transformation properties of layered machines
6.6 A sufficient condition for image-space linear separability
6.7 Derivation of a discriminant function for a layered machine
6.8 Bibliographical and historical remarks
References
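
Sections 6.2 and 6.3 concern committee machines: a first layer of TLUs whose individual responses are combined by a vote-taking unit, allowing a layered machine to realize decisions no single TLU can. A minimal sketch, with hand-chosen committee weights (invented for illustration) that realize the XOR-like parity of two binary inputs:

    # Committee machine: a layer of TLUs votes, and a fixed vote-taking
    # unit outputs the majority decision.  The three committee members
    # below were chosen by hand to realize XOR, a task a single TLU
    # cannot solve; they are illustrative only.
    def tlu(weights, bias, pattern):
        s = sum(w * x for w, x in zip(weights, pattern)) + bias
        return 1 if s >= 0 else 0

    def committee(units, pattern):
        votes = [tlu(w, b, pattern) for w, b in units]
        return 1 if sum(votes) > len(votes) / 2 else 0   # majority vote

    units = [
        ([1.0, -1.0], -0.5),    # fires only for (1, 0)
        ([-1.0, 1.0], -0.5),    # fires only for (0, 1)
        ([-1.0, -1.0], 1.5),    # fires unless both inputs are 1
    ]
    for p in [[0, 0], [0, 1], [1, 0], [1, 1]]:
        print(p, committee(units, p))   # prints 0, 1, 1, 0 respectively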

7 Piecewise Linear Machines
7.1 Multimodal pattern-classifying tasks
7.2 Training PWL machines
7.3 A disadvantage of the error-correction training methods
7.4 A nonparametric decision procedure
7.5 Nonparametric decisions based on distances to modes
7.6 Mode-seeking and related training methods for PWL machines
7.7 Bibliographical and historical remarks
References
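
A piecewise linear (PWL) machine gives each category a bank of subsidiary linear discriminants, scores a category by the largest of its subsidiaries, and assigns the pattern to the highest-scoring category, which suits the multimodal tasks of Section 7.1. A minimal sketch with invented weights:

    # Piecewise-linear (PWL) machine: each category owns several linear
    # subsidiary discriminants; its score is the maximum of them, and the
    # pattern goes to the highest-scoring category.  Weights are invented.
    def linear(weights, bias, pattern):
        return sum(w * x for w, x in zip(weights, pattern)) + bias

    def pwl_classify(machine, pattern):
        scores = {category: max(linear(w, b, pattern) for w, b in subs)
                  for category, subs in machine.items()}
        return max(scores, key=scores.get)

    machine = {
        "A": [([1.0, 0.0], 0.0), ([-1.0, 0.0], -4.0)],   # two modes along x1
        "B": [([0.0, 1.0], 0.0)],
    }
    print(pwl_classify(machine, [3.0, 0.5]))   # "A"
    print(pwl_classify(machine, [0.2, 2.0]))   # "B"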

Appendix
A.1 Separation of a quadratic form into positive and negative parts
A.2 Implementation
A.3 Transformation of normal patterns
Index