Second-Order Methods for Neural Networks: Fast and Reliable Training Methods for Multi-Layer Perceptrons
by Adrian J. Shepherd
Paperback (1997)

$54.99 

Overview

About This Book

This book is about training methods - in particular, fast second-order training methods - for multi-layer perceptrons (MLPs). MLPs (also known as feed-forward neural networks) are the most widely used class of neural network. Over the past decade MLPs have achieved increasing popularity among scientists, engineers and other professionals as tools for tackling a wide variety of information processing tasks. In common with all neural networks, MLPs are trained (rather than programmed) to carry out the chosen information processing function. Unfortunately, the 'traditional' method for training MLPs - the well-known backpropagation method - is notoriously slow and unreliable when applied to many practical tasks. The development of fast and reliable training algorithms for MLPs is one of the most important areas of research within the entire field of neural computing.

The main purpose of this book is to bring to a wider audience a range of alternative methods for training MLPs, methods which have proved orders of magnitude faster than backpropagation when applied to many training tasks. The book also addresses the well-known 'local minima' problem, and explains ways in which fast training methods can be combined with strategies for avoiding (or escaping from) local minima. All the methods described in this book have a strong theoretical foundation, drawing on such diverse mathematical fields as classical optimisation theory, homotopic theory and stochastic approximation theory.
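To make the contrast with backpropagation concrete, the following is a minimal sketch of a Levenberg-Marquardt training step, the classic model-trust-region method for nonlinear least squares of the kind covered in the book's Chapters 3 and 4. The sketch is not taken from the book: the 2-2-1 tanh network, the XOR task, the forward-difference Jacobian and the damping schedule are all illustrative assumptions.

import numpy as np

def mlp(w, x):
    # Illustrative 2-2-1 MLP with tanh hidden units; w is a flat weight vector.
    W1, b1 = w[0:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8], w[8]
    return np.tanh(x @ W1.T + b1) @ W2 + b2

def residuals(w, X, y):
    # Residual vector r(w); the training error is its sum of squares.
    return mlp(w, X) - y

def jacobian(w, X, y, eps=1e-6):
    # Forward-difference Jacobian of r(w); a serious implementation would use
    # analytic derivatives of the kind the book develops for MLPs.
    r0 = residuals(w, X, y)
    J = np.empty((r0.size, w.size))
    for i in range(w.size):
        wp = w.copy()
        wp[i] += eps
        J[:, i] = (residuals(wp, X, y) - r0) / eps
    return J

# XOR as a toy least-squares training task.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 0.])

w = np.random.default_rng(0).normal(scale=0.5, size=9)
mu = 1e-2  # model-trust (damping) parameter

for step in range(100):
    r, J = residuals(w, X, y), jacobian(w, X, y)
    # Levenberg-Marquardt step: solve (J^T J + mu I) dw = -J^T r.
    dw = np.linalg.solve(J.T @ J + mu * np.eye(w.size), -J.T @ r)
    if np.sum(residuals(w + dw, X, y) ** 2) < np.sum(r ** 2):
        w, mu = w + dw, mu * 0.5  # step accepted: trust the model more
    else:
        mu *= 2.0                 # step rejected: shrink the trust region

print("final sum-of-squares error:", np.sum(residuals(w, X, y) ** 2))

Where plain gradient descent (backpropagation) takes a small step along -J^T r, the damped Gauss-Newton step above also exploits the curvature information in J^T J, which is the source of the orders-of-magnitude speed-ups the book reports on least-squares training tasks.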

Product Details

ISBN-13: 9783540761006
Publisher: Springer London
Publication date: 05/29/1997
Series: Perspectives in Neural Computing
Edition description: 1997
Pages: 145
Product dimensions: 6.10(w) x 9.25(h) x 0.01(d)

Table of Contents

1 Multi-Layer Perceptron Training
  1.1 Introduction to MLPs
  1.2 Error Surfaces and Local Minima
  1.3 Backpropagation
2 Classical Optimisation
  2.1 Introduction to Classical Methods
  2.2 General Numerical Considerations
3 Second-Order Optimisation Methods
  3.1 Line-Search Strategies
  3.2 Model-Trust Region Strategies
  3.3 Multivariate Methods for General Nonlinear Optimisation
  3.4 Special Methods for Nonlinear Least Squares
  3.5 Comparison of Methods
4 Second-Order Training Methods for MLPs
  4.1 The Calculation of Second Derivatives
  4.2 Reducing Storage and Computational Costs
  4.3 Second-Order On-Line Training
  4.4 Conclusion
5 An Experimental Comparison of MLP Training Methods
  5.1 Benchmark Training Tasks
  5.2 Initial Training Conditions
  5.3 Experimental Results
6 Global Optimisation
  6.1 Introduction to Global Methods
  6.2 Expanded Range Approximation (ERA)
  6.3 The TRUST Method