Sensitivity Analysis for Neural Networks / Edition 1

ISBN-10: 3642025315
ISBN-13: 9783642025310
Pub. Date: 11/16/2009
Publisher: Springer Berlin Heidelberg
Hardcover

$109.99

Overview

Artificial neural networks are used to model systems that receive inputs and produce outputs. The relationships between the inputs, the outputs, and the network's parameters are critical issues in the design of related engineering systems, and sensitivity analysis provides methods for analyzing these relationships. Perturbations of neural networks arise from sources such as machine imprecision, and they can be simulated by embedding disturbances in the original inputs or connection weights, allowing us to study how a network's function behaves under small perturbations of its parameters.
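
The idea described above (embed a small disturbance in an input and observe the change in the output) can be sketched numerically. The following is a minimal illustration, not a method from the book: it uses a hypothetical one-hidden-layer MLP with random weights and a finite-difference estimate of the output's sensitivity to each input dimension.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy MLP: 3 inputs -> 4 tanh hidden units -> 1 linear output.
W1 = rng.normal(size=(4, 3))   # input-to-hidden weights
W2 = rng.normal(size=(1, 4))   # hidden-to-output weights

def mlp(x):
    """Forward pass: tanh hidden layer, linear output."""
    return (W2 @ np.tanh(W1 @ x)).item()

def input_sensitivity(x, eps=1e-3):
    """Finite-difference estimate of |dy/dx_i|: perturb each input
    dimension by eps and measure the resulting output change."""
    base = mlp(x)
    sens = np.empty(len(x))
    for i in range(len(x)):
        xp = x.copy()
        xp[i] += eps
        sens[i] = abs(mlp(xp) - base) / eps
    return sens

x0 = np.array([0.5, -0.2, 0.1])
print(input_sensitivity(x0))  # larger value => output more sensitive to that input
```

Inputs with a consistently small sensitivity contribute little to the output; this is the intuition behind the feature-selection and input-pruning applications mentioned below.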

This is the first book to present a systematic description of sensitivity analysis methods for artificial neural networks. It covers sensitivity analysis of multilayer perceptron neural networks and radial basis function neural networks, two widely used models in the machine learning field. The authors examine applications of such analysis to tasks including feature selection, sample reduction, and network optimization. The book will be useful for engineers applying neural network sensitivity analysis to solve practical problems, and for researchers interested in foundational problems in neural networks.


Product Details

ISBN-13: 9783642025310
Publisher: Springer Berlin Heidelberg
Publication date: 11/16/2009
Series: Natural Computing Series
Edition description: 2010
Pages: 86
Product dimensions: 6.20(w) x 9.20(h) x 0.50(d)

Table of Contents

1 Introduction to Neural Networks 1

1.1 Properties of Neural Networks 3

1.2 Neural Network Learning 5

1.2.1 Supervised Learning 5

1.2.2 Unsupervised Learning 5

1.3 Perceptron 6

1.4 Adaline and Least Mean Square Algorithm 8

1.5 Multilayer Perceptron and Backpropagation Algorithm 9

1.5.1 Output Layer Learning 11

1.5.2 Hidden Layer Learning 11

1.6 Radial Basis Function Networks 12

1.7 Support Vector Machines 13

2 Principles of Sensitivity Analysis 17

2.1 Perturbations in Neural Networks 17

2.2 Neural Network Sensitivity Analysis 18

2.3 Fundamental Methods of Sensitivity Analysis 21

2.3.1 Geometrical Approach 21

2.3.2 Statistical Approach 23

2.4 Summary 24

3 Hyper-Rectangle Model 25

3.1 Hyper-Rectangle Model for Input Space of MLP 25

3.2 Sensitivity Measure of MLP 26

3.3 Discussion 27

4 Sensitivity Analysis with Parameterized Activation Function 29

4.1 Parameterized Antisymmetric Squashing Function 29

4.2 Sensitivity Measure 30

4.3 Summary 31

5 Localized Generalization Error Model 33

5.1 Introduction 33

5.2 The Localized Generalization Error Model 35

5.2.1 The Q-Neighborhood and Q-Union 36

5.2.2 The Localized Generalization Error Bound 36

5.2.3 Stochastic Sensitivity Measure for RBFNN 38

5.2.4 Characteristics of the Error Bound 40

5.2.5 Comparing Two Classifiers Using the Error Bound 42

5.3 Architecture Selection Using the Error Bound 42

5.3.1 Parameters for MC2SG 44

5.3.2 RBFNN Architecture Selection Algorithm for MC2SG 44

5.3.3 A Heuristic Method to Reduce the Computational Time for MC2SG 45

5.4 Summary 45

6 Critical Vector Learning for RBF Networks 47

6.1 Related Work 47

6.2 Construction of RBF Networks with Sensitivity Analysis 48

6.2.1 RBF Classifiers' Sensitivity to the Kernel Function Centers 49

6.2.2 Orthogonal Least Square Transform 51

6.2.3 Critical Vector Selection 52

6.3 Summary 52

7 Sensitivity Analysis of Prior Knowledge 55

7.1 KBANNs 56

7.2 Inductive Bias 58

7.3 Sensitivity Analysis and Measures 59

7.3.1 Output-Pattern Sensitivity 59

7.3.2 Output-Weight Sensitivity 60

7.3.3 Output-H Sensitivity 61

7.3.4 Euclidean Distance 61

7.4 Promoter Recognition 61

7.4.1 Data and Initial Domain Theory 62

7.4.2 Experimental Methodology 63

7.5 Discussion and Conclusion 64

8 Applications 69

8.1 Input Dimension Reduction 69

8.1.1 Sensitivity Matrix 70

8.1.2 Criteria for Pruning Inputs 70

8.2 Network Optimization 71

8.3 Selective Learning 74

8.4 Hardware Robustness 76

8.5 Measure of Nonlinearity 77

8.6 Parameter Tuning for Neocognitron 78

8.6.1 Receptive Field 79

8.6.2 Selectivity 80

8.6.3 Sensitivity Analysis of the Neocognitron 80

Bibliography 83
