Kernel Adaptive Filtering: A Comprehensive Introduction / Edition 1

Overview

There is growing interest in kernel learning algorithms in neural networks and a growing need for nonlinear adaptive algorithms in advanced signal processing, communications, and controls. Kernel Adaptive Filtering is the first book to present a comprehensive, unifying introduction to online learning algorithms in reproducing kernel Hilbert spaces. Based on research conducted in the Computational Neuro-Engineering Laboratory at the University of Florida and in the Cognitive Systems Laboratory at McMaster University, Ontario, Canada, this unique resource elevates adaptive filtering theory to a new level, presenting a new design methodology for nonlinear adaptive filters.

Covers the kernel least-mean-square algorithm, kernel affine projection algorithms, the kernel recursive least-squares algorithm, the theory of Gaussian process regression, and the extended kernel recursive least-squares algorithm (see the sketch following this overview)

Presents a powerful model-selection method called maximum marginal likelihood

Addresses the principal bottleneck of kernel adaptive filters: their growing structure

Features twelve computer-oriented experiments to reinforce the concepts, with MATLAB® code downloadable from the authors' website

Concludes each chapter with a summary of the state of the art and potential future directions for original research

Kernel Adaptive Filtering is ideal for engineers, computer scientists, and graduate students interested in nonlinear adaptive systems for online applications (applications where the data stream arrives one sample at a time and incremental optimal solutions are desirable). It is also a useful guide for those seeking nonlinear adaptive filtering methodologies to solve practical problems.
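
To make the online-learning setting above concrete, the following is a minimal sketch of the kernel least-mean-square (KLMS) update treated in Chapter 2, written in Python rather than the authors' MATLAB. The Gaussian kernel, step size, kernel width, novelty threshold, and the distance-only novelty criterion used here are illustrative assumptions, not the book's recommended settings; the sketch only shows how a kernel expansion grows sample by sample and how a sparsification rule bounds that growth.

```python
import numpy as np

def gaussian_kernel(x, y, width=1.0):
    """Gaussian (RBF) kernel between two input vectors."""
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * width ** 2))

class KLMS:
    """Minimal kernel least-mean-square (KLMS) filter with a simple
    distance-based novelty criterion that limits dictionary growth."""

    def __init__(self, step_size=0.2, kernel_width=0.5, novelty_dist=0.05):
        self.eta = step_size
        self.width = kernel_width
        self.novelty_dist = novelty_dist  # minimum distance to admit a new center
        self.centers = []                 # stored inputs u_i (the "dictionary")
        self.coeffs = []                  # coefficients a_i of the kernel expansion

    def predict(self, u):
        # f(u) = sum_i a_i * k(u_i, u)
        return sum(a * gaussian_kernel(c, u, self.width)
                   for a, c in zip(self.coeffs, self.centers))

    def update(self, u, d):
        """One online step: predict the desired signal d, then either admit
        the input as a new center (weighted by step size times error) or,
        if it lies too close to an existing center, discard it."""
        u = np.asarray(u, dtype=float)
        e = d - self.predict(u)
        if (not self.centers or
                min(np.linalg.norm(u - c) for c in self.centers) > self.novelty_dist):
            self.centers.append(u)
            self.coeffs.append(self.eta * e)
        return e

# Usage: learn a static nonlinearity from a stream arriving one sample at a time.
rng = np.random.default_rng(0)
filt = KLMS()
for _ in range(500):
    u = rng.uniform(-1.0, 1.0, size=1)
    d = np.sin(3.0 * u[0]) + 0.01 * rng.standard_normal()
    filt.update(u, d)
print("dictionary size after 500 samples:", len(filt.centers))
```

Each incoming sample requires one pass over the current dictionary, which is why the growing-structure bottleneck noted above, and the sparsification criteria discussed in Chapters 2 and 6, matter in practice.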

Meet the Author

Weifeng Liu, PhD, is a senior engineer on the Demand Forecasting Team at Amazon.com, Inc. His research interests include kernel adaptive filtering, online active learning, and large-scale real-world data mining problems.

José C. Principe is Distinguished Professor of Electrical and Biomedical Engineering at the University of Florida, Gainesville, where he teaches advanced signal processing and artificial neural network modeling. He is the BellSouth Professor and the founder and Director of the University of Florida Computational Neuro-Engineering Laboratory.

Simon Haykin is Distinguished University Professor at McMaster University, Canada. He is world-renowned for his contributions to adaptive filtering applied to radar and communications. His current research focuses on cognitive dynamic systems, including applications to cognitive radio and cognitive radar.

Table of Contents

Preface xi

Acknowledgments xv

Notation xvii

Abbreviations and Symbols xix

1 Background and Preview 1

1.1 Supervised, Sequential, and Active Learning 1

1.2 Linear Adaptive Filters 3

1.3 Nonlinear Adaptive Filters 10

1.4 Reproducing Kernel Hilbert Spaces 12

1.5 Kernel Adaptive Filters 16

1.6 Summarizing Remarks 20

Endnotes 21

2 Kernel Least-Mean-Square Algorithm 27

2.1 Least-Mean-Square Algorithm 28

2.2 Kernel Least-Mean-Square Algorithm 31

2.3 Kernel and Parameter Selection 34

2.4 Step-Size Parameter 37

2.5 Novelty Criterion 38

2.6 Self-Regularization Property of KLMS 40

2.7 Leaky Kernel Least-Mean-Square Algorithm 48

2.8 Normalized Kernel Least-Mean-Square Algorithm 48

2.9 Kernel ADALINE 49

2.10 Resource Allocating Networks 53

2.11 Computer Experiments 55

2.12 Conclusion 63

Endnotes 65

3 Kernel Affine Projection Algorithms 69

3.1 Affine Projection Algorithms 70

3.2 Kernel Affine Projection Algorithms 72

3.3 Error Reusing 77

3.4 Sliding Window Gram Matrix Inversion 78

3.5 Taxonomy for Related Algorithms 78

3.6 Computer Experiments 80

3.7 Conclusion 89

Endnotes 91

4 Kernel Recursive Least-Squares Algorithm 94

4.1 Recursive Least-Squares Algorithm 94

4.2 Exponentially Weighted Recursive Least-Squares Algorithm 97

4.3 Kernel Recursive Least-Squares Algorithm 98

4.4 Approximate Linear Dependency 102

4.5 Exponentially Weighted Kernel Recursive Least-Squares Algorithm 103

4.6 Gaussian Processes for Linear Regression 105

4.7 Gaussian Processes for Nonlinear Regression 108

4.8 Bayesian Model Selection 111

4.9 Computer Experiments 114

4.10 Conclusion 119

Endnotes 120

5 Extended Kernel Recursive Least-Squares Algorithm 124

5.1 Extended Recursive Least Squares Algorithm 125

5.2 Exponentially Weighted Extended Recursive Least Squares Algorithm 128

5.3 Extended Kernel Recursive Least Squares Algorithm 129

5.4 EX-KRLS for Tracking Models 131

5.5 EX-KRLS with Finite Rank Assumption 137

5.6 Computer Experiments 141

5.7 Conclusion 150

Endnotes 151

6 Designing Sparse Kernel Adaptive Filters 152

6.1 Definition of Surprise 152

6.2 A Review of Gaussian Process Regression 154

6.3 Computing Surprise 156

6.4 Kernel Recursive Least Squares with Surprise Criterion 159

6.5 Kernel Least Mean Square with Surprise Criterion 160

6.6 Kernel Affine Projection Algorithms with Surprise Criterion 161

6.7 Computer Experiments 162

6.8 Conclusion 173

Endnotes 174

Epilogue 175

Appendix 177

A Mathematical Background 177

A.1 Singular Value Decomposition 177

A.2 Positive-Definite Matrix 179

A.3 Eigenvalue Decomposition 179

A.4 Schur Complement 181

A.5 Block Matrix Inverse 181

A.6 Matrix Inversion Lemma 182

A.7 Joint, Marginal, and Conditional Probability 182

A.8 Normal Distribution 183

A.9 Gradient Descent 184

A.10 Newton's Method 184

B Approximate Linear Dependency and System Stability 186

References 193

Index 204

Customer Reviews

Anonymous, posted June 5, 2010

Very good book for researchers and graduate students

This is a very technical book meant for researchers and graduate students who are interested in machine learning and signal processing. Very good textbook and excellent reference book in this field.
