Kernel Adaptive Filtering: A Comprehensive Introduction [NOOK Book]



NOOK Book (eBook): $64.99 (BN.com price; save 42% off the $114.00 list price)
Note: This NOOK Book can be purchased in bulk. Please email us for more information.

Overview

Online learning from a signal processing perspective

There is growing interest in kernel learning algorithms within the neural networks community, along with a growing need for nonlinear adaptive algorithms in advanced signal processing, communications, and controls. Kernel Adaptive Filtering is the first book to present a comprehensive, unifying introduction to online learning algorithms in reproducing kernel Hilbert spaces. Based on research conducted in the Computational Neuro-Engineering Laboratory at the University of Florida and in the Cognitive Systems Laboratory at McMaster University, Ontario, Canada, this unique resource elevates adaptive filtering theory to a new level, presenting a new design methodology for nonlinear adaptive filters.
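
The core idea can be made concrete in a few lines of code. The following is a minimal, illustrative Python sketch of the kernel least-mean-square (KLMS) algorithm, the simplest of the kernel adaptive filters covered below. It is not the authors' MATLAB code; the kernel width, step size, and toy sine-wave data are assumptions chosen only for this example.

```python
import numpy as np

def gaussian_kernel(x, y, sigma):
    """Gaussian (RBF) kernel between two input vectors."""
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

class KLMS:
    """Kernel least-mean-square filter: the LMS algorithm run in an RKHS.

    The filter is a growing kernel expansion f(u) = sum_i a_i k(c_i, u):
    each incoming input becomes a center c_i whose coefficient a_i is the
    step size times the a-priori prediction error on that sample.
    """

    def __init__(self, eta=0.5, sigma=0.3):
        self.eta = eta          # step size (learning rate)
        self.sigma = sigma      # kernel width
        self.centers = []       # stored inputs c_i
        self.coeffs = []        # expansion coefficients a_i

    def predict(self, u):
        return sum(a * gaussian_kernel(c, u, self.sigma)
                   for c, a in zip(self.centers, self.coeffs))

    def update(self, u, d):
        """One online step: predict, compute the error, allocate a center."""
        e = d - self.predict(u)             # a-priori error
        self.centers.append(np.asarray(u, dtype=float))
        self.coeffs.append(self.eta * e)    # coefficient of the new center
        return e

# Toy usage: learn y = sin(3x) from a stream arriving one sample at a time.
rng = np.random.default_rng(0)
filt = KLMS(eta=0.5, sigma=0.3)
for _ in range(500):
    x = rng.uniform(-1.0, 1.0, size=1)
    filt.update(x, np.sin(3.0 * x[0]) + 0.05 * rng.standard_normal())
print("f(0.5) =", filt.predict(np.array([0.5])), " target =", np.sin(1.5))
```

Note that every incoming sample becomes a new kernel center, so the filter grows with the data stream; that growth is the bottleneck discussed in the list below.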

  • Covers the kernel least-mean-square algorithm, kernel affine projection algorithms, the kernel recursive least-squares algorithm, the theory of Gaussian process regression, and the extended kernel recursive least-squares algorithm

  • Presents a powerful model-selection method called maximum marginal likelihood (sketched at the end of this overview)

  • Addresses the principal bottleneck of kernel adaptive filters, namely their growing structure (see the sketch following this list)

  • Features twelve computer-oriented experiments to reinforce the concepts, with MATLAB codes downloadable from the authors' Web site

  • Concludes each chapter with a summary of the state of the art and potential future directions for original research
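
To illustrate how the growing structure can be curbed, here is a hedched-down Python sketch of the novelty criterion (treated in Section 2.5 of the book): an input joins the dictionary of centers only if it is sufficiently far from every stored center and its prediction error is sufficiently large. The class name SparseKLMS and the threshold values delta_dist and delta_err are illustrative assumptions, not names or values from the book.

```python
import numpy as np

def gaussian_kernel(x, y, sigma):
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

class SparseKLMS:
    """KLMS with a novelty criterion: an input joins the dictionary of
    centers only if (a) its distance to the nearest stored center exceeds
    delta_dist and (b) its a-priori error magnitude exceeds delta_err;
    otherwise the sample is discarded and the network stays fixed."""

    def __init__(self, eta=0.5, sigma=0.3, delta_dist=0.05, delta_err=0.02):
        self.eta, self.sigma = eta, sigma
        self.delta_dist, self.delta_err = delta_dist, delta_err
        self.centers, self.coeffs = [], []

    def predict(self, u):
        return sum(a * gaussian_kernel(c, u, self.sigma)
                   for c, a in zip(self.centers, self.coeffs))

    def update(self, u, d):
        u = np.asarray(u, dtype=float)
        e = d - self.predict(u)             # a-priori error
        novel = (not self.centers or
                 (min(np.linalg.norm(u - c) for c in self.centers)
                  > self.delta_dist and abs(e) > self.delta_err))
        if novel:                           # allocate a new center
            self.centers.append(u)
            self.coeffs.append(self.eta * e)
        return e

# The dictionary now grows much more slowly than the stream itself.
rng = np.random.default_rng(0)
filt = SparseKLMS()
for _ in range(500):
    x = rng.uniform(-1.0, 1.0, size=1)
    filt.update(x, np.sin(3.0 * x[0]))
print("centers kept:", len(filt.centers), "out of 500 samples")
```

Chapter 6 develops a more principled criterion in the same spirit, based on an information-theoretic measure of surprise.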

Kernel Adaptive Filtering is ideal for engineers, computer scientists, and graduate students interested in nonlinear adaptive systems for online applications, where the data stream arrives one sample at a time and incremental optimal solutions are desirable. It is also a useful guide for anyone seeking nonlinear adaptive filtering methodologies to solve practical problems.
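
For readers curious about the model-selection bullet above, the following is a minimal sketch of maximum marginal likelihood under the Gaussian-process view of kernel regression (Sections 4.6 to 4.8): candidate kernel widths are scored by the evidence log p(d | X), and the highest-scoring width is kept. The toy data, the candidate widths, and the fixed noise variance are assumptions made for this example, not values from the book.

```python
import numpy as np

def gaussian_kernel_matrix(X, sigma):
    """Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def log_marginal_likelihood(X, d, sigma, noise_var):
    """Gaussian-process evidence log p(d | X) =
    -1/2 d^T C^{-1} d - 1/2 log|C| - n/2 log(2 pi), with
    C = K + noise_var * I."""
    n = len(d)
    C = gaussian_kernel_matrix(X, sigma) + noise_var * np.eye(n)
    L = np.linalg.cholesky(C)           # C = L L^T, so log|C| = 2 sum log L_ii
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, d))   # C^{-1} d
    return (-0.5 * d @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * n * np.log(2.0 * np.pi))

# Pick the kernel width that maximizes the evidence on a toy data set.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 1))
d = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(50)
widths = [0.05, 0.1, 0.3, 1.0, 3.0]
best = max(widths, key=lambda s: log_marginal_likelihood(X, d, s, 0.01))
print("kernel width selected by maximum marginal likelihood:", best)
```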

Meet the Author

Weifeng Liu, PhD, is a senior engineer on the Demand Forecasting Team at Amazon.com, Inc. His research interests include kernel adaptive filtering, online active learning, and solving real-life large-scale data mining problems.

José C. Principe is Distinguished Professor of Electrical and Biomedical Engineering at the University of Florida, Gainesville, where he teaches advanced signal processing and artificial neural network modeling. He is the BellSouth Professor and the founder and director of the University of Florida Computational Neuro-Engineering Laboratory.

Simon Haykin is Distinguished University Professor at McMaster University, Canada. He is world-renowned for his contributions to adaptive filtering applied to radar and communications. His current research passion is cognitive dynamic systems, including applications in cognitive radio and cognitive radar.


Table of Contents

PREFACE.

ACKNOWLEDGMENTS.

NOTATION.

ABBREVIATIONS AND SYMBOLS.

1 BACKGROUND AND PREVIEW.

1.1 Supervised, Sequential, and Active Learning.

1.2 Linear Adaptive Filters.

1.3 Nonlinear Adaptive Filters.

1.4 Reproducing Kernel Hilbert Spaces.

1.5 Kernel Adaptive Filters.

1.6 Summarizing Remarks.

Endnotes.

2 KERNEL LEAST-MEAN-SQUARE ALGORITHM.

2.1 Least-Mean-Square Algorithm.

2.2 Kernel Least-Mean-Square Algorithm.

2.3 Kernel and Parameter Selection.

2.4 Step-Size Parameter.

2.5 Novelty Criterion.

2.6 Self-Regularization Property of KLMS.

2.7 Leaky Kernel Least-Mean-Square Algorithm.

2.8 Normalized Kernel Least-Mean-Square Algorithm.

2.9 Kernel ADALINE.

2.10 Resource Allocating Networks.

2.11 Computer Experiments.

2.12 Conclusion.

Endnotes.

3 KERNEL AFFINE PROJECTION ALGORITHMS.

3.1 Affine Projection Algorithms.

3.2 Kernel Affine Projection Algorithms.

3.3 Error Reusing.

3.4 Sliding Window Gram Matrix Inversion.

3.5 Taxonomy for Related Algorithms.

3.6 Computer Experiments.

3.7 Conclusion.

Endnotes.

4 KERNEL RECURSIVE LEAST-SQUARES ALGORITHM.

4.1 Recursive Least-Squares Algorithm.

4.2 Exponentially Weighted Recursive Least-Squares Algorithm.

4.3 Kernel Recursive Least-Squares Algorithm.

4.4 Approximate Linear Dependency.

4.5 Exponentially Weighted Kernel Recursive Least-Squares Algorithm.

4.6 Gaussian Processes for Linear Regression.

4.7 Gaussian Processes for Nonlinear Regression.

4.8 Bayesian Model Selection.

4.9 Computer Experiments.

4.10 Conclusion.

Endnotes.

5 EXTENDED KERNEL RECURSIVE LEAST-SQUARES ALGORITHM.

5.1 Extended Recursive Least-Squares Algorithm.

5.2 Exponentially Weighted Extended Recursive Least-Squares Algorithm.

5.3 Extended Kernel Recursive Least-Squares Algorithm.

5.4 EX-KRLS for Tracking Models.

5.5 EX-KRLS with Finite Rank Assumption.

5.6 Computer Experiments.

5.7 Conclusion.

Endnotes.

6 DESIGNING SPARSE KERNEL ADAPTIVE FILTERS.

6.1 Definition of Surprise.

6.2 A Review of Gaussian Process Regression.

6.3 Computing Surprise.

6.4 Kernel Recursive Least-Squares with Surprise Criterion.

6.5 Kernel Least-Mean-Square with Surprise Criterion.

6.6 Kernel Affine Projection Algorithms with Surprise Criterion.

6.7 Computer Experiments.

6.8 Conclusion.

Endnotes.

EPILOGUE.

APPENDIX.

A MATHEMATICAL BACKGROUND.

A.1 Singular Value Decomposition.

A.2 Positive-Definite Matrix.

A.3 Eigenvalue Decomposition.

A.4 Schur Complement.

A.5 Block Matrix Inverse.

A.6 Matrix Inversion Lemma.

A.7 Joint, Marginal, and Conditional Probability.

A.8 Normal Distribution.

A.9 Gradient Descent.

A.10 Newton's Method.

B APPROXIMATE LINEAR DEPENDENCY AND SYSTEM STABILITY.

REFERENCES.

INDEX.


Customer Reviews

  • Anonymous, posted June 5, 2010

    Very good book for researchers and graduate students

    This is a very technical book, meant for researchers and graduate students who are interested in machine learning and signal processing. A very good textbook and an excellent reference in this field.