Information Theoretic Learning: Renyi's Entropy and Kernel Perspectives / Edition 1

ISBN-10: 1441915699

ISBN-13: 9781441915696

Pub. Date: 04/15/2010

Publisher: Springer New York

Overview

This book presents the first cohesive treatment of Information Theoretic Learning (ITL) algorithms to adapt linear or nonlinear learning machines in both supervised and unsupervised paradigms. ITL is a framework where the conventional concepts of second-order statistics (covariance, L2 distances, correlation functions) are replaced by scalars and functions with information-theoretic underpinnings: entropy, mutual information, and correntropy, respectively. ITL quantifies the stochastic structure of the data beyond second-order statistics for improved performance, without resorting to full-blown Bayesian approaches that carry a much larger computational cost. This is possible because of a non-parametric estimator of Renyi's quadratic entropy that is only a function of pairwise differences between samples. The book compares the performance of ITL algorithms with their second-order counterparts in many engineering and machine learning applications.

Students, practitioners, and researchers interested in statistical signal processing, computational intelligence, and machine learning will find in this book the theory to understand the basics, the algorithms to implement applications, and exciting but still unexplored leads that will provide fertile ground for future research.
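The pairwise-samples estimator mentioned above can be sketched in a few lines. This is a minimal illustration, assuming a Gaussian Parzen window: the "information potential" V is the mean of a kernel evaluated at all pairwise sample differences, and the quadratic entropy estimate is -log V. The function name and the choice of kernel width are illustrative, not the book's reference implementation.

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=1.0):
    """Parzen-window estimate of Renyi's quadratic entropy H2(X).

    V = (1/N^2) * sum_{i,j} G(x_i - x_j), where G is a Gaussian kernel
    of variance 2*sigma^2 (the convolution of two sigma-width kernels),
    and H2 is estimated as -log V.
    """
    x = np.asarray(x, dtype=float).reshape(-1)
    diffs = x[:, None] - x[None, :]            # all pairwise differences
    s2 = 2.0 * sigma ** 2                      # variance of convolved kernel
    kernel = np.exp(-diffs ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    v = kernel.mean()                          # information potential
    return -np.log(v)
```

Note that the estimator touches the data only through the N^2 pairwise differences, which is what allows ITL cost functions to be optimized directly from samples without an explicit density model; a more dispersed sample yields a smaller information potential and hence a larger entropy estimate.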

Product Details

ISBN-13: 9781441915696
Publisher: Springer New York
Publication date: 04/15/2010
Series: Information Science and Statistics Series
Edition description: 2010
Pages: 448
Product dimensions: 6.40(w) x 9.30(h) x 1.60(d)

Table of Contents

1. Information theory, machine learning and reproducing kernel Hilbert spaces
2. Renyi's entropy, divergence and their nonparametric estimators
3. Adaptive information filtering with error entropy and error correntropy criteria
4. Algorithms for entropy and correntropy adaptation with applications to linear systems
5. Nonlinear adaptive filtering with MEE, MCC and applications
6. Classification with EEC, divergence measures and error bounds
7. Clustering with ITL principles
8. Self-organizing ITL principles for unsupervised learning
9. A reproducing kernel Hilbert space framework for ITL
10. Correntropy for random variables: properties, and applications in statistical inference
11. Correntropy for random processes: properties, and applications in signal processing
Appendix A: PDF estimation methods and experimental evaluation of ITL descriptors
