Information Loss in Deterministic Signal Processing Systems

by Bernhard C. Geiger, Gernot Kubin

eBook, 1st ed. 2018

$99.00 

Overview

This book introduces readers to essential tools for the measurement and analysis of information loss in signal processing systems. Employing a new information-theoretic systems theory, the book analyzes various systems in the signal processing engineer’s toolbox: polynomials, quantizers, rectifiers, linear filters with and without quantization effects, principal component analysis, multirate systems, etc. The user benefit of signal processing is further highlighted with the concept of relevant information loss. Signal or data processing operates on the physical representation of information so that users can easily access and extract that information. However, a fundamental result of information theory, the data processing inequality, states that processing can never increase the information a signal carries, and deterministic, non-invertible processing in general destroys some of it.
The information-loss measures developed in this book form the basis of a new information-theoretic systems theory, which complements the currently prevailing approaches based on second-order statistics, such as the mean-squared error or error energy. This theory not only provides a deeper understanding but also extends the design space for the applied engineer with a wide range of methods rooted in information theory, adding to existing methods based on energy or quadratic representations.
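To make the quantities behind this description concrete, here is a minimal sketch in standard information-theoretic notation (the symbols and exact definitions below are illustrative assumptions, not quotations from the book): for a deterministic system $Y = g(X)$ followed by any further processing $Z = h(Y)$, the data processing inequality states

\[
  I(X; Z) \le I(X; Y) \le H(X),
\]

and the information lost by the map $g$ can be quantified as the conditional entropy

\[
  L(X \to Y) = H(X \mid Y),
\]

which vanishes exactly when $g$ is invertible on the support of $X$.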

Product Details

ISBN-13: 9783319595337
Publisher: Springer-Verlag New York, LLC
Publication date: 07/02/2017
Series: Understanding Complex Systems
Sold by: Barnes & Noble
Format: eBook
File size: 4 MB

Table of Contents

Introduction
Part I: Random Variables
Piecewise Bijective Functions and Continuous Inputs
General Input Distributions
Dimensionality-Reducing Functions
Relevant Information Loss
Part II: Stationary Stochastic Processes
Discrete-Valued Processes
Piecewise Bijective Functions and Continuous Inputs
Dimensionality-Reducing Functions
Relevant Information Loss Rate
Conclusion and Outlook