Analytic Information Theory: From Compression to Learning
Hardcover

$155.00 

Overview

Through information theory, problems of communication and compression can be precisely modeled, formulated, and analyzed, and the resulting information can be transformed by means of algorithms. Learning, too, can be viewed as compression with side information. Aimed at students and researchers, this book addresses data compression and redundancy within existing methods and central topics in theoretical data compression, demonstrating how tools from analytic combinatorics can be used to discover and analyze the precise behavior of source codes. It shows that to represent information in its shortest description, one must first understand what that information is, and then algorithmically extract it in its most compact form via an efficient compression algorithm. Part I covers fixed-to-variable codes, such as Shannon and Huffman codes; variable-to-fixed codes, such as Tunstall and Khodak codes; and variable-to-variable Khodak codes for known sources. Part II discusses universal source coding for memoryless, Markov, and renewal sources.
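
As a small taste of the subject matter, here is a minimal sketch (our illustration, not an excerpt from the book) that builds a Huffman fixed-to-variable code for a toy memoryless source and compares its average codeword length to the source entropy; the gap between the two is the redundancy whose precise behavior the book analyzes. The symbol probabilities are purely illustrative.

    # Minimal Huffman-code sketch (illustrative, not from the book).
    import heapq
    import math

    def huffman_lengths(probs):
        """Codeword lengths of an optimal binary Huffman (FV) code."""
        # Heap entries: (probability, tiebreaker, symbols under this node).
        heap = [(p, i, [i]) for i, p in enumerate(probs)]
        heapq.heapify(heap)
        lengths = [0] * len(probs)
        counter = len(probs)  # unique tiebreaker so ties never compare lists
        while len(heap) > 1:
            p1, _, s1 = heapq.heappop(heap)
            p2, _, s2 = heapq.heappop(heap)
            for s in s1 + s2:  # merging two subtrees deepens their leaves by 1
                lengths[s] += 1
            heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
            counter += 1
        return lengths

    probs = [0.4, 0.3, 0.2, 0.1]  # illustrative memoryless source
    lengths = huffman_lengths(probs)
    avg_len = sum(p * l for p, l in zip(probs, lengths))
    entropy = -sum(p * math.log2(p) for p in probs)
    print(f"avg length {avg_len:.3f} bits, entropy {entropy:.3f} bits, "
          f"redundancy {avg_len - entropy:.3f}")

For this source the average length is 1.9 bits against an entropy of about 1.846 bits, a redundancy of roughly 0.054 bits per symbol.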

Product Details

ISBN-13: 9781108474443
Publisher: Cambridge University Press
Publication date: 09/07/2023
Pages: 380
Product dimensions: 7.28 in (w) x 10.28 in (h) x 1.06 in (d)

About the Author

Michael Drmota is Professor of Discrete Mathematics at TU Wien. His research activities range from analytic combinatorics and discrete random structures to number theory. He has published several books, including 'Random Trees' (2009), and about 200 research articles. He was President of the Austrian Mathematical Society from 2010 to 2013 and has been a Corresponding Member of the Austrian Academy of Sciences since 2013.

Wojciech Szpankowski is the Saul Rosen Distinguished Professor of Computer Science at Purdue University, where he teaches and conducts research in analysis of algorithms, information theory, analytic combinatorics, random structures, and machine learning for classical and quantum data. He has received the Inaugural Arden L. Bement Jr. Award (2015) and the Flajolet Lecture Prize (2020), among other honors. In 2021, he was elected to the Academia Europaea. In 2008, he launched the interdisciplinary Institute for Science of Information, and in 2010 he became Director of the NSF Science and Technology Center for Science of Information.

Table of Contents

Part I. Known Sources:
1. Preliminaries
2. Shannon and Huffman FV codes
3. Tunstall and Khodak VF codes
4. Divide-and-conquer VF codes
5. Khodak VV codes
6. Non-prefix one-to-one codes
7. Advanced data structures: tree compression
8. Graph and structure compression
Part II. Universal Codes:
9. Minimax redundancy and regret
10. Redundancy of universal memoryless sources
11. Markov types and redundancy for Markov sources
12. Non-Markovian sources: redundancy of renewal processes
Appendices:
A. Probability
B. Generating functions
C. Complex asymptotics
D. Mellin transform and Tauberian theorems
E. Exponential sums and uniform distribution mod 1
F. Diophantine approximation
References
Index