Learning with the Minimum Description Length Principle

by Kenji Yamanishi

eBook (2023)

$129.00 

Overview

This book introduces readers to the minimum description length (MDL) principle and its applications in learning. MDL is a fundamental principle for inductive inference, used widely in statistical modeling, pattern recognition, and machine learning. At its core, it rests on the premise that “the shortest code length leads to the best strategy for learning anything from data.” MDL thus provides a broad and unifying view of statistical inference tasks such as estimation, prediction, and hypothesis testing, as well as machine learning.

The content covers the theoretical foundations of MDL as well as broad practical areas such as change and anomaly detection, problems involving latent variable models, and high-dimensional statistical inference. The book offers an easy-to-follow guide to the MDL principle, compares it with other information criteria, and explains the differences between their standpoints.

Written in a systematic, concise and comprehensive style, this book is suitable for researchers and graduate students of machine learning, statistics, information theory and computer science.
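
To make the core premise above concrete (“the shortest code length leads to the best strategy for learning anything from data”), here is a minimal sketch that is not taken from the book: a crude two-part MDL criterion for choosing a polynomial degree. Assuming Gaussian residuals, the code length of the data given a model with k fitted parameters is approximated by (n/2) log(RSS/n) and the code length of the model by (k/2) log n; the degree with the smallest total is selected. The function name and the synthetic data are invented for illustration.

    # Illustrative sketch only (not from the book): crude two-part MDL
    # model selection for polynomial regression, assuming Gaussian residuals.
    import numpy as np

    def two_part_mdl_degree(x, y, max_degree=6):
        """Return the polynomial degree with the smallest two-part code length."""
        n = len(y)
        best_degree, best_length = 0, np.inf
        for degree in range(max_degree + 1):
            coefs = np.polyfit(x, y, degree)          # fit the candidate model
            rss = float(np.sum((y - np.polyval(coefs, x)) ** 2))
            k = degree + 1                            # number of fitted parameters
            data_cost = 0.5 * n * np.log(rss / n)     # approx. code length of data given model (nats)
            model_cost = 0.5 * k * np.log(n)          # approx. code length of the model
            if data_cost + model_cost < best_length:
                best_degree, best_length = degree, data_cost + model_cost
        return best_degree

    # Synthetic example: data generated from a quadratic with small noise.
    rng = np.random.default_rng(0)
    x = np.linspace(-1.0, 1.0, 200)
    y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(scale=0.1, size=x.size)
    print(two_part_mdl_degree(x, y))   # typically prints 2

Richer refinements of this two-part idea, such as stochastic complexity, are among the topics the book develops.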


Product Details

ISBN-13: 9789819917907
Publisher: Springer-Verlag New York, LLC
Publication date: 09/14/2023
Sold by: Barnes & Noble
Format: eBook
File size: 46 MB
Note: This product may take a few minutes to download.

About the Author

Kenji Yamanishi is a Professor at the Graduate School of Information Science and Technology, University of Tokyo, Japan. After completing his master's degree at the Graduate School of the University of Tokyo, he joined NEC Corporation in 1987. He received his doctorate in Engineering from the University of Tokyo in 1992 and joined the University's faculty in 2009. His research interests and contributions are in the theory of the minimum description length principle, information-theoretic learning theory, and data science applications such as anomaly detection and text mining.

Table of Contents

Information and Coding
Parameter Estimation
Model Selection
Latent Variable Model Selection
Sequential Prediction
MDL Change Detection
Continuous Model Selection
Extension of Stochastic Complexity
Mathematical Preliminaries