Information Theory, Inference and Learning Algorithms / Edition 1, by David J. C. MacKay
Pub. Date: 10/31/2003
Publisher: Cambridge University Press
Most Helpful Customer Reviews
This book was my route from physics into the world of information theory, machine learning and inference. Always entertaining, interactive and filled with intuitions: I still use it every day.
Over the last year and a half this book has been invaluable (and parts of it a fun diversion). For a course I help teach, the introductions to probability theory and information theory save a lot of work. They are accessible to students with a variety of backgrounds (they understand them and can read them online), and they lead directly into interesting problems.

While I am not directly studying data compression or error-correcting codes, I found these sections compelling: incredibly clear exposition and exciting challenges. How can we ever be certain of our data after bouncing it across the world and storing it on error-prone media (things I do every day)? How can we do it without more than 60 hard disks sitting in our computer? The mathematics uses very clear notation: functions are sketched when introduced, and theorems are presented alongside pictures and explanations of what's really going on.

I should note that a small number (roughly 4 or 5 out of 50) of the chapters on advanced topics are much more terse than the majority of the book. They might not be of interest to all readers, but if they are, they are probably friendlier than a journal paper on the same topic.

Most importantly for me, the book is a valuable reference for Bayesian methods, on which MacKay is an authority. Sections IV and V brought me up to speed with several advanced topics I need for my research.
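For a flavour of the "certainty from unreliable media" question the review raises, here is a minimal sketch of the repetition code the book opens with: every bit is transmitted three times and decoded by majority vote, buying reliability at the cost of a threefold drop in rate. (The function names and the four-bit message are illustrative, not from the book.)

```python
def encode_r3(bits):
    """Repetition code: send each bit three times."""
    return [b for b in bits for _ in range(3)]

def decode_r3(received):
    """Majority-vote each block of three received bits."""
    return [int(sum(received[i:i + 3]) >= 2)
            for i in range(0, len(received), 3)]

message = [1, 0, 1, 1]
codeword = encode_r3(message)   # [1,1,1, 0,0,0, 1,1,1, 1,1,1]

corrupted = codeword[:]
corrupted[1] ^= 1               # channel flips one bit in the first block
assert decode_r3(corrupted) == message  # a single flip per block is corrected
```

This is the crudest possible error-correcting code; the book's point is that far cleverer codes approach the channel capacity without anything like this much redundancy, which is why the "more than 60 hard disks" scenario is avoidable.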