As one of the most comprehensive machine learning texts around, this book does justice to the field's incredible richness, but without losing sight of the unifying principles. Peter Flach's clear, example-based approach begins by discussing how a spam filter works, which gives an immediate introduction to machine learning in action, with a minimum of technical fuss. Flach provides case studies of increasing complexity and variety with well-chosen examples and illustrations throughout. He covers a wide range of logical, geometric and statistical models and state-of-the-art topics such as matrix factorisation and ROC analysis. Particular attention is paid to the central role played by features. The use of established terminology is balanced with the introduction of new and useful concepts, and summaries of relevant background material are provided with pointers for revision if necessary. These features ensure Machine Learning will set a new standard as an introductory textbook.
Publisher: Cambridge University Press
Edition description: New Edition
Product dimensions: 7.40(w) x 9.60(h) x 0.90(d)
About the Author
Peter Flach has more than twenty years of experience in machine learning teaching and research. He is Editor-in-Chief of the journal Machine Learning, and was Program Co-Chair of the 2009 ACM Conference on Knowledge Discovery and Data Mining and the 2012 European Conference on Machine Learning and Data Mining. His research spans all aspects of machine learning, from knowledge representation and the use of logic to learn from highly structured data, to the analysis and evaluation of machine learning models and methods, to large-scale data mining. He is particularly known for his innovative use of Receiver Operating Characteristic (ROC) analysis for understanding and improving machine learning methods. These innovations have proved their effectiveness in a number of invited talks and tutorials, and now form the backbone of this book.
Table of Contents
Prologue: a machine learning sampler; 1. The ingredients of machine learning; 2. Binary classification and related tasks; 3. Beyond binary classification; 4. Concept learning; 5. Tree models; 6. Rule models; 7. Linear models; 8. Distance-based models; 9. Probabilistic models; 10. Features; 11. In brief: model ensembles; 12. In brief: machine learning experiments; Epilogue: where to go from here; Important points to remember; Bibliography; Index.