Applied Machine Learning

by David Forsyth
ISBN-10: 3030181138
ISBN-13: 9783030181130
Pub. Date: 07/13/2019
Publisher: Springer International Publishing

Overview

Machine learning methods are now an important tool for scientists, researchers, engineers, and students in a wide range of areas. This book is written for people who want to adopt and use the main tools of machine learning but aren't necessarily going to become machine learning researchers. Intended for students in final-year undergraduate or first-year graduate computer science programs in machine learning, this textbook is a machine learning toolkit. Applied Machine Learning covers many topics for people who want to use machine learning processes to get things done, with a strong emphasis on using existing tools and packages rather than writing one's own code.

A companion to the author's Probability and Statistics for Computer Science (PSCS), this book picks up where the earlier book left off (but also supplies a summary of probability that the reader can use).

Emphasizing the usefulness of standard machinery from applied statistics, this textbook gives an overview of the major applied areas in learning, including coverage of:
• classification using standard machinery (naive Bayes; nearest neighbor; SVM), with a brief sketch of the tools-first workflow after this list
• clustering and vector quantization (largely as in PSCS)
• PCA (largely as in PSCS)
• variants of PCA (NIPALS; latent semantic analysis; canonical correlation analysis)
• linear regression (largely as in PSCS)
• generalized linear models including logistic regression
• model selection with the Lasso and elastic net
• robustness and M-estimators
• Markov chains and HMMs (largely as in PSCS)
• EM in fairly gory detail; long experience teaching this suggests one detailed example is required, which students hate; but once they’ve been through that, the next one is easy
• simple graphical models (in the variational inference section)
• classification with neural networks, with a particular emphasis on image classification
• autoencoding with neural networks
• structure learning
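
To illustrate the tools-first emphasis above, here is a minimal sketch (not taken from the book) that fits the three standard classifiers from the first bullet using scikit-learn; the choice of scikit-learn, the iris dataset, and plain accuracy as the score are illustrative assumptions.

  # A minimal sketch of the "use existing packages" approach: comparing the
  # standard classifiers named above (naive Bayes, nearest neighbor, SVM)
  # with scikit-learn. The dataset (iris) and the accuracy score are
  # illustrative assumptions, not taken from the book.
  from sklearn.datasets import load_iris
  from sklearn.model_selection import train_test_split
  from sklearn.naive_bayes import GaussianNB
  from sklearn.neighbors import KNeighborsClassifier
  from sklearn.svm import SVC
  from sklearn.metrics import accuracy_score

  X, y = load_iris(return_X_y=True)
  X_train, X_test, y_train, y_test = train_test_split(
      X, y, test_size=0.25, random_state=0)

  classifiers = {
      "naive Bayes": GaussianNB(),
      "nearest neighbor": KNeighborsClassifier(n_neighbors=5),
      "SVM": SVC(kernel="rbf"),
  }

  for name, clf in classifiers.items():
      clf.fit(X_train, y_train)                           # train on the held-in split
      acc = accuracy_score(y_test, clf.predict(X_test))   # evaluate on held-out data
      print(f"{name}: test accuracy {acc:.3f}")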

Product Details

ISBN-13: 9783030181130
Publisher: Springer International Publishing
Publication date: 07/13/2019
Edition description: 1st ed. 2019
Pages: 494
Product dimensions: 7.01(w) x 10.00(h) x (d)

About the Author

David Forsyth grew up in Cape Town. He received a B.Sc. (Elec. Eng.) from the University of the Witwatersrand, Johannesburg, in 1984, an M.Sc. (Elec. Eng.) from the same university in 1986, and a D.Phil. from Balliol College, Oxford, in 1989. He spent three years on the faculty at the University of Iowa and ten years on the faculty at the University of California at Berkeley, and then moved to the University of Illinois. He served as program co-chair for IEEE Computer Vision and Pattern Recognition (CVPR) in 2000, 2011, 2018, and 2021; as general co-chair for CVPR 2006 and ICCV 2019; and as program co-chair for the European Conference on Computer Vision 2008, and he is a regular member of the program committee of all major international conferences on computer vision. He has served six terms on the SIGGRAPH program committee. In 2006 he received an IEEE technical achievement award, in 2009 he was named an IEEE Fellow, and in 2014 he was named an ACM Fellow. He served as Editor-in-Chief of IEEE TPAMI from 2014 to 2017. He is lead co-author of Computer Vision: A Modern Approach, a computer vision textbook that ran to two editions and four languages. He is sole author of Probability and Statistics for Computer Science, which provides the background for this book. Among a variety of odd hobbies, he is a compulsive diver, certified up to normoxic trimix level.

Table of Contents

1. Learning to Classify
2. SVM's and Random Forests
3. A Little Learning Theory
4. High-dimensional Data
5. Principal Component Analysis
6. Low Rank Approximations
7. Canonical Correlation Analysis
8. Clustering
9. Clustering using Probability Models
10. Regression
11. Regression: Choosing and Managing Models
12. Boosting
13. Hidden Markov Models
14. Learning Sequence Models Discriminatively
15. Mean Field Inference
16. Simple Neural Networks
17. Simple Image Classifiers
18. Classifying Images and Detecting Objects
19. Small Codes for Big Signals
Index