Boosting: Foundations and Algorithms

by Robert E. Schapire, Yoav Freund

Overview

An accessible introduction and essential reference for an approach to machine learning that creates highly accurate prediction rules by combining many weak and inaccurate ones.

Boosting is an approach to machine learning based on the idea of creating a highly accurate predictor by combining many weak and inaccurate “rules of thumb.” A remarkably rich theory has evolved around boosting, with connections to a range of topics, including statistics, game theory, convex optimization, and information geometry. Boosting algorithms have also enjoyed practical success in such fields as biology, vision, and speech processing. At various times in its history, boosting has been perceived as mysterious, controversial, even paradoxical.
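
To make the weak-to-strong idea concrete, here is a minimal sketch of AdaBoost, the algorithm at the heart of the book. This is an illustrative reimplementation, not code from the book: the decision-stump weak learner and the names fit_stump, stump_predict, adaboost, and predict are our own choices, and it assumes NumPy and labels in {-1, +1}.

# Minimal AdaBoost sketch (illustrative only; not code from the book).
import numpy as np

def fit_stump(X, y, w):
    # Exhaustively search one-feature threshold rules ("rules of thumb")
    # for the stump with the smallest weighted training error under w.
    best_err, best_rule = np.inf, None
    for j in range(X.shape[1]):
        for thresh in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = np.where(X[:, j] <= thresh, sign, -sign)
                err = w[pred != y].sum()
                if err < best_err:
                    best_err, best_rule = err, (j, thresh, sign)
    return best_err, best_rule

def stump_predict(rule, X):
    j, thresh, sign = rule
    return np.where(X[:, j] <= thresh, sign, -sign)

def adaboost(X, y, T=20):
    # y must take values in {-1, +1}.
    n = len(y)
    w = np.full(n, 1.0 / n)                    # start with uniform example weights
    ensemble = []
    for _ in range(T):
        err, rule = fit_stump(X, y, w)
        err = min(max(err, 1e-12), 1 - 1e-12)  # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)  # this weak rule's vote weight
        pred = stump_predict(rule, X)
        w *= np.exp(-alpha * y * pred)         # up-weight misclassified examples
        w /= w.sum()
        ensemble.append((alpha, rule))
    return ensemble

def predict(ensemble, X):
    # Final strong classifier: sign of the weighted-majority vote.
    votes = sum(alpha * stump_predict(rule, X) for alpha, rule in ensemble)
    return np.sign(votes)

Each round re-weights the training examples so that the next weak rule concentrates on the examples the current ensemble gets wrong; the final prediction is a weighted-majority vote over all the weak rules.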

This book, written by the inventors of the method, brings together, organizes, simplifies, and substantially extends two decades of research on boosting, presenting both theory and applications in a way that is accessible to readers from diverse backgrounds while also providing an authoritative reference for advanced researchers. With its introductory treatment of all material and its inclusion of exercises in every chapter, the book is appropriate for course use as well.

The book begins with a general introduction to machine learning algorithms and their analysis; then explores the core theory of boosting, especially its ability to generalize; examines some of the myriad other theoretical viewpoints that help to explain and understand boosting; provides practical extensions of boosting for more complex learning problems; and finally presents a number of advanced theoretical topics. Numerous applications and practical illustrations are offered throughout.


Product Details

ISBN-13: 9780262526036
Publisher: MIT Press
Publication date: 01/10/2014
Series: Adaptive Computation and Machine Learning
Pages: 544
Product dimensions: 6.60(w) x 8.60(h) x 1.00(d)
Age Range: 18 Years

About the Author

Robert E. Schapire is Principal Researcher at Microsoft Research in New York City. For their work on boosting, Freund and Schapire received both the Gödel Prize in 2003 and the Kanellakis Theory and Practice Award in 2004.

Yoav Freund is Professor of Computer Science at the University of California, San Diego. For their work on boosting, Freund and Schapire received both the Gödel Prize in 2003 and the Kanellakis Theory and Practice Award in 2004.

Table of Contents

Series Foreword xi
Preface xiii

1 Introduction and Overview 1
1.1 Classification Problems and Machine Learning 2
1.2 Boosting 4
1.3 Resistance to Overfitting and the Margins Theory 14
1.4 Foundations and Algorithms 17
Summary 19
Bibliographic Notes 19
Exercises 20

I Core Analysis 21

2 Foundations of Machine Learning 23
2.1 A Direct Approach to Machine Learning 24
2.2 General Methods of Analysis 30
2.3 A Foundation for the Study of Boosting Algorithms 43
Summary 49
Bibliographic Notes 49
Exercises 50

3 Using AdaBoost to Minimize Training Error 53
3.1 A Bound on AdaBoost's Training Error 54
3.2 A Sufficient Condition for Weak Learnability 56
3.3 Relation to Chernoff Bounds 60
3.4 Using and Designing Base Learning Algorithms 62
Summary 70
Bibliographic Notes 71
Exercises 71

4 Direct Bounds on the Generalization Error 75
4.1 Using VC Theory to Bound the Generalization Error 75
4.2 Compression-Based Bounds 83
4.3 The Equivalence of Strong and Weak Learnability 86
Summary 88
Bibliographic Notes 89
Exercises 89

5 The Margins Explanation for Boosting's Effectiveness 93
5.1 Margin as a Measure of Confidence 94
5.2 A Margins-Based Analysis of the Generalization Error 97
5.3 Analysis Based on Rademacher Complexity 106
5.4 The Effect of Boosting on Margin Distributions 111
5.5 Bias, Variance, and Stability 117
5.6 Relation to Support-Vector Machines 122
5.7 Practical Applications of Margins 128
Summary 132
Bibliographic Notes 132
Exercises 134

II Fundamental Perspectives 139

6 Game Theory, Online Learning, and Boosting 141
6.1 Game Theory 142
6.2 Learning in Repeated Game Playing 145
6.3 Online Prediction 153
6.4 Boosting 157
6.5 Application to a "Mind-Reading" Game 163
Summary 169
Bibliographic Notes 169
Exercises 170

7 Loss Minimization and Generalizations of Boosting 175
7.1 AdaBoost's Loss Function 177
7.2 Coordinate Descent 179
7.3 Loss Minimization Cannot Explain Generalization 184
7.4 Functional Gradient Descent 188
7.5 Logistic Regression and Conditional Probabilities 194
7.6 Regularization 202
7.7 Applications to Data-Limited Learning 211
Summary 219
Bibliographic Notes 219
Exercises 220

8 Boosting, Convex Optimization, and Information Geometry 227
8.1 Iterative Projection Algorithms 228
8.2 Proving the Convergence of AdaBoost 243
8.3 Unification with Logistic Regression 252
8.4 Application to Species Distribution Modeling 255
Summary 260
Bibliographic Notes 262
Exercises 263

III Algorithmic Extensions 269

9 Using Confidence-Rated Weak Predictions 271
9.1 The Framework 273
9.2 General Methods for Algorithm Design 275
9.3 Learning Rule-Sets 287
9.4 Alternating Decision Trees 290
Summary 296
Bibliographic Notes 297
Exercises 297

10 Multiclass Classification Problems 303
10.1 A Direct Extension to the Multiclass Case 305
10.2 The One-against-All Reduction and Multi-label Classification 310
10.3 Application to Semantic Classification 316
10.4 General Reductions Using Output Codes 320
Summary 333
Bibliographic Notes 333
Exercises 334

11 Learning to Rank 341
11.1 A Formal Framework for Ranking Problems 342
11.2 A Boosting Algorithm for the Ranking Task 345
11.3 Methods for Improving Efficiency 351
11.4 Multiclass, Multi-label Classification 361
11.5 Applications 364
Summary 367
Bibliographic Notes 369
Exercises 369

IV Advanced Theory 375

12 Attaining the Best Possible Accuracy 377
12.1 Optimality in Classification and Risk Minimization 378
12.2 Approaching the Optimal Risk 382
12.3 How Minimizing Risk Can Lead to Poor Accuracy 398
Summary 406
Bibliographic Notes 406
Exercises 407

13 Optimally Efficient Boosting 415
13.1 The Boost-by-Majority Algorithm 416
13.2 Optimal Generalization Error 432
13.3 Relation to AdaBoost 448
Summary 453
Bibliographic Notes 453
Exercises 453

14 Boosting in Continuous Time 459
14.1 Adaptiveness in the Limit of Continuous Time 460
14.2 BrownBoost 468
14.3 AdaBoost as a Special Case of BrownBoost 476
14.4 Experiments with Noisy Data 483
Summary 485
Bibliographic Notes 486
Exercises 486

Appendix: Some Notation, Definitions, and Mathematical Background 491
A.1 General Notation 491
A.2 Norms 492
A.3 Maxima, Minima, Suprema, and Infima 493
A.4 Limits 493
A.5 Continuity, Closed Sets, and Compactness 494
A.6 Derivatives, Gradients, and Taylor's Theorem 495
A.7 Convexity 496
A.8 The Method of Lagrange Multipliers 497
A.9 Some Distributions and the Central Limit Theorem 498

Bibliography 501
Index of Algorithms, Figures, and Tables 511
Subject and Author Index 513

What People are Saying About This

Peter Bartlett, University of California, Berkeley

An outstanding text, which provides an authoritative, self-contained, broadly accessible and very readable treatment of boosting methods, a widely applied family of machine learning algorithms pioneered by the authors. It nicely covers the spectrum from theory through methodology to applications.

Bin Yu, University of California, Berkeley

Boosting is an amazing machine learning algorithm of 'intelligence' with much success in practice. It allows a weak learner to adapt to the data at hand and become 'strong'; it seamlessly integrates statistical estimation and computation. In this book, Robert Schapire and Yoav Freund, two inventors of the field, present multiple, fascinating views of boosting to explain why and how it works.

John Lafferty, University of Chicago and Carnegie Mellon University

Boosting has provided a platform for thinking about and designing machine learning algorithms for over 20 years. The simple and elegant idea behind boosting is a 'Mirror of Erised' that researchers view from many different perspectives. This book beautifully ties together these views, using the same limpid style found in Robert Schapire and Yoav Freund's original research papers. It's an important resource for machine learning research.

Trevor Hastie, Statistics Department, Stanford University

Robert Schapire and Yoav Freund made a huge impact in machine and statistical learning with their invention of boosting, which has survived the test of time. There have been lively discussions about alternative explanations of why it works so well, and the jury is still out. This well-balanced book from the 'masters' covers boosting from all points of view, and gives easy access to the wealth of research that this field has produced.
