Boosting

Foundations and Algorithms
 
 
The MIT Press
  • 1st edition
  • Published September 11, 2017
  • 544 pages
 
E-book | PDF with Adobe DRM | System requirements
978-0-262-30118-3 (ISBN)
 
Boosting is an approach to machine learning based on the idea of creating a highly accurate predictor by combining many weak and inaccurate "rules of thumb." A remarkably rich theory has evolved around boosting, with connections to a range of topics including statistics, game theory, convex optimization, and information geometry. Boosting algorithms have also enjoyed practical success in such fields as biology, vision, and speech processing. At various times in its history, boosting has been perceived as mysterious, controversial, even paradoxical. This book, written by the inventors of the method, brings together, organizes, simplifies, and substantially extends two decades of research on boosting, presenting both theory and applications in a way that is accessible to readers from diverse backgrounds while also providing an authoritative reference for advanced researchers. With its introductory treatment of all material and its inclusion of exercises in every chapter, the book is appropriate for course use as well.

The book begins with a general introduction to machine learning algorithms and their analysis. It then explores the core theory of boosting, especially its ability to generalize; examines some of the myriad other theoretical viewpoints that help to explain and understand boosting; provides practical extensions of boosting for more complex learning problems; and finally presents a number of advanced theoretical topics. Numerous applications and practical illustrations are offered throughout.
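To make the "rules of thumb" idea concrete, the sketch below shows a minimal AdaBoost, the algorithm at the heart of the book, with one-feature decision stumps as the weak learners. The NumPy implementation, the function names (stump_fit, adaboost_fit, and so on), and the toy data are illustrative assumptions of this listing, not material from the book itself.

    # Minimal AdaBoost sketch (illustrative, not from the book).
    # Weak "rules of thumb" are one-feature threshold stumps; labels must be +1/-1.
    import numpy as np

    def stump_fit(X, y, w):
        """Pick the single-feature threshold rule with the lowest weighted error."""
        best = None
        for j in range(X.shape[1]):
            for thr in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = sign * np.where(X[:, j] >= thr, 1, -1)
                    err = np.sum(w[pred != y])
                    if best is None or err < best[0]:
                        best = (err, j, thr, sign)
        return best  # (weighted error, feature index, threshold, polarity)

    def stump_predict(stump, X):
        _, j, thr, sign = stump
        return sign * np.where(X[:, j] >= thr, 1, -1)

    def adaboost_fit(X, y, T=20):
        m = len(y)
        w = np.full(m, 1.0 / m)            # start with uniform example weights
        ensemble = []
        for _ in range(T):
            stump = stump_fit(X, y, w)
            err = max(stump[0], 1e-12)     # guard against a perfect stump
            alpha = 0.5 * np.log((1 - err) / err)
            pred = stump_predict(stump, X)
            w *= np.exp(-alpha * y * pred)  # up-weight mistakes, down-weight hits
            w /= w.sum()                    # renormalize to a distribution
            ensemble.append((alpha, stump))
        return ensemble

    def adaboost_predict(ensemble, X):
        score = sum(a * stump_predict(s, X) for a, s in ensemble)
        return np.sign(score)

    # Toy usage (assumed data):
    X = np.array([[1.0], [2.0], [3.0], [4.0]])
    y = np.array([-1, -1, 1, 1])
    H = adaboost_fit(X, y, T=5)
    print(adaboost_predict(H, X))   # -> [-1. -1.  1.  1.]

Each round reweights the training examples so that later rules of thumb concentrate on the cases earlier ones got wrong; the weighted vote over all rounds is the final strong predictor.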
  • English
  • Cambridge, Mass., USA
ISBN-13: 978-0-262-30118-3
ISBN-10: 0262301180
  • Contents
  • Series Foreword
  • Preface
  • 1 Introduction and Overview
  • 1.1 Classification Problems and Machine Learning
  • 1.2 Boosting
  • 1.3 Resistance to Overfitting and the Margins Theory
  • 1.4 Foundations and Algorithms
  • I CORE ANALYSIS
  • 2 Foundations of Machine Learning
  • 2.1 A Direct Approach to Machine Learning
  • 2.2 General Methods of Analysis
  • 2.3 A Foundation for the Study of Boosting Algorithms
  • 3 Using AdaBoost to Minimize Training Error
  • 3.1 A Bound on AdaBoost's Training Error
  • 3.2 A Sufficient Condition for Weak Learnability
  • 3.3 Relation to Chernoff Bounds
  • 3.4 Using and Designing Base Learning Algorithms
  • 4 Direct Bounds on the Generalization Error
  • 4.1 Using VC Theory to Bound the Generalization Error
  • 4.2 Compression-Based Bounds
  • 4.3 The Equivalence of Strong and Weak Learnability
  • 5 The Margins Explanation for Boosting's Effectiveness
  • 5.1 Margin as a Measure of Confidence
  • 5.2 A Margins-Based Analysis of the Generalization Error
  • 5.3 Analysis Based on Rademacher Complexity
  • 5.4 The Effect of Boosting on Margin Distributions
  • 5.5 Bias, Variance, and Stability
  • 5.6 Relation to Support-Vector Machines
  • 5.7 Practical Applications of Margins
  • II FUNDAMENTAL PERSPECTIVES
  • 6 Game Theory, Online Learning, and Boosting
  • 6.1 Game Theory
  • 6.2 Learning in Repeated Game Playing
  • 6.3 Online Prediction
  • 6.4 Boosting
  • 6.5 Application to a "Mind-Reading" Game
  • 7 Loss Minimization and Generalizations of Boosting
  • 7.1 AdaBoost's Loss Function
  • 7.2 Coordinate Descent
  • 7.3 Loss Minimization Cannot Explain Generalization
  • 7.4 Functional Gradient Descent
  • 7.5 Logistic Regression and Conditional Probabilities
  • 7.6 Regularization
  • 7.7 Applications to Data-Limited Learning
  • 8 Boosting, Convex Optimization, and Information Geometry
  • 8.1 Iterative Projection Algorithms
  • 8.2 Proving the Convergence of AdaBoost
  • 8.3 Unification with Logistic Regression
  • 8.4 Application to Species Distribution Modeling
  • III ALGORITHMIC EXTENSIONS
  • 9 Using Confidence-Rated Weak Predictions
  • 9.1 The Framework
  • 9.2 General Methods for Algorithm Design
  • 9.3 Learning Rule-Sets
  • 9.4 Alternating Decision Trees
  • 10 Multiclass Classification Problems
  • 10.1 A Direct Extension to the Multiclass Case
  • 10.2 The One-against-All Reduction and Multi-label Classification
  • 10.3 Application to Semantic Classification
  • 10.4 General Reductions Using Output Codes
  • 11 Learning to Rank
  • 11.1 A Formal Framework for Ranking Problems
  • 11.2 A Boosting Algorithm for the Ranking Task
  • 11.3 Methods for Improving Efficiency
  • 11.4 Multiclass, Multi-label Classification
  • 11.5 Applications
  • IV ADVANCED THEORY
  • 12 Attaining the Best Possible Accuracy
  • 12.1 Optimality in Classification and Risk Minimization
  • 12.2 Approaching the Optimal Risk
  • 12.3 How Minimizing Risk Can Lead to Poor Accuracy
  • 13 Optimally Efficient Boosting
  • 13.1 The Boost-by-Majority Algorithm
  • 13.2 Optimal Generalization Error
  • 13.3 Relation to AdaBoost
  • 14 Boosting in Continuous Time
  • 14.1 Adaptiveness in the Limit of Continuous Time
  • 14.2 BrownBoost
  • 14.3 AdaBoost as a Special Case of BrownBoost
  • 14.4 Experiments with Noisy Data
  • Appendix: Some Notation, Definitions, and Mathematical Background
  • A.1 General Notation
  • A.2 Norms
  • A.3 Maxima, Minima, Suprema, and Infima
  • A.4 Limits
  • A.5 Continuity, Closed Sets, and Compactness
  • A.6 Derivatives, Gradients, and Taylor's Theorem
  • A.7 Convexity
  • A.8 The Method of Lagrange Multipliers
  • A.9 Some Distributions and the Central Limit Theorem
  • Bibliography
  • Index of Algorithms, Figures, and Tables
  • Subject and Author Index

File format: PDF
Copy protection: Adobe DRM (Digital Rights Management)

System requirements:

Computer (Windows; MacOS X; Linux): Install the free Adobe Digital Editions software before downloading (see e-book help).

Tablet/smartphone (Android; iOS): Install the free Adobe Digital Editions app before downloading (see e-book help).

E-book readers: Bookeen, Kobo, Pocketbook, Sony, Tolino, and many others (not Kindle)

The PDF format displays each book page identically on any hardware. PDF is therefore well suited to complex layouts of the kind used in textbooks and technical books (images, tables, columns, footnotes). On the small displays of e-readers or smartphones, PDFs are unfortunately rather cumbersome because too much scrolling is required. Adobe DRM applies a "hard" form of copy protection: if the necessary requirements are not met, you will not be able to open the e-book, so you must prepare your reading hardware before downloading.

Further information can be found in our e-book help.


Download (available immediately)

€106.59
incl. 19% VAT
Download / single license
PDF with Adobe DRM
see system requirements
