Learning Probabilistic Graphical Models in R

Packt Publishing Limited
  • 1st edition
  • |
  • published on 29 April 2016
  • |
  • 250 pages
E-book | ePUB with Adobe DRM | system requirements
978-1-78439-741-8 (ISBN)
Familiarize yourself with probabilistic graphical models through real-world problems and illustrative code examples in R.

About This Book
  • Build and use probabilistic graphical models (PGMs) as expert systems
  • Comprehend how your computer can learn Bayesian modeling to solve real-world problems
  • Know how to prepare data and feed the models using the appropriate algorithms from the appropriate R package

Who This Book Is For
This book is for anyone who has to deal with lots of data and draw conclusions from it, especially when the data is noisy or uncertain. Data scientists, machine learning enthusiasts, engineers, and anyone curious about the latest advances in machine learning will find PGMs interesting.

What You Will Learn
  • Understand the concepts of PGMs and which type of PGM to use for which problem
  • Tune a model's parameters and explore new models automatically
  • Understand the basic principles of Bayesian models, from simple to advanced
  • Transform the classical linear regression model into a powerful probabilistic model
  • Use standard industry models, but with the power of PGMs
  • Understand the advanced models used throughout today's industry
  • See how to compute posterior distributions with exact and approximate inference algorithms

In Detail
Probabilistic graphical models (PGMs, also known as graphical models) are a marriage between probability theory and graph theory. Generally, PGMs use a graph-based representation. Two branches of graphical representations of distributions are commonly used, namely Bayesian networks and Markov networks, and R has many packages that implement them.

We'll start by showing you how to transform a classical statistical model into a modern PGM and then look at how to do exact inference in graphical models. Proceeding, we'll introduce you to many modern R packages that will help you perform inference on the models. We will then run a Bayesian linear regression, and you'll see the advantage of going probabilistic when you want to make predictions. Next, you'll master using R packages and implementing their techniques. Finally, you'll be presented with machine learning applications that have a direct impact in many fields. Here, we'll cover clustering and the discovery of hidden information in big data, as well as two important methods, PCA and ICA, for reducing the size of big problems.

Style and Approach
This book gives you a detailed, step-by-step explanation of each mathematical concept, which will help you build and analyze your own machine learning models and apply them to real-world problems. The mathematics is kept simple, and each formula is explained thoroughly.
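To give a flavor of the book's opening material (Chapter 1 ends with "A first example of Bayes' rule in R"), here is a minimal sketch of a Bayes' rule computation in base R. The diagnostic-test numbers are purely illustrative assumptions, not taken from the book:

```r
# Bayes' rule: P(Disease | Positive test) from a prior and two likelihoods.
# All numbers below are hypothetical, for illustration only.
prior     <- 0.01   # P(Disease)
sens      <- 0.95   # P(Positive | Disease), the test's sensitivity
false_pos <- 0.05   # P(Positive | No disease), the false-positive rate

# Marginal probability of a positive test (law of total probability)
evidence <- sens * prior + false_pos * (1 - prior)

# Posterior via Bayes' rule
posterior <- sens * prior / evidence
posterior   # about 0.161: even a positive test leaves the disease unlikely
```

The counter-intuitive result, where a 95%-accurate test still yields only a ~16% posterior, is exactly the kind of probabilistic reasoning the book builds on.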
  • English
  • Birmingham
  • |
  • Great Britain
1784397415 (ISBN-10)
David Bellot is a PhD graduate in computer science from INRIA, France, with a focus on Bayesian machine learning. He was a postdoctoral fellow at the University of California, Berkeley, and worked for companies such as Intel, Orange, and Barclays Bank. He currently works in the financial industry, where he develops financial market prediction algorithms using machine learning. He is also a contributor to open source projects such as the Boost C++ library.
  • Cover
  • Copyright
  • Credits
  • About the Author
  • About the Reviewers
  • www.PacktPub.com
  • Table of Contents
  • Preface
  • Chapter 1: Probabilistic Reasoning
  • Machine learning
  • Representing uncertainty with probabilities
  • Beliefs and uncertainty as probabilities
  • Conditional probability
  • Probability calculus and random variables
  • Sample space, events, and probability
  • Random variables and probability calculus
  • Joint probability distributions
  • Bayes' rule
  • Interpreting the Bayes' formula
  • A first example of Bayes' rule
  • A first example of Bayes' rule in R
  • Probabilistic graphical models
  • Probabilistic models
  • Graphs and conditional independence
  • Factorizing a distribution
  • Directed models
  • Undirected models
  • Examples and applications
  • Summary
  • Chapter 2: Exact Inference
  • Building graphical models
  • Types of random variable
  • Building graphs
  • Probabilistic expert system
  • Basic structures in probabilistic graphical models
  • Variable elimination
  • Sum-product and belief updates
  • The junction tree algorithm
  • Examples of probabilistic graphical models
  • The sprinkler example
  • The medical expert system
  • Models with more than two layers
  • Tree structure
  • Summary
  • Chapter 3: Learning Parameters
  • Introduction
  • Learning by inference
  • Maximum likelihood
  • How are empirical and model distribution related?
  • The ML algorithm and its implementation in R
  • Application
  • Learning with hidden variables - the EM algorithm
  • Latent variables
  • Principles of the EM algorithm
  • Derivation of the EM algorithm
  • Applying EM to graphical models
  • Summary
  • Chapter 4: Bayesian Modeling - Basic Models
  • The Naive Bayes model
  • Representation
  • Learning the Naive Bayes model
  • Bayesian Naive Bayes
  • Beta-Binomial
  • The prior distribution
  • The posterior distribution with the conjugacy property
  • Which values should we choose for the Beta parameters?
  • The Gaussian mixture model
  • Definition
  • Summary
  • Chapter 5: Approximate Inference
  • Sampling from a distribution
  • Basic sampling algorithms
  • Standard distributions
  • Rejection sampling
  • An implementation in R
  • Importance sampling
  • An implementation in R
  • Markov Chain Monte-Carlo
  • General idea of the method
  • The Metropolis-Hastings algorithm
  • MCMC for probabilistic graphical models in R
  • Installing Stan and RStan
  • A simple example in RStan
  • Summary
  • Chapter 6: Bayesian Modeling - Linear Models
  • Linear regression
  • Estimating the parameters
  • Bayesian linear models
  • Over-fitting a model
  • Graphical model of a linear model
  • Posterior distribution
  • Implementation in R
  • A stable implementation
  • More packages in R
  • Summary
  • Chapter 7: Probabilistic Mixture Models
  • Mixture models
  • EM for mixture models
  • Mixture of Bernoulli
  • Mixture of experts
  • Latent Dirichlet Allocation
  • The LDA model
  • Variational inference
  • Examples
  • Summary
  • Appendix
  • References
  • Books on the Bayesian theory
  • Books on machine learning
  • Papers
  • Index

File format: EPUB
Copy protection: Adobe DRM (Digital Rights Management)


Computer (Windows; Mac OS X; Linux): Install the free Adobe Digital Editions software before downloading (see e-book help).

Tablet/smartphone (Android; iOS): Install the free Adobe Digital Editions app before downloading (see e-book help).

E-book readers: Bookeen, Kobo, Pocketbook, Sony, Tolino, and many others (not Kindle)

The EPUB file format is well suited to novels and non-fiction books, i.e. "flowing" text without a complex layout. On e-readers and smartphones, the line and page breaks adapt automatically to the small display. Adobe DRM applies "hard" copy protection here: if the necessary requirements are not met, you will unfortunately not be able to open the e-book, so you must prepare your reading hardware before downloading.

You can find further information in our e-book help.

Download (available immediately)

€28.05
incl. 19% VAT
Download / single license
ePUB with Adobe DRM
see system requirements
Order e-book
