Introduction to Bayesian Statistics

 
 
John Wiley & Sons Inc (publisher)
  • 3rd edition
  • published August 23, 2016
  • 624 pages

E-book | PDF with Adobe DRM | System requirements
978-1-118-59315-8 (ISBN)
 
"...this edition is useful and effective in teaching Bayesian inference at both elementary and intermediate levels. It is a well-written book on elementary Bayesian inference, and the material is easily accessible. It is both concise and timely, and provides a good collection of overviews and reviews of important tools used in Bayesian statistical methods."
There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts present only frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used.

In this third edition, four newly added chapters address topics that reflect the rapid advances in the field. The authors continue to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportions, Poisson and normal means, and simple linear regression. More advanced topics are presented in the four new chapters: Bayesian inference for a normal with unknown mean and variance; Bayesian inference for a multivariate normal mean vector; Bayesian inference for the multiple linear regression model; and computational Bayesian statistics, including Markov chain Monte Carlo. These topics help readers advance from a minimal understanding of statistics to tackling the material in more applied, advanced-level books. Minitab macros and R functions are available on the book's related website to assist with the chapter exercises.

Introduction to Bayesian Statistics, Third Edition also features:
* Topics including the joint likelihood function and inference using independent Jeffreys' priors and the joint conjugate prior
* The cutting-edge topic of computational Bayesian statistics in a new chapter, with a unique focus on Markov chain Monte Carlo methods
* Exercises throughout the book that have been updated to reflect new applications and the latest software
* Detailed appendices that guide readers through the use of R and Minitab software for Bayesian analysis and Monte Carlo simulations, with all related macros available on the book's website
Introduction to Bayesian Statistics, Third Edition is a textbook for upper-undergraduate or first-year graduate courses on introductory statistics with a Bayesian emphasis. It can also be used as a reference for statisticians who require a working knowledge of Bayesian statistics. The short sketch below gives a flavor of the kind of analysis the book teaches.
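As an illustration only (this is not code from the book or its website macros), the conjugate beta-binomial analysis of Chapter 8 can be carried out in a few lines of R: a Beta prior on the binomial proportion π is updated to a Beta posterior, from which point estimates and credible intervals are read off directly. The data values here are hypothetical.

    # Hypothetical sketch of a Chapter 8-style beta-binomial analysis
    a <- 1; b <- 1                  # Beta(1, 1): uniform prior on pi
    y <- 7; n <- 20                 # made-up data: 7 successes in 20 trials
    a.post <- a + y                 # conjugacy: posterior is Beta(a + y, b + n - y)
    b.post <- b + n - y
    a.post / (a.post + b.post)      # posterior mean of pi
    qbeta(c(0.025, 0.975), a.post, b.post)  # 95% credible interval for pi

The book's own Minitab macros and R functions (see Appendices C and D) provide this and the other analyses covered in the text.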
3rd edition
  • English
  • New York, USA
  • For professional and research use
  • Revised edition
  • 9.30 MB
978-1-118-59315-8 (ISBN-13: 9781118593158)
1-118-59315-4 (ISBN-10: 1118593154)
WILLIAM M. BOLSTAD, PhD, is a retired Senior Lecturer in the Department of Statistics at The University of Waikato, New Zealand. Dr. Bolstad's research interests include Bayesian statistics, MCMC methods, recursive estimation techniques, multiprocess dynamic time series models, and forecasting. He is author of Understanding Computational Bayesian Statistics, also published by Wiley.
JAMES M. CURRAN is a Professor of Statistics in the Department of Statistics at the University of Auckland, New Zealand. Professor Curran's research interests include the statistical interpretation of forensic evidence, statistical computing, experimental design, and Bayesian statistics. He is the author of two other books, including Introduction to Data Analysis with R for Forensic Scientists, published by CRC Press, an imprint of Taylor & Francis.
  • INTRODUCTION TO BAYESIAN STATISTICS
  • Contents
  • Preface
  • Changes in the Third Edition
  • Our Perspective on Bayesian Statistics
  • Acknowledgments
  • 1 Introduction to Statistical Science
  • 1.1 The Scientific Method: A Process for Learning
  • 1.2 The Role of Statistics in the Scientific Method
  • 1.3 Main Approaches to Statistics
  • Frequentist Approach to Statistics
  • Bayesian Approach to Statistics
  • Monte Carlo Studies
  • 1.4 Purpose and Organization of This Text
  • 2 Scientific Data Gathering
  • 2.1 Sampling from a Real Population
  • Simple Random Sampling (without Replacement)
  • Stratified Random Sampling
  • Non-sampling Errors in Sample Surveys
  • Randomized Response Methods
  • 2.2 Observational Studies and Designed Experiments
  • Observational Study
  • Designed Experiment
  • Monte Carlo Exercises
  • 3 Displaying and Summarizing Data
  • 3.1 Graphically Displaying a Single Variable
  • Dotplot
  • Boxplot (Box-and-Whisker Plot)
  • Stem-and-Leaf Diagram
  • Frequency Table
  • Histogram
  • Cumulative Frequency Polygon
  • 3.2 Graphically Comparing Two Samples
  • 3.3 Measures of Location
  • Mean: Advantages and Disadvantages
  • Median: Advantages and Disadvantages
  • 3.4 Measures of Spread
  • Range: Advantage and Disadvantage
  • Interquartile Range: Advantages and Disadvantages
  • Variance: Advantages and Disadvantages
  • Standard Deviation: Advantages and Disadvantages
  • 3.5 Displaying Relationships Between Two or More Variables
  • Scatterplot
  • Scatterplot Matrix
  • 3.6 Measures of Association for Two or More Variables
  • Covariance and Correlation between Two Variables
  • Exercises
  • 4 Logic, Probability, and Uncertainty
  • 4.1 Deductive Logic and Plausible Reasoning
  • Desired Properties of Plausibility Measures
  • 4.2 Probability
  • 4.3 Axioms of Probability
  • 4.4 Joint Probability and Independent Events
  • 4.5 Conditional Probability
  • 4.6 Bayes' Theorem
  • Bayes' Theorem: The Key to Bayesian Statistics
  • 4.7 Assigning Probabilities
  • 4.8 Odds and Bayes Factor
  • Bayes Factor (B)
  • 4.9 Beat the Dealer
  • Exercises
  • 5 Discrete Random Variables
  • 5.1 Discrete Random Variables
  • 5.2 Probability Distribution of a Discrete Random Variable
  • Expected Value of a Discrete Random Variable
  • The Variance of a Discrete Random Variable
  • The Mean and Variance of a Linear Function of a Random Variable
  • 5.3 Binomial Distribution
  • Characteristics of the Binomial Distribution
  • 5.4 Hypergeometric Distribution
  • Probability Function of Hypergeometric
  • 5.5 Poisson Distribution
  • Characteristics of the Poisson Distribution
  • 5.6 Joint Random Variables
  • Independent Random Variables
  • 5.7 Conditional Probability for Joint Random Variables
  • Exercises
  • 6 Bayesian Inference for Discrete Random Variables
  • 6.1 Two Equivalent Ways of Using Bayes' Theorem
  • 6.2 Bayes' Theorem for Binomial with Discrete Prior
  • Setting up the Table for Bayes' Theorem on Binomial with Discrete Prior
  • 6.3 Important Consequences of Bayes' Theorem
  • 6.4 Bayes' Theorem for Poisson with Discrete Prior
  • Setting up the Table for Bayes' Theorem on Poisson with Discrete Prior
  • Exercises
  • Computer Exercises
  • 7 Continuous Random Variables
  • 7.1 Probability Density Function
  • Mean of a Continuous Random Variable
  • Variance of a Continuous Random Variable
  • 7.2 Some Continuous Distributions
  • Uniform Distribution
  • Beta Family of Distributions
  • Gamma Family of Distributions
  • Normal Distribution
  • 7.3 Joint Continuous Random Variables
  • Conditional Probability Density
  • 7.4 Joint Continuous and Discrete Random Variables
  • Exercises
  • 8 Bayesian Inference for Binomial Proportion
  • 8.1 Using a Uniform Prior
  • 8.2 Using a Beta Prior
  • Conjugate Family of Priors for Binomial Observation is the Beta Family
  • 8.3 Choosing Your Prior
  • Choosing a Conjugate Prior When You Have Vague Prior Knowledge
  • Choosing a Conjugate Prior When You Have Real Prior Knowledge by Matching Location and Scale
  • Precautions Before Using Your Conjugate Prior
  • Constructing a General Continuous Prior
  • Effect of the Prior
  • 8.4 Summarizing the Posterior Distribution
  • Measures of Location
  • Measures of Spread
  • 8.5 Estimating the Proportion
  • 8.6 Bayesian Credible Interval
  • Bayesian Credible Interval for π
  • Exercises
  • Computer Exercises
  • 9 Comparing Bayesian and Frequentist Inferences for Proportion
  • 9.1 Frequentist Interpretation of Probability and Parameters
  • Sampling Distribution of Statistic
  • 9.2 Point Estimation
  • Frequentist Criteria for Evaluating Estimators
  • Unbiased Estimators
  • Minimum Variance Unbiased Estimator
  • Mean Squared Error of an Estimator
  • 9.3 Comparing Estimators for Proportion
  • 9.4 Interval Estimation
  • Confidence Intervals
  • Comparing Confidence and Credible Intervals for π
  • 9.5 Hypothesis Testing
  • 9.6 Testing a One-Sided Hypothesis
  • Frequentist Test of One-Sided Hypothesis
  • 9.7 Testing a Two-Sided Hypothesis
  • Frequentist Test of a Two-Sided Hypothesis
  • Bayesian Test of a Two-Sided Hypothesis
  • Exercises
  • Monte Carlo Exercises
  • 10 Bayesian Inference for Poisson
  • 10.1 Some Prior Distributions for Poisson
  • Summarizing the Posterior Distribution
  • 10.2 Inference for Poisson Parameter
  • Point Estimation
  • Bayesian Credible Interval for μ
  • Bayesian Test of a One-Sided Hypothesis
  • Bayesian Test of a Two-Sided Hypothesis
  • Exercises
  • Computer Exercises
  • 11 Bayesian Inference for Normal Mean
  • 11.1 Bayes' Theorem for Normal Mean with a Discrete Prior
  • For a Single Normal Observation
  • Likelihood of Single Observation
  • Table for Performing Bayes' Theorem
  • For a Random Sample of Normal Observations
  • 11.2 Bayes' Theorem for Normal Mean with a Continuous Prior
  • Flat Prior Density for μ (Jeffreys' Prior for Normal Mean)
  • Normal Prior Density for μ
  • 11.3 Choosing Your Normal Prior
  • 11.4 Bayesian Credible Interval for Normal Mean
  • Known Variance
  • Unknown Variance
  • Nonnormal Prior
  • 11.5 Predictive Density for Next Observation
  • Exercises
  • Computer Exercises
  • 12 Comparing Bayesian and Frequentist Inferences for Mean
  • 12.1 Comparing Frequentist and Bayesian Point Estimators
  • 12.2 Comparing Confidence and Credible Intervals for Mean
  • Relationship between Frequentist Confidence Interval and Bayesian Credible Interval from "Flat" Prior
  • 12.3 Testing a One-Sided Hypothesis about a Normal Mean
  • Frequentist One-Sided Hypothesis Test about μ
  • Bayesian One-Sided Hypothesis Test about μ
  • 12.4 Testing a Two-Sided Hypothesis about a Normal Mean
  • Frequentist Two-Sided Hypothesis Test About μ
  • Bayesian Two-Sided Hypothesis Test about μ
  • Exercises
  • 13 Bayesian Inference for Difference Between Means
  • 13.1 Independent Random Samples from Two Normal Distributions
  • 13.2 Case 1: Equal Variances
  • When the Variance Is Known
  • When the Variance Is Unknown and Flat Priors Are Used
  • 13.3 Case 2: Unequal Variances
  • When the Variances Are Known
  • When the Variances Are Unknown
  • 13.4 Bayesian Inference for Difference Between Two Proportions Using Normal Approximation
  • 13.5 Normal Random Samples from Paired Experiments
  • Take Differences within Each Pair
  • Exercises
  • 14 Bayesian Inference for Simple Linear Regression
  • 14.1 Least Squares Regression
  • The Normal Equations and the Least Squares Line
  • Estimating the Variance around the Least Squares Line
  • 14.2 Exponential Growth Model
  • 14.3 Simple Linear Regression Assumptions
  • 14.4 Bayes' Theorem for the Regression Model
  • The Joint Likelihood for β and αx̄
  • The Joint Prior for β and αx̄
  • The Joint Posterior for β and αx̄
  • Bayesian Credible Interval for Slope
  • Frequentist Confidence Interval for Slope
  • Testing One-Sided Hypothesis about Slope
  • Testing Two-Sided Hypothesis about Slope
  • 14.5 Predictive Distribution for Future Observation
  • Finding the Predictive Distribution
  • Exercises
  • Computer Exercises
  • 15 Bayesian Inference for Standard Deviation
  • 15.1 Bayes' Theorem for Normal Variance with a Continuous Prior
  • 15.2 Some Specific Prior Distributions and the Resulting Posteriors
  • Positive Uniform Prior Density for Variance
  • Positive Uniform Prior Density for Standard Deviation
  • Jeffreys' Prior Density
  • Inverse Chi-squared Prior
  • Inverse Gamma Priors
  • 15.3 Bayesian Inference for Normal Standard Deviation
  • Bayesian Estimators for σ
  • Bayesian Credible Interval for σ
  • Testing a One-Sided Hypothesis about σ
  • Exercises
  • Computer Exercises
  • 16 Robust Bayesian Methods
  • 16.1 Effect of Misspecified Prior
  • 16.2 Bayes' Theorem with Mixture Priors
  • The Mixture Prior
  • The Joint Posterior
  • The Mixture Posterior
  • Summary
  • Exercises
  • Computer Exercises
  • 17 Bayesian Inference for Normal with Unknown Mean and Variance
  • 17.1 The Joint Likelihood Function
  • 17.2 Finding the Posterior when Independent Jeffreys' Priors for μ and σ² Are Used
  • Finding the Marginal Posterior for μ
  • Another Way to Find the Marginal Posterior
  • 17.3 Finding the Posterior when a Joint Conjugate Prior for μ and σ² Is Used
  • The Joint Conjugate Prior
  • Finding the Marginal Posterior for μ
  • An Approximation to the Marginal Posterior for μ
  • 17.4 Difference Between Normal Means with Equal Unknown Variance
  • Finding the Posterior when Independent Jeffreys' Priors are Used for all Parameters
  • Finding the Exact Posterior when the Joint Conjugate Prior Is Used for All Parameters
  • Finding the Approximate Posterior when the Joint Conjugate Prior Is Used for All Parameters
  • 17.5 Difference Between Normal Means with Unequal Unknown Variances
  • Computer Exercises
  • Appendix: Proof that the Exact Marginal Posterior Distribution of μ is Student's t
  • Using Independent Jeffreys' Priors
  • Using the Joint Conjugate Prior
  • Difference Between Means with Independent Jeffreys' Priors
  • Difference Between Means with Joint Conjugate Prior
  • 18 Bayesian Inference for Multivariate Normal Mean Vector
  • 18.1 Bivariate Normal Density
  • Bivariate Normal Density in Matrix Notation
  • 18.2 Multivariate Normal Distribution
  • 18.3 The Posterior Distribution of the Multivariate Normal Mean Vector when Covariance Matrix Is Known
  • A Single Multivariate Normal Observation
  • A Random Sample from the Multivariate Normal Distribution
  • 18.4 Credible Region for Multivariate Normal Mean Vector when Covariance Matrix Is Known
  • Testing Point Hypothesis Using the Credible Region
  • 18.5 Multivariate Normal Distribution with Unknown Covariance Matrix
  • Inverse Wishart Distribution
  • Likelihood for Multivariate Normal Distribution with Unknown Covariance Matrix
  • Finding the Exact Posterior when the Joint Conjugate Prior is Used for all Parameters
  • Computer Exercises
  • 19 Bayesian Inference for the Multiple Linear Regression Model
  • 19.1 Least Squares Regression for Multiple Linear Regression Model
  • 19.2 Assumptions of Normal Multiple Linear Regression Model
  • 19.3 Bayes' Theorem for Normal Multiple Linear Regression Model
  • Likelihood of Single Observation
  • Likelihood of a Random Sample of Observations
  • Finding the Posterior when a Multivariate Continuous Prior is Used
  • Finding the Posterior when a Multivariate Flat Prior is Used
  • Finding the Posterior when a Multivariate Normal Prior is Used
  • 19.4 Inference in the Multivariate Normal Linear Regression Model
  • Inference on a Single Slope Parameter
  • Inference for the Vector of all Slopes
  • Credible Region for all the Slopes
  • Testing a Point Hypothesis about all the Slopes
  • Modeling Issues: Removing Unnecessary Variables
  • 19.5 The Predictive Distribution for a Future Observation
  • Computer Exercises
  • 20 Computational Bayesian Statistics Including Markov Chain Monte Carlo
  • 20.1 Direct Methods for Sampling from the Posterior
  • Inverse Probability Sampling
  • Acceptance-Rejection Sampling
  • Adaptive-Rejection Sampling
  • 20.2 Sampling-Importance-Resampling
  • 20.3 Markov Chain Monte Carlo Methods
  • Markov Chains
  • Metropolis-Hastings Algorithm for a Single Parameter
  • Gibbs Sampling
  • 20.4 Slice Sampling
  • 20.5 Inference from a Posterior Random Sample
  • Posterior Inference from Samples Taken Using Markov Chains
  • 20.6 Where to Next?
  • A Introduction to Calculus
  • B Use of Statistical Tables
  • C Using the Included Minitab Macros
  • D Using the Included R Functions
  • E Answers to Selected Exercises
  • References
  • Index

File format: PDF
Copy protection: Adobe DRM (Digital Rights Management)

System requirements:

Computer (Windows; Mac OS X; Linux): Install the free Adobe Digital Editions software before downloading (see e-book help).

Tablet/smartphone (Android; iOS): Install the free Adobe Digital Editions app before downloading (see e-book help).

E-book readers: Bookeen, Kobo, PocketBook, Sony, Tolino, and many more (not Kindle)

The PDF format displays every book page identically on any hardware, so PDF is well suited to the complex layouts used in textbooks and technical books (images, tables, columns, footnotes). On the small displays of e-readers and smartphones, PDFs can be tedious to read because they require a lot of scrolling. Adobe DRM is a "hard" form of copy protection: if the necessary requirements are not met, you will not be able to open the e-book, so prepare your reading hardware before downloading.

Further information can be found in our e-book help.


Download (available immediately)

€120.99
incl. 19% VAT
Download / single-user license
PDF with Adobe DRM
see system requirements
Order e-book
