Engineering Biostatistics

An Introduction using MATLAB and WinBUGS
 
 
John Wiley & Sons Inc (publisher)
  • published October 17, 2017
  • 984 pages

E-book | PDF with Adobe DRM | system requirements
978-1-119-16899-7 (ISBN)
 
Provides a one-stop resource for engineers learning biostatistics using MATLAB® and WinBUGS
Through its scope and depth of coverage, this book addresses the needs of the vibrant and rapidly growing bio-oriented engineering fields while implementing software packages that are familiar to engineers. The book is heavily oriented toward computation and hands-on approaches so that readers understand each step of the programming. Another dimension of this book is its parallel coverage of Bayesian and frequentist approaches to statistical inference. It avoids taking sides in the classical vs. Bayesian debate, and many examples in this book are solved using both methods. The results are then compared and commented upon. Readers have the choice of MATLAB® for classical data analysis and WinBUGS/OpenBUGS for Bayesian data analysis. Every chapter starts with a box highlighting what is covered in that chapter and ends with exercises, a list of software scripts, datasets, and references.
Engineering Biostatistics: An Introduction using MATLAB® and WinBUGS also includes:
* parallel coverage of classical and Bayesian approaches, where appropriate
* substantial coverage of Bayesian approaches to statistical inference
* material that has been classroom-tested in an introductory statistics course in bioengineering over several years
* exercises at the end of each chapter and an accompanying website with full solutions and hints to some exercises, as well as additional materials and examples
Engineering Biostatistics: An Introduction using MATLAB® and WinBUGS can serve as a textbook for introductory-to-intermediate applied statistics courses, as well as a useful reference for engineers interested in biostatistical approaches.
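To give a flavor of the parallel classical/Bayesian treatment described above, the following minimal MATLAB sketch (not taken from the book; it uses only base MATLAB and hypothetical data) compares a classical 95% confidence interval with a conjugate Bayesian 95% credible set for a binomial proportion:

    % Minimal illustration (not from the book): classical vs. Bayesian
    % inference for a binomial proportion, using only base MATLAB.
    x = 14;  n = 20;                  % hypothetical data: 14 successes in 20 trials
    phat = x / n;                     % classical point estimate

    % Classical 95% Wald confidence interval (z = 1.96 hard-coded to avoid toolboxes)
    se = sqrt(phat * (1 - phat) / n);
    ciClassical = [phat - 1.96 * se, phat + 1.96 * se];

    % Bayesian analysis with a conjugate Beta(1,1) (uniform) prior:
    % the posterior is Beta(x + 1, n - x + 1); take the 95% equal-tail credible set
    a = x + 1;  b = n - x + 1;
    postMean = a / (a + b);
    csBayes  = [betaincinv(0.025, a, b), betaincinv(0.975, a, b)];

    fprintf('Classical: %.3f, 95%% CI (%.3f, %.3f)\n', phat, ciClassical);
    fprintf('Bayesian : %.3f, 95%% CS (%.3f, %.3f)\n', postMean, csBayes);

With a flat prior the two interval estimates come out similar, and comparing them side by side is the kind of exercise the book's examples carry out in MATLAB and WinBUGS.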
1st edition
  • English
  • New York, USA
  • For professional use and research
  • 34.77 MB
978-1-119-16899-7 (ISBN-13)
1119168996 (ISBN-10)
BRANI VIDAKOVIC, PhD, is a Professor in the School of Industrial and Systems Engineering (ISyE) at Georgia Institute of Technology and Department of Biomedical Engineering at Georgia Institute of Technology/Emory University. Dr. Vidakovic is a Fellow of the American Statistical Association, Elected Member of the International Statistical Institute, an Editor-in-Chief of Encyclopedia of Statistical Sciences, Second Edition, and former and current Associate Editor of several leading journals in the field of statistics.
  • "ENGINEERING BIOSTATISTICS"
  • "Contents"
  • "Preface"
  • "1 Introduction "
  • "Chapter References"
  • "2 The Sample and Its Properties "
  • "2.1 Introduction "
  • "2.2 A MATLAB Session on Univariate Descriptive Statistics "
  • "2.3 Location Measures "
  • "2.4 Variability Measures "
  • "2.5 Ranks "
  • "2.6 Displaying Data "
  • "2.7 Multidimensional Samples: Fisherâ??s Iris Data and Body Fat Data "
  • "2.8 Multivariate Samples and Their Summaries* "
  • "2.9 Principal Components of Data "
  • "2.10 Visualizing Multivariate Data "
  • "2.11 Observations as Time Series "
  • "2.12 About Data Types "
  • "2.13 Big Data Paradigm "
  • "2.14 Exercises "
  • "Chapter References"
  • "3 Probability, Conditional Probability, and Bayesâ?? Rule "
  • "3.1 Introduction "
  • "3.2 Events and Probability "
  • "3.3 Odds "
  • "3.4 Venn Diagrams* "
  • "3.5 Counting Principles* "
  • "3.6 Conditional Probability and Independence of Events "
  • "3.6.1 Conditioning and Product Rule "
  • "3.6.2 Pairwise and Global Independence "
  • "3.7 Total Probability "
  • "3.8 Reassessing Probabilities: Bayesâ?? Rule "
  • "3.9 Bayesian Networks* "
  • "3.10 Exercises "
  • "Chapter References"
  • "4 Sensitivity, Specificity, and Relatives "
  • "4.1 Introduction "
  • "4.2 Notation "
  • "4.3 Combining Two or More Tests "
  • "4.4 ROC Curves "
  • "4.5 Exercises "
  • "Chapter References"
  • "5 Random Variables "
  • "5.1 Introduction "
  • "5.2 Discrete Random Variables "
  • "5.3 Some Standard Discrete Distributions "
  • "5.3.1 Discrete Uniform Distribution "
  • "5.3.2 Bernoulli and Binomial Distributions "
  • "5.3.3 Hypergeometric Distribution "
  • "5.3.4 Poisson Distribution "
  • "5.3.5 Geometric Distribution "
  • "5.3.6 Negative Binomial Distribution "
  • "5.3.7 Multinomial Distribution "
  • "5.3.8 Quantiles "
  • "5.4 Continuous Random Variables "
  • "5.4.1 Joint Distribution of Two Continuous Random Variables "
  • "5.4.2 Conditional Expectation* "
  • "5.5 Some Standard Continuous Distributions "
  • "5.5.1 Uniform Distribution "
  • "5.5.2 Exponential Distribution "
  • "5.5.3 Normal Distribution "
  • "5.5.4 Gamma Distribution "
  • "5.5.5 Inverse Gamma Distribution "
  • "5.5.6 Beta Distribution "
  • "5.5.7 Double Exponential Distribution "
  • "5.5.8 Logistic Distribution "
  • "5.5.9 Weibull Distribution "
  • "5.5.10 Pareto Distribution "
  • "5.5.11 Dirichlet Distribution "
  • "5.6 Random Numbers and Probability Tables "
  • "5.7 Transformations of Random Variables* "
  • "5.8 Mixtures* "
  • "5.9 Markov Chains* "
  • "5.10 Exercises "
  • "Chapter References"
  • "6 Normal Distribution "
  • "6.1 Introduction "
  • "6.2 Normal Distribution "
  • "6.2.1 Sigma Rules "
  • "6.2.2 Bivariate Normal Distribution* "
  • "6.3 Examples with a Normal Distribution "
  • "6.4 Combining Normal Random Variables "
  • "6.5 Central Limit Theorem "
  • "6.6 Distributions Related to Normal "
  • "6.6.1 Chi-square Distribution "
  • "6.6.2 t-Distribution "
  • "6.6.3 Cauchy Distribution "
  • "6.6.4 F-Distribution "
  • "6.6.5 Noncentral Ï?2, t, and F Distributions"
  • "6.6.6 Lognormal Distribution "
  • "6.7 Delta Method and Variance-Stabilizing Transformations* "
  • "6.8 Exercises "
  • "Chapter References"
  • "7 Point and Interval Estimators "
  • "7.1 Introduction "
  • "7.2 Moment-Matching and Maximum Likelihood Estimators "
  • "7.3 Unbiasedness and Consistency of Estimators "
  • "7.4 Estimation of a Mean, Variance, and Proportion "
  • "7.4.1 Point Estimation of Mean "
  • "7.4.2 Point Estimation of Variance "
  • "7.4.3 Point Estimation of Population Proportion "
  • "7.5 Confidence Intervals "
  • "7.5.1 Confidence Intervals for the Normal Mean "
  • "7.5.2 Confidence Interval for the Normal Variance "
  • "7.5.3 Confidence Intervals for the Population Proportion "
  • "7.5.4 Confidence Intervals for Proportions When X = 0 "
  • "7.5.5 Designing the Sample Size with Confidence Intervals"
  • "7.6 Prediction and Tolerance Intervals* "
  • "7.7 Confidence Intervals for Quantiles* "
  • "7.8 Confidence Intervals for the Poisson Rate* "
  • "7.9 Exercises "
  • "Chapter References"
  • "8 Bayesian Approach to Inference "
  • "8.1 Introduction "
  • "8.2 Ingredients for Bayesian Inference "
  • "8.3 Conjugate Priors "
  • "8.4 Point Estimation "
  • "8.5 Prior Elicitation "
  • "8.6 Bayesian Computation and Use of WinBUGS "
  • "8.7 Bayesian Interval Estimation: Credible Sets "
  • "8.8 Learning by Bayesâ?? Theorem "
  • "8.9 Bayesian Prediction "
  • "8.10 Consensus Means* "
  • "8.11 Exercises "
  • "Chapter References"
  • "9 Testing Statistical Hypotheses "
  • "9.1 Introduction "
  • "9.2 Classical Testing Problem "
  • "9.2.1 Choice of Null Hypothesis "
  • "9.2.2 Test Statistic, Rejection Regions, Decisions, and Errors in Testing "
  • "9.2.3 Power of the Test "
  • "9.2.4 Fisherian Approach: p-Values "
  • "9.3 Bayesian Approach to Testing "
  • "9.4 Criticism and Calibration of p-Values* "
  • "9.5 Testing the Normal Mean "
  • "9.5.1 z-Test "
  • "9.5.2 Power Analysis of a z-Test "
  • "9.5.3 Testing a Normal Mean When the Variance Is Not Known: t-Test "
  • "9.5.4 Power Analysis of a t-Test "
  • "9.6 Testing the Multivariate Normal Mean* "
  • "9.6.1 T-Square Test "
  • "9.6.2 Test for Symmetry "
  • "9.7 Testing the Normal Variances "
  • "9.8 Testing the Proportion "
  • "9.8.1 Exact Test for Population Proportions "
  • "9.8.2 Bayesian Test for Population Proportions "
  • "9.9 Multiplicity in Testing, Bonferroni Correction, and False Discovery Rate "
  • "9.10 Exercises "
  • "Chapter References"
  • "10 Two Samples "
  • "10.1 Introduction "
  • "10.2 Means and Variances in Two Independent Normal Populations "
  • "10.2.1 Confidence Interval for the Difference of Means "
  • "10.2.2 Power Analysis for Testing Two Means "
  • "10.2.3 More Complex Two-Sample Designs "
  • "10.2.4 A Bayesian Test for Two Normal Means "
  • "10.3 Testing the Equality of Normal Means When Samples Are Paired "
  • "10.3.1 Sample Size in Paired t-Test "
  • "10.3.2 Difference-in-Differences (DiD) Tests "
  • "10.4 Two Multivariate Normal Means* "
  • "10.4.1 Confidence Intervals for Arbitrary Linear Combinations of Mean Differences "
  • "10.4.2 Profile Analysis With Two Independent Groups* "
  • "10.4.3 Paired Multivariate Samples* "
  • "10.5 Two Normal Variances "
  • "10.6 Comparing Two Proportions "
  • "10.7 Risk Differences, Risk Ratios, and Odds Ratios "
  • "10.7.1 Risk Differences "
  • "10.7.2 Risk Ratio "
  • "10.7.3 Odds Ratios "
  • "10.7.4 Two Proportions from a Single Sample "
  • "10.8 Two Poisson Rates* "
  • "10.9 Equivalence Tests* "
  • "10.10 Exercises "
  • "Chapter References"
  • "11 ANOVA and Elements of Experimental Design "
  • "11.1 Introduction "
  • "11.2 One-Way ANOVA "
  • "11.2.1 ANOVA Table and Rationale for F-Test "
  • "11.2.2 Testing the Assumption of Equal Population Variances"
  • "11.2.3 The Null Hypothesis Is Rejected. What Next? "
  • "11.2.4 Bayesian Solution "
  • "11.2.5 Fixed- and Random-Effect ANOVA "
  • "11.3 Welchâ??s ANOVA* "
  • "11.4 Two-Way ANOVA and Factorial Designs "
  • "11.4.1 Two-Way ANOVA: One Observation per Cell "
  • "11.5 Blocking "
  • "11.6 Repeated Measures Design "
  • "11.6.1 ANOVA Table for Repeated Measures "
  • "11.6.2 Sphericity Tests "
  • "11.7 Nested Designs* "
  • "11.8 Power Analysis in ANOVA "
  • "11.9 Functional ANOVA* "
  • "11.10 Analysis of Means (ANOM)* "
  • "11.11 Gauge R&R ANOVA* "
  • "11.12 Testing Equality of Several Proportions "
  • "11.13 Testing the Equality of Several Poisson Means* "
  • "11.14 Exercises "
  • "Chapter References"
  • "12 Models for Tables "
  • "12.1 Introduction "
  • "12.2 Contingency Tables: Testing for Independence "
  • "12.2.1 Measuring Association in Contingency Tables "
  • "12.2.2 Power Analysis for Contingency Tables "
  • "12.2.3 Cohenâ??s Kappa "
  • "12.3 Three-Way Tables "
  • "12.4 Fisherâ??s Exact Test "
  • "12.5 Stratified Tables: Mantelâ??Haenszel Test "
  • "12.5.1 Testing Conditional Independence or Homogeneity"
  • "12.5.2 Odds Ratio from Stratified Tables "
  • "12.6 Paired Tables: McNemarâ??s Test "
  • "12.7 Risk Differences, Risk Ratios, and Odds Ratios for Paired Tables "
  • "12.7.1 Risk Differences "
  • "12.7.2 Risk Ratios "
  • "12.7.3 Odds Ratios "
  • "12.7.4 Liddellâ??s Procedure "
  • "12.7.5 Garth Test* "
  • "12.7.6 Stuartâ??Maxwell Test* "
  • "12.7.7 Cochranâ??s Q Test* "
  • "12.8 Exercises "
  • "Chapter References"
  • "13 Correlation "
  • "13.1 Introduction "
  • "13.2 The Pearson Coefficient of Correlation "
  • "13.2.1 Inference About Ï? "
  • "13.2.2 Bayesian Inference for Correlation Coefficients "
  • "13.3 Spearmanâ??s Coefficient of Correlation "
  • "13.4 Kendallâ??s Tau "
  • "13.5 Cum hoc ergo propter hoc "
  • "13.6 Exercises "
  • "Chapter References"
  • "14 Regression "
  • "14.1 Introduction "
  • "14.2 Simple Linear Regression "
  • "14.3 Inference in Simple Linear Regression "
  • "14.3.1 Inference about the Slope Parameter "
  • "14.3.2 Inference about the Intercept Parameter "
  • "14.3.3 Inference about the Variance "
  • "14.3.4 Inference about the Mean Response "
  • "14.3.5 Inference about a New Response "
  • "14.4 Calibration "
  • "14.5 Testing the Equality of Two Slopes* "
  • "14.6 Multiple Regression"
  • "14.6.1 Matrix Notation"
  • "14.6.2 Sums of Squares and an ANOVA Table"
  • "14.6.3 Inference About Regression Parameters and Responses"
  • "14.7 Diagnostics in Multiple Regression "
  • "14.7.1 Residual Analysis and Influence "
  • "14.7.2 Multicollinearity "
  • "14.7.3 Variable Selection in Regression"
  • "14.7.4 Bayesian Model Selection in Multiple Regression"
  • "14.8 Sample Size in Regression "
  • "14.9 Linear Regression That Is Nonlinear in Predictors "
  • "14.10 Errors-in-Variables Linear Regression* "
  • "14.11 Analysis of Covariance "
  • "14.11.1 Sample Size in ANCOVA "
  • "14.11.2 Bayesian Approach to ANCOVA "
  • "14.12 Exercises "
  • "Chapter References"
  • "15 Regression for Binary and Count Data "
  • "15.1 Introduction "
  • "15.2 Logistic Regression "
  • "15.2.1 Fitting Logistic Regression "
  • "15.2.2 Assessing the Logistic Regression Fit "
  • "15.2.3 Probit and Complementary Log-Log Links "
  • "15.3 Poisson Regression "
  • "15.4 Log-linear Models "
  • "15.5 Exercises "
  • "Chapter References"
  • "16 Inference for Censored Data and Survival Analysis "
  • "16.1 Introduction "
  • "16.2 Definitions "
  • "16.3 Inference with Censored Observations "
  • "16.3.1 Parametric Approach "
  • "16.3.2 Nonparametric Approach: Kaplanâ??Meier or Productâ??Limit Estimator "
  • "16.3.3 Comparing Survival Curves "
  • "16.4 The Cox Proportional Hazards Model "
  • "16.5 Bayesian Approach "
  • "16.6 Survival Analysis in WinBUGS "
  • "16.7 Exercises "
  • "Chapter References"
  • "17 Goodness-of-Fit Tests "
  • "17.1 Introduction "
  • "17.2 Probability Plots "
  • "17.2.1 Qâ??Q Plots "
  • "17.2.2 Pâ??P Plots "
  • "17.2.3 Poissonness Plots "
  • "17.3 Pearsonâ??s Chi-Square Test "
  • "17.4 Kolmogorovâ??Smirnov Tests "
  • "17.4.1 Kolmogorovâ??s Test "
  • "17.4.2 Smirnovâ??s Test to Compare Two Distributions "
  • "17.5 Cramérâ??von Mises and Watsonâ??s Tests* "
  • "17.6 Rosenblattâ??s Test* "
  • "17.7 Moranâ??s Test* "
  • "17.8 Departures from Normality "
  • "17.9 Ellimination of Unknown Parameters by Transformations "
  • "17.10 Exercises "
  • "Chapter References"
  • "18 Distribution-Free Methods "
  • "18.1 Introduction "
  • "18.2 Sign Test "
  • "18.3 Wilcoxon Signed-Rank Test "
  • "18.4 Wilcoxon Sum-Rank and Mannâ??Whitney Tests "
  • "18.5 Kruskalâ??Wallis Test "
  • "18.6 Friedmanâ??s Test "
  • "18.7 Resampling Methods "
  • "18.7.1 The Jackknife "
  • "18.7.2 Bootstrap "
  • "18.7.3 Bootstrap Versions of Some Popular Tests "
  • "18.7.4 Randomization and Permutation Tests "
  • "18.8 Exercises "
  • "Chapter References"
  • "19 Bayesian Inference Using Gibbs Sampling â?? BUGS Project "
  • "19.1 Introduction "
  • "19.2 Step-by-Step Session "
  • "19.3 Built-in Functions and Common Distributions in WinBUGS"
  • "19.4 MATBUGS: A MATLAB Interface to WinBUGS "
  • "19.5 Exercises "
  • "Chapter References"
  • "Index"

File format: PDF
Copy protection: Adobe DRM (Digital Rights Management)

System requirements:

Computer (Windows; MacOS X; Linux): Install the free Adobe Digital Editions software before downloading (see the e-book help).

Tablet/smartphone (Android; iOS): Install the free Adobe Digital Editions app before downloading (see the e-book help).

E-book readers: Bookeen, Kobo, Pocketbook, Sony, Tolino, and many others (not Kindle)

A PDF displays each book page identically on any hardware, so the format is well suited to complex layouts such as those of textbooks and reference works (figures, tables, columns, footnotes). On the small displays of e-readers or smartphones, PDFs can be tedious because they require a lot of scrolling. Adobe DRM is a "hard" form of copy protection: if the necessary requirements are not met, the e-book cannot be opened, so you must set up your reading hardware before downloading.

Further information is available in our e-book help.


Download (available immediately)

€104.99
incl. 19% VAT
Download / single-user license
PDF with Adobe DRM
see system requirements
