Bayesian Methods for the Physical Sciences : Learning from Examples in Astronomy and Physics
Title
Bayesian Methods for the Physical Sciences
Title remainder
Learning from Examples in Astronomy and Physics
Language
eng
Summary
Statistical literacy is critical for the modern researcher in Physics and Astronomy. This book empowers researchers in these disciplines by providing the tools they will need to analyze their own data. Chapters in the book provide a statistical base from which to approach new problems, including numerical advice and a profusion of examples. The examples are engaging analyses of real-world problems taken from modern astronomical research, and they are intended as starting points for readers as they learn to approach their own data and research questions. Acknowledging that scientific progress now hinges on the availability of data and on the possibility of improving previous analyses, data and code are distributed throughout the book. The JAGS symbolic language used throughout the book makes it easy to perform Bayesian analysis and is particularly valuable because readers can apply it to a myriad of scenarios through slight modifications. "This book is comprehensive, well written, and will surely be regarded as a standard text in both astrostatistics and physical statistics." (Joseph M. Hilbe, President, International Astrostatistics Association, Professor Emeritus, University of Hawaii, and Adjunct Professor of Statistics, Arizona State University)
Cataloging source
MiAaPQ
Literary form
non fiction
Nature of contents
dictionaries
Series statement
Springer Series in Astrostatistics Ser.
Series volume
v.4
Link
http://libproxy.rpi.edu/login?url=https://ebookcentral.proquest.com/lib/rpi/detail.action?docID=2095765
Carrier category
online resource
Carrier category code
cr
Carrier MARC source
rdacarrier
Color
multicolored
Content category
text
Content type code
txt
Content type MARC source
rdacontent
Contents
  • Preface -- Contents -- 1 Recipes for a Good Statistical Analysis -- 2 A Bit of Theory -- 2.1 Axiom 1: Probabilities Are in the Range Zero to One -- 2.2 Axiom 2: When a Probability Is Either Zero or One -- 2.3 Axiom 3: The Sum, or Marginalization, Axiom -- 2.4 Product Rule -- 2.5 Bayes Theorem -- 2.6 Error Propagation -- 2.7 Bringing It All Home -- 2.8 Profiling Is Not Marginalization -- 2.9 Exercises -- References -- 3 A Bit of Numerical Computation -- 3.1 Some Technicalities -- 3.2 How to Sample from a Generic Function -- References -- 4 Single Parameter Models -- 4.1 Step-by-Step Guide for Building a Basic Model -- 4.1.1 A Little Bit of (Science) Background -- 4.1.2 Bayesian Model Specification -- 4.1.3 Obtaining the Posterior Distribution -- 4.1.4 Bayesian Point and Interval Estimation -- 4.1.5 Checking Chain Convergence -- 4.1.6 Model Checking and Sensitivity Analysis -- 4.1.7 Comparison with Older Analyses -- 4.2 Other Useful Distributions with One Parameter -- 4.2.1 Measuring a Rate: Poisson -- 4.2.2 Combining Two or More (Poisson) Measurements -- 4.2.3 Measuring a Fraction: Binomial -- 4.3 Exercises -- References -- 5 The Prior -- 5.1 Conclusions Depend on the Prior … -- 5.1.1 … Sometimes a Lot: The Malmquist-Eddington Bias -- 5.1.2 … by Lower Amounts with Increasing Data Quality -- 5.1.3 … but Eventually Becomes Negligible -- 5.1.4 … and the Precise Shape of the Prior Often Does Not Matter -- 5.2 Where to Find Priors -- 5.3 Why There Are So Many Uniform Priors in this Book? -- 5.4 Other Examples on the Influence of Priors on Conclusions -- 5.4.1 The Important Role of the Prior in the Determination of the Mass of the Most Distant Known Galaxy Cluster -- 5.4.2 The Importance of Population Gradients for Photometric Redshifts -- 5.5 Exercises -- References -- 6 Multi-parameters Models -- 6.1 Common Simple Problems -- 6.1.1 Location and Spread
  • 6.1.2 The Source Intensity in the Presence of a Background -- 6.1.3 Estimating a Fraction in the Presence of a Background -- 6.1.4 Spectral Slope: Hardness Ratio -- 6.1.5 Spectral Shape -- 6.2 Mixtures -- 6.2.1 Modeling a Bimodal Distribution: The Case of Globular Cluster Metallicity -- 6.2.2 Average of Incompatible Measurements -- 6.3 Advanced Analysis -- 6.3.1 Source Intensity with Over-Poisson Background Fluctuations -- 6.3.2 The Cosmological Mass Fraction Derived from the Cluster's Baryon Fraction -- 6.3.3 Light Concentration in the Presence of a Background -- 6.3.4 A Complex Background Modeling for Geo-Neutrinos -- 6.3.4.1 An Initial Modeling of the Background -- 6.3.4.2 Discriminating Natural from Human-Induced Neutrinos -- 6.3.4.3 Improving Detection of Geo-Neutrinos -- 6.3.4.4 Concluding Remarks -- 6.3.5 Upper Limits from Counting Experiments -- 6.3.5.1 Zero Observed Events -- 6.3.5.2 Non-zero Events -- 6.4 Exercises -- References -- 7 Non-random Data Collection -- 7.1 The General Case -- 7.2 Sharp Selection on the Value -- 7.3 Sharp Selection on the Value, Mixture of Gaussians: Measuring the Gravitational Redshift -- 7.4 Sharp Selection on the True Value -- 7.5 Probabilistic Selection on the True Value -- 7.6 Sharp Selection on the Observed Value, Mixture of Gaussians -- 7.7 Numerical Implementation of the Models -- 7.7.1 Sharp Selection on the Value -- 7.7.2 Sharp Selection on the True Value -- 7.7.3 Probabilistic Selection on the True Value -- 7.7.4 Sharp Selection on the Observed Value, Mixture of Gaussians -- 7.8 Final Remarks -- Reference -- 8 Fitting Regression Models -- 8.1 Clearing Up Some Misconceptions -- 8.1.1 Pay Attention to Selection Effects -- 8.1.2 Avoid Fishing Expeditions -- 8.1.3 Do Not Confuse Prediction with Parameter Estimation -- 8.1.3.1 Prediction and Parameter Estimation Differ!
  • 8.1.3.2 Direct and Inverse Relations also Differ -- 8.1.3.3 Summary -- 8.2 Non-linear Fit with No Error on Predictor and No Spread: Efficiency and Completeness -- 8.3 Fit with Spread and No Errors on Predictor: Varying Physical Constants? -- 8.4 Fit with Errors and Spread: The Magorrian Relation -- 8.5 Fit with More Than One Predictor and a Complex Link: Star Formation Quenching -- 8.6 Fit with Upper and Lower Limits: The Optical-to-X Flux Ratio -- 8.7 Fit with An Important Data Structure: The Mass-Richness Scaling -- 8.8 Fit with a Non-ignorable Data Collection -- 8.9 Fit Without Anxiety About Non-random Data Collection -- 8.10 Prediction -- 8.11 A Meta-Analysis: Combined Fit of Regressions with Different Intrinsic Scatter -- 8.12 Advanced Analysis -- 8.12.1 Cosmological Parameters from SNIa -- 8.12.2 The Enrichment History of the ICM -- 8.12.2.1 Enrichment History -- 8.12.2.2 Intrinsic Scatter -- 8.12.2.3 Controlling for Temperature T -- 8.12.2.4 Abundances Systematics -- 8.12.2.5 T and Fe Abundance Likelihood -- 8.12.2.6 Priors -- 8.12.2.7 Results -- 8.12.3 The Enrichment History After Binning by Redshift -- 8.12.4 With An Over-Poisson Spread -- 8.13 Exercises -- References -- 9 Model Checking and Sensitivity Analysis -- 9.1 Sensitivity Analysis -- 9.1.1 Check Alternative Prior Distributions -- 9.1.2 Check Alternative Link Functions -- 9.1.3 Check Alternative Distributional Assumptions -- 9.1.4 Prior Sensitivity Summary -- 9.2 Model Checking -- 9.2.1 Overview -- 9.2.2 Start Simple: Visual Inspection of Real and Simulated Data and of Their Summaries -- 9.2.3 A Deeper Exploration: Using Measures of Discrepancy -- 9.2.4 Another Deep Exploration -- 9.3 Summary -- References -- 10 Bayesian vs Simple Methods -- 10.1 Conceptual Differences -- 10.2 Maximum Likelihood -- 10.2.1 Average vs. Maximum Likelihood -- 10.2.2 Small Samples
  • 10.3 Robust Estimates of Location and Scale -- 10.3.1 Bayes Has a Lower Bias -- 10.3.2 Bayes Is Fairer and Has Less Noisy Errors -- 10.4 Comparison of Fitting Methods -- 10.4.1 Fitting Methods Generalities -- 10.4.2 Regressions Without Intrinsic Scatter -- 10.4.2.1 Preamble: Restating the Obvious -- 10.4.2.2 Testing How Fitting Models Perform for a Regression Without Intrinsic Scatter -- 10.4.3 One More Comparison, with Different Data Structures -- 10.5 Summary and Experience of a Former Non-Bayesian Astronomer -- References -- Appendix A Probability Distributions -- A.1 Discrete Distributions -- A.1.1 Bernoulli -- A.1.2 Binomial -- A.1.3 Poisson -- A.2 Continuous Distributions -- A.2.1 Gaussian or Normal -- A.2.2 Beta -- A.2.3 Exponential -- A.2.4 Gamma and Schechter -- A.2.5 Lognormal -- A.2.6 Pareto or Power Law -- A.2.7 Central Student-t -- A.2.8 Uniform -- A.2.9 Weibull -- Appendix B The Third Axiom of Probability, Conditional Probability, Independence and Conditional Independence -- B.1 The Third Axiom of Probability -- B.2 Conditional Probability -- B.3 Independence and Conditional Independence
Dimensions
unknown
Discovery link
http://opac.lib.rpi.edu/record=b4383236
Extent
1 online resource (245 pages)
Form of item
online
Isbn
9783319152875
Media category
computer
Media MARC source
rdamedia
Media type code
c
Sound
unknown sound
Specific material designation
remote

Library Locations

    • Folsom Library
      110 8th St, Troy, NY, 12180, US
      42.729766 -73.682577