Statistical Methods for Data Analysis

With Applications in Particle Physics

Paperback | English | 2023 | 3rd edition | 9783031199332
€ 84,99
Delivery time approximately 9 working days
Free shipping

Summary

This third edition expands on the original material. Large portions of the text have been reviewed and clarified. More emphasis is devoted to machine learning, including more modern concepts and examples. This book provides the reader with the main concepts and tools needed to perform statistical analyses of experimental data, in particular in the field of high-energy physics (HEP).

It starts with an introduction to probability theory and basic statistics, mainly intended as a refresher of readers’ advanced undergraduate studies, but also to help them clearly distinguish between the Frequentist and Bayesian approaches and interpretations in subsequent applications. Next, the author discusses Monte Carlo methods, with emphasis on techniques like Markov Chain Monte Carlo, and the combination of measurements, introducing the best linear unbiased estimator. More advanced concepts and applications are gradually presented, including unfolding and regularization procedures, culminating in the chapter devoted to discoveries and upper limits.
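As an illustration of the Markov Chain Monte Carlo techniques mentioned above, the following is a minimal random-walk Metropolis sampler for a standard Gaussian target. This sketch is not taken from the book; the function name, step size, and seed are illustrative choices.

```python
import math
import random

def metropolis(n_samples, step=1.0, seed=42):
    """Random-walk Metropolis chain targeting a standard Gaussian."""
    rng = random.Random(seed)
    log_p = lambda v: -0.5 * v * v   # log of target density, up to a constant
    x, samples = 0.0, []
    for _ in range(n_samples):
        x_prop = x + rng.uniform(-step, step)       # symmetric proposal
        delta = log_p(x_prop) - log_p(x)
        if delta >= 0 or rng.random() < math.exp(delta):
            x = x_prop                              # accept the move
        samples.append(x)                           # else keep the old state
    return samples

chain = metropolis(100_000)
mean = sum(chain) / len(chain)
var = sum((v - mean) ** 2 for v in chain) / len(chain)
```

After enough steps the chain samples the target distribution, so the empirical mean and variance approach 0 and 1; in practice one would also discard a burn-in period and monitor autocorrelation.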

The reader learns through many applications in HEP, where hypothesis testing plays a major role; calculations of the look-elsewhere effect are also presented. Many worked-out examples help newcomers to the field and graduate students alike understand the pitfalls involved in applying theoretical concepts to actual data.
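A small numerical example of the hypothesis-testing language used above: converting a one-sided p-value into a Gaussian significance Z, the "number of sigmas" quoted in HEP discovery claims. This is an illustrative sketch, not code from the book; the inversion is done by simple bisection.

```python
import math

def p_value_to_significance(p):
    """Convert a one-sided p-value to a Gaussian significance Z (in sigma)."""
    # Z solves p = 1 - Phi(Z); the tail probability is erfc(Z/sqrt(2))/2,
    # which decreases in Z, so we can invert it by bisection.
    lo, hi = 0.0, 40.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if 0.5 * math.erfc(mid / math.sqrt(2.0)) > p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# The conventional "5 sigma" discovery threshold corresponds to a
# one-sided p-value of about 2.87e-7.
z_discovery = p_value_to_significance(2.866516e-7)
```

The same conversion is typically done with `scipy.stats.norm.isf(p)` when SciPy is available; the bisection above keeps the example dependency-free.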

Specifications

ISBN-13: 9783031199332
Language: English
Binding: paperback
Publisher: Springer International Publishing
Edition: 3


Table of Contents

Preface to the Third Edition
Preface to Previous Editions

1 Probability Theory
1.1 Why Probability Matters to a Physicist
1.2 The Concept of Probability
1.3 Repeatable and Non-Repeatable Cases
1.4 Different Approaches to Probability
1.5 Classical Probability
1.6 Generalization to the Continuum
1.7 Axiomatic Probability Definition
1.8 Probability Distributions
1.9 Conditional Probability
1.10 Independent Events
1.11 Law of Total Probability
1.12 Statistical Indicators: Average, Variance and Covariance
1.13 Statistical Indicators for a Finite Sample
1.14 Transformations of Variables
1.15 The Law of Large Numbers
1.16 Frequentist Definition of Probability
References

2 Discrete Probability Distributions
2.1 The Bernoulli Distribution
2.2 The Binomial Distribution
2.3 The Multinomial Distribution
2.4 The Poisson Distribution
References

3 Probability Distribution Functions
3.1 Introduction
3.2 Definition of Probability Distribution Function
3.3 Average and Variance in the Continuous Case
3.4 Mode, Median, Quantiles
3.5 Cumulative Distribution
3.6 Continuous Transformations of Variables
3.7 Uniform Distribution
3.8 Gaussian Distribution
3.9 χ² Distribution
3.10 Log Normal Distribution
3.11 Exponential Distribution
3.12 Other Distributions Useful in Physics
3.13 Central Limit Theorem
3.14 Probability Distribution Functions in More than One Dimension
3.15 Gaussian Distributions in Two or More Dimensions
References

4 Bayesian Approach to Probability
4.1 Introduction
4.2 Bayes’ Theorem
4.3 Bayesian Probability Definition
4.4 Bayesian Probability and Likelihood Functions
4.5 Bayesian Inference
4.6 Bayes Factors
4.7 Subjectiveness and Prior Choice
4.8 Jeffreys’ Prior
4.9 Reference Priors
4.10 Improper Priors
4.11 Transformations of Variables and Error Propagation
References

5 Random Numbers and Monte Carlo Methods
5.1 Pseudorandom Numbers
5.2 Pseudorandom Generator Properties
5.3 Uniform Random Number Generators
5.4 Discrete Random Number Generators
5.5 Nonuniform Random Number Generators
5.6 Monte Carlo Sampling
5.7 Numerical Integration with Monte Carlo Methods
5.8 Markov Chain Monte Carlo
References

6 Parameter Estimate
6.1 Introduction
6.2 Inference
6.3 Parameters of Interest
6.4 Nuisance Parameters
6.5 Measurements and Their Uncertainties
6.6 Frequentist vs Bayesian Inference
6.7 Estimators
6.8 Properties of Estimators
6.9 Binomial Distribution for Efficiency Estimate
6.10 Maximum Likelihood Method
6.11 Errors with the Maximum Likelihood Method
6.12 Minimum χ² and Least-Squares Methods
6.13 Binned Data Samples
6.14 Error Propagation
6.15 Treatment of Asymmetric Errors
References

7 Combining Measurements
7.1 Introduction
7.2 Simultaneous Fits and Control Regions
7.3 Weighted Average
7.4 χ² in n Dimensions
7.5 The Best Linear Unbiased Estimator
References

8 Confidence Intervals
8.1 Introduction
8.2 Neyman Confidence Intervals
8.3 Binomial Intervals
8.4 The Flip-Flopping Problem
8.5 The Unified Feldman–Cousins Approach
References

9 Convolution and Unfolding
9.1 Introduction
9.2 Convolution
9.3 Unfolding by Inversion of the Response Matrix
9.4 Bin-by-Bin Correction Factors
9.5 Regularized Unfolding
9.6 Iterative Unfolding
9.7 Other Unfolding Methods
9.8 Software Implementations
9.9 Unfolding in More Dimensions
References

10 Hypothesis Tests
10.1 Introduction
10.2 Test Statistic
10.3 Type I and Type II Errors
10.4 Fisher’s Linear Discriminant
10.5 The Neyman–Pearson Lemma
10.6 Projective Likelihood Ratio Discriminant
10.7 Kolmogorov–Smirnov Test
10.8 Wilks’ Theorem
10.9 Likelihood Ratio in the Search for a New Signal
References

11 Machine Learning
11.1 Supervised and Unsupervised Learning
11.2 Terminology
11.3 Machine Learning Classification from a Statistical Point of View
11.4 Bias-Variance Tradeoff
11.5 Overtraining
11.6 Artificial Neural Networks
11.7 Deep Learning
11.8 Convolutional Neural Networks
11.9 Boosted Decision Trees
11.10 Multivariate Analysis Implementations
References

12 Discoveries and Upper Limits
12.1 Searches for New Phenomena: Discovery and Upper Limits
12.2 Claiming a Discovery
12.3 Excluding a Signal Hypothesis
12.4 Combined Measurements and Likelihood Ratio
12.5 Definitions of Upper Limit
12.6 Bayesian Approach
12.7 Frequentist Upper Limits
12.8 Modified Frequentist Approach: the CLs Method
12.9 Presenting Upper Limits: the Brazil Plot
12.10 Nuisance Parameters and Systematic Uncertainties
12.11 Upper Limits Using the Profile Likelihood
12.12 Variations of the Profile-Likelihood Test Statistic
12.13 The Look Elsewhere Effect
References

Index
