
Statistical Data Analysis

2017/18
London Postgraduate Lectures on Particle Physics;
University of London MSci course PH4515

Glen Cowan
Physics Department
Royal Holloway, University of London
g.cowan@rhul.ac.uk
www.pp.rhul.ac.uk/~cowan

Course web page (moodle links to here):


www.pp.rhul.ac.uk/~cowan/stat_course.html

G. Cowan Statistical Data Analysis / Stat 1 1


Course structure
The main lectures are from 3:00 to 5:00 and will cover
statistical data analysis.
There is no assessed element in computing per se, although the
coursework will use C++.
Through week 6 the hour from 5:00 to 6:00 will be a crash
course in C++ (non-assessed, attend as needed).
From week 7, the hour from 5:00 to 6:00 will be used to discuss
the coursework and go over additional examples.

G. Cowan Statistical Data Analysis / Stat 1 2


Coursework, exams, etc.
9 problem sheets, provisionally due weeks 3 through 11.
Problems will only cover statistical data analysis, but for some
problem sheets you will write simple C++ programs.
Please turn in your problem sheets on paper on Mondays at our
lectures. Please staple the pages and indicate on the sheet your
name, College and degree programme (PhD, MSc, MSci).
In general, email or late submissions are not accepted unless due
to exceptional circumstances agreed with me.
For MSc/MSci students: problem sheets count 20% of the
mark; written exam at end of year (80%).
For PhD students: assessment entirely through coursework; no
material from this course in exam (~early next year).

G. Cowan Statistical Data Analysis / Stat 1 3


Computing
The coursework includes C++ computing in a Linux
environment.
For PhD students, use your own accounts – the usual HEP setup
should be OK.
The computing problems require specific software (ROOT and
its class library) – you cannot just use, e.g., Visual C++.
Therefore MSc/MSci students will get an account on the RHUL
Linux cluster. You then only need to be able to open an
X window on your local machine, from which you log in
remotely to RHUL.
For Mac, install XQuartz from xquartz.macosforge.org and
open a terminal window.
For Windows there are various options, e.g., MobaXterm or Cygwin/X
(see the course page near the bottom, “information on computing”).
G. Cowan Statistical Data Analysis / Stat 1 4
Statistical Data Analysis Outline
1 Probability, Bayes’ theorem
2 Random variables and probability densities
3 Expectation values, error propagation
4 Catalogue of pdfs
5 The Monte Carlo method
6 Statistical tests: general concepts
7 Test statistics, multivariate methods
8 Goodness-of-fit tests
9 Parameter estimation, maximum likelihood
10 More maximum likelihood
11 Method of least squares
12 Interval estimation, setting limits
13 Nuisance parameters, systematic uncertainties
14 Examples of Bayesian approach

G. Cowan Statistical Data Analysis / Stat 1 5


Some statistics books, papers, etc.
G. Cowan, Statistical Data Analysis, Clarendon, Oxford, 1998
R.J. Barlow, Statistics: A Guide to the Use of Statistical Methods in
the Physical Sciences, Wiley, 1989
Ilya Narsky and Frank C. Porter, Statistical Analysis Techniques in
Particle Physics, Wiley, 2014.
L. Lyons, Statistics for Nuclear and Particle Physics, CUP, 1986
F. James, Statistical and Computational Methods in Experimental
Physics, 2nd ed., World Scientific, 2006
S. Brandt, Statistical and Computational Methods in Data
Analysis, Springer, New York, 1998 (with program library on CD)
C. Patrignani et al. (Particle Data Group), Review of Particle
Physics, Chin. Phys. C, 40, 100001 (2016); see also pdg.lbl.gov
sections on probability, statistics, Monte Carlo

G. Cowan Statistical Data Analysis / Stat 1 6


Data analysis in particle physics

Observe events of a certain type.
Measure characteristics of each event (particle momenta,
number of muons, energy of jets, ...).
Theories (e.g. SM) predict distributions of these properties
up to free parameters, e.g., α, G_F, M_Z, α_s, m_H, ...
Some tasks of data analysis:
Estimate (measure) the parameters;
Quantify the uncertainty of the parameter estimates;
Test the extent to which the predictions of a theory
are in agreement with the data.
G. Cowan Statistical Data Analysis / Stat 1 7
Dealing with uncertainty
In particle physics there are various elements of uncertainty:

theory is not deterministic
    quantum mechanics
random measurement errors
    present even without quantum effects
things we could know in principle but don’t
    e.g. from limitations of cost, time, ...

We can quantify the uncertainty using PROBABILITY

G. Cowan Statistical Data Analysis / Stat 1 8


A definition of probability
Consider a set S with subsets A, B, ...

Kolmogorov
axioms (1933)

From these axioms we can derive further properties, e.g.
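
For reference, the three axioms and a few consequences:

1. P(A) ≥ 0 for every subset A of S
2. P(S) = 1
3. If A ∩ B = ∅, then P(A ∪ B) = P(A) + P(B)

e.g. P(Ā) = 1 − P(A),  P(∅) = 0,  P(A ∪ B) = P(A) + P(B) − P(A ∩ B).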

G. Cowan Statistical Data Analysis / Stat 1 9


Conditional probability, independence
Also define conditional probability of A given B (with P(B) ≠ 0):

E.g. rolling dice:

Subsets A, B independent if:

If A, B independent,

N.B. do not confuse with disjoint subsets, i.e.,
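
In symbols (with a fair-die roll as illustration):

P(A|B) = P(A ∩ B) / P(B)
e.g. P(n < 3 | n even) = P(n = 2) / P(n even) = (1/6) / (1/2) = 1/3

A, B independent if P(A ∩ B) = P(A) P(B); then P(A|B) = P(A).
Disjoint instead means A ∩ B = ∅, so that P(A ∩ B) = 0.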

G. Cowan Statistical Data Analysis / Stat 1 10


Interpretation of probability
I. Relative frequency
A, B, ... are outcomes of a repeatable experiment

cf. quantum mechanics, particle scattering, radioactive decay...

II. Subjective probability


A, B, ... are hypotheses (statements that are true or false)

• Both interpretations consistent with Kolmogorov axioms.


• In particle physics frequency interpretation often most useful,
but subjective probability can provide more natural treatment of
non-repeatable phenomena:
systematic uncertainties, probability that Higgs boson exists,...
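
The two notions in symbols:

I.  P(A) = limit as n → ∞ of (number of occurrences of outcome A in n repetitions) / n
II. P(A) = degree of belief that hypothesis A is true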
G. Cowan Statistical Data Analysis / Stat 1 11
Bayes’ theorem
From the definition of conditional probability we have

P(A|B) = P(A ∩ B) / P(B)   and   P(B|A) = P(B ∩ A) / P(A),

but P(A ∩ B) = P(B ∩ A), so

P(A|B) = P(B|A) P(A) / P(B)      (Bayes’ theorem)

First published (posthumously) by the


Reverend Thomas Bayes (1702−1761)

An essay towards solving a problem in the


doctrine of chances, Philos. Trans. R. Soc. 53
(1763) 370; reprinted in Biometrika, 45 (1958) 293.
G. Cowan Statistical Data Analysis / Stat 1 12
The law of total probability

Consider a subset B of the sample space S,
divided into disjoint subsets Ai
such that ∪i Ai = S.


→ law of total probability

Bayes’ theorem becomes
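
Explicitly:

P(B) = Σ_i P(B|A_i) P(A_i)        (law of total probability)

P(A|B) = P(B|A) P(A) / Σ_i P(B|A_i) P(A_i)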

G. Cowan Statistical Data Analysis / Stat 1 13


An example using Bayes’ theorem
Suppose the probability (for anyone) to have a disease D is:
← prior probabilities, i.e.,
before any test carried out

Consider a test for the disease: result is + or -

← probabilities to (in)correctly
identify a person with the disease

← probabilities to (in)correctly
identify a healthy person

Suppose your result is +. How worried should you be?
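
A worked version with illustrative numbers, assumed here so as to reproduce
the 3.2% posterior quoted on the next slide:
P(D) = 0.001, P(no D) = 0.999, P(+|D) = 0.98, P(−|D) = 0.02,
P(+|no D) = 0.03, P(−|no D) = 0.97. Then

P(D|+) = P(+|D) P(D) / [ P(+|D) P(D) + P(+|no D) P(no D) ]
       = (0.98 × 0.001) / (0.98 × 0.001 + 0.03 × 0.999) ≈ 0.032.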


G. Cowan Statistical Data Analysis / Stat 1 14
Bayes’ theorem example (cont.)
The probability to have the disease given a + result is

← posterior probability

i.e. you’re probably OK!


Your viewpoint: my degree of belief that I have the disease is 3.2%.
Your doctor’s viewpoint: 3.2% of people like this have the disease.

G. Cowan Statistical Data Analysis / Stat 1 15


Frequentist Statistics − general philosophy
In frequentist statistics, probabilities are associated only with
the data, i.e., outcomes of repeatable observations (shorthand: ).
Probability = limiting frequency
Probabilities such as
P (Higgs boson exists),
P (0.117 < αs < 0.121),
etc. are either 0 or 1, but we don’t know which.
The tools of frequentist statistics tell us what to expect, under
the assumption of certain probabilities, about hypothetical
repeated observations.
The preferred theories (models, hypotheses, ...) are those for
which our observations would be considered ‘usual’.

G. Cowan Statistical Data Analysis / Stat 1 16


Bayesian Statistics − general philosophy
In Bayesian statistics, use subjective probability for hypotheses:

P(H | data) = P(data | H) π(H) / Σ_i P(data | H_i) π(H_i)

P(data | H) = probability of the data assuming hypothesis H (the likelihood)
π(H) = prior probability, i.e., before seeing the data
P(H | data) = posterior probability, i.e., after seeing the data
The normalization involves a sum over all possible hypotheses.

Bayes’ theorem has an “if-then” character: If your prior


probabilities were π (H), then it says how these probabilities
should change in the light of the data.
No general prescription for priors (subjective!)
G. Cowan Statistical Data Analysis / Stat 1 17
Random variables and probability density functions
A random variable is a numerical characteristic assigned to an
element of the sample space; can be discrete or continuous.
Suppose outcome of experiment is continuous value x

→ f(x) = probability density function (pdf)

x must be somewhere

Or for discrete outcome xi with e.g. i = 1, 2, ... we have

probability mass function

x must take on one of its possible values
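
In symbols:

P(x found in [x, x + dx]) = f(x) dx,   with ∫ f(x) dx = 1 over all x
Discrete case:  P(X = xi) = pi,   with Σ_i pi = 1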

G. Cowan Statistical Data Analysis / Stat 1 18


Cumulative distribution function
Probability to have outcome less than or equal to x is

cumulative distribution function

Alternatively define pdf with
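
Explicitly:

F(x) = ∫ f(x′) dx′ from −∞ to x        (cumulative distribution function)
f(x) = ∂F(x) / ∂x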

G. Cowan Statistical Data Analysis / Stat 1 19


Histograms
pdf = histogram with
infinite data sample,
zero bin width,
normalized to unit area.

G. Cowan Statistical Data Analysis / Stat 1 20


Multivariate distributions
Outcome of experiment characterized by several values, e.g. an
n-component vector, (x1, ..., xn)

joint pdf

Normalization:

G. Cowan Statistical Data Analysis / Stat 1 21


Marginal pdf
Sometimes we want only pdf of
some (or one) of the components:

→ marginal pdf

x1, x2 independent if
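
For two components, e.g.:

f_x1(x1) = ∫ f(x1, x2) dx2        (marginal pdf of x1)

x1, x2 independent if f(x1, x2) = f_x1(x1) f_x2(x2).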

G. Cowan Statistical Data Analysis / Stat 1 22


Marginal pdf (2)

Marginal pdf ~
projection of joint pdf
onto individual axes.

G. Cowan Statistical Data Analysis / Stat 1 23


Conditional pdf
Sometimes we want to consider some components of joint pdf as
constant. Recall conditional probability:

→ conditional pdfs:

Bayes’ theorem becomes:

Recall A, B independent if

→ x, y independent if
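
In symbols, for a joint pdf f(x, y):

h(y|x) = f(x, y) / f_x(x),   g(x|y) = f(x, y) / f_y(y)
Bayes’ theorem: g(x|y) = h(y|x) f_x(x) / f_y(y)
x, y independent if f(x, y) = f_x(x) f_y(y); then h(y|x) = f_y(y) and g(x|y) = f_x(x).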
G. Cowan Statistical Data Analysis / Stat 1 24
Conditional pdfs (2)
E.g. joint pdf f(x,y) used to find conditional pdfs h(y|x1), h(y|x2):

Basically treat some of the r.v.s as constant, then divide the joint
pdf by the marginal pdf of those variables being held constant so
that what is left has correct normalization, e.g.,

G. Cowan Statistical Data Analysis / Stat 1 25


Functions of a random variable
A function of a random variable is itself a random variable.
Suppose x follows a pdf f(x), consider a function a(x).
What is the pdf g(a)?

dS = region of x space for which a is in [a, a + da].
For one-variable case with unique
inverse this is simply
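
That is,

g(a) da = ∫_dS f(x) dx,   and for a unique inverse x(a):   g(a) = f(x(a)) |dx/da|.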

G. Cowan Statistical Data Analysis / Stat 1 26


Functions without unique inverse

If inverse of a(x) not unique,
include all dx intervals in dS
which correspond to da:

Example:
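
That is, sum over all branches of the inverse; a standard illustration
(assumed here) is a = x²:

g(a) = Σ over branches of f(x(a)) |dx/da|
For a = x² (a ≥ 0):   g(a) = [ f(√a) + f(−√a) ] / (2√a)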

G. Cowan Statistical Data Analysis / Stat 1 27


Functions of more than one r.v.
Consider r.v.s and a function

dS = region of x-space between (hyper)surfaces defined by

G. Cowan Statistical Data Analysis / Stat 1 28


Functions of more than one r.v. (2)
Example: r.v.s x, y > 0 follow joint pdf f(x,y),
consider the function z = xy. What is g(z)?
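
The result, for x, y > 0:

g(z) = ∫ f(x, z/x) (1/x) dx,  integrated over 0 < x < ∞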

(Mellin convolution)
G. Cowan Statistical Data Analysis / Stat 1 29
More on transformation of variables
Consider a random vector with joint pdf

Form n linearly independent functions
for which the inverse functions exist.

Then the joint pdf of the vector of functions is

where J is the
Jacobian determinant:
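
That is,

g(y1, ..., yn) = f(x1(y), ..., xn(y)) |J|,   with J = det(∂xi/∂yj)
(the Jacobian determinant of the inverse transformation).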

To obtain the marginal pdf of some of the components, integrate over the unwanted ones.


G. Cowan Statistical Data Analysis / Stat 1 30
Expectation values
Consider continuous r.v. x with pdf f (x).
Define expectation (mean) value as
Notation (often): ~ “centre of gravity” of pdf.
For a function y(x) with pdf g(y),
(equivalent)

Variance:

Notation:

Standard deviation:
σ ~ width of pdf, same units as x.
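
In symbols:

E[x] = ∫ x f(x) dx ≡ µ        (mean, ‘centre of gravity’)
E[y] = ∫ y g(y) dy = ∫ y(x) f(x) dx        (equivalent)
V[x] = E[(x − E[x])²] = E[x²] − µ² ≡ σ²        (variance)
σ = √V[x]        (standard deviation)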
G. Cowan Statistical Data Analysis / Stat 1 31
Covariance and correlation
Define covariance cov[x,y] (also use matrix notation Vxy) as

Correlation coefficient (dimensionless) defined as

If x, y independent, i.e., f(x, y) = fx(x) fy(y), then

→ x and y, ‘uncorrelated’

N.B. converse not always true.
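
Explicitly:

cov[x, y] = E[(x − µ_x)(y − µ_y)] = E[xy] − µ_x µ_y
ρ_xy = cov[x, y] / (σ_x σ_y)
If x, y independent, E[xy] = µ_x µ_y, so cov[x, y] = 0 and ρ_xy = 0.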


G. Cowan Statistical Data Analysis / Stat 1 32
Correlation (cont.)

G. Cowan Statistical Data Analysis / Stat 1 33


Error propagation
Suppose we measure a set of values
and we have the covariances
which quantify the measurement errors in the xi.

Now consider a function

What is the variance of

The hard way: use joint pdf to find the pdf

then from g(y) find V[y] = E[y²] − (E[y])².

Often not practical, may not even be fully known.

G. Cowan Statistical Data Analysis / Stat 1 34


Error propagation (2)
Suppose we had

in practice only estimates given by the measured

Expand to 1st order in a Taylor series about

To find V[y] we need E[y²] and E[y].

since

G. Cowan Statistical Data Analysis / Stat 1 35


Error propagation (3)

Putting the ingredients together gives the variance of
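
The standard first-order result is

V[y] ≈ Σ_{i,j} [ ∂y/∂x_i  ∂y/∂x_j ]_{x = µ}  V_ij.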

G. Cowan Statistical Data Analysis / Stat 1 36


Error propagation (4)
If the xi are uncorrelated, i.e., then this becomes

Similar for a set of m functions

or in matrix notation where
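
In symbols:

Uncorrelated (V_ij = δ_ij σ_i²):   σ_y² ≈ Σ_i (∂y/∂x_i)² σ_i²

For m functions y_1(x), ..., y_m(x):
U_kl ≈ Σ_{i,j} (∂y_k/∂x_i)(∂y_l/∂x_j) V_ij,   i.e.   U = A V Aᵀ with A_ij = ∂y_i/∂x_j.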

G. Cowan Statistical Data Analysis / Stat 1 37


Error propagation (5)
The ‘error propagation’ formulae tell us the
covariances of a set of functions in terms of
the covariances of the original variables.

Limitations: exact only if y(x) is linear.
Approximation breaks down if the function is
nonlinear over a region comparable in size to the σi.

N.B. We have said nothing about the exact pdf of the xi,
e.g., it doesn’t have to be Gaussian.
G. Cowan Statistical Data Analysis / Stat 1 38
Error propagation − special cases

That is, if the xi are uncorrelated:


add errors quadratically for the sum (or difference),
add relative errors quadratically for product (or ratio).
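
For example, for uncorrelated x1, x2:

y = x1 ± x2:            σ_y² = σ_1² + σ_2²
y = x1 x2 or x1/x2:     (σ_y / y)² ≈ (σ_1 / x1)² + (σ_2 / x2)²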

But correlations can change this completely...

G. Cowan Statistical Data Analysis / Stat 1 39


Error propagation − special cases (2)

Consider with

Now suppose ρ = 1. Then

i.e. for 100% correlation, error in difference → 0.
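
A minimal worked case, assuming y = x1 − x2 with equal errors σ1 = σ2 = σ
(the equal-error choice is an assumption made here for illustration):

V[y] = σ_1² + σ_2² − 2ρ σ_1 σ_2 = 2σ²(1 − ρ)   →   V[y] = 0 for ρ = 1.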

G. Cowan Statistical Data Analysis / Stat 1 40


Short catalogue of distributions
We will now run through a short catalogue of probability functions
and pdfs.
For each we (usually) show the expectation value and variance,
a plot, and discuss some properties and applications.
See also chapter on probability from pdg.lbl.gov
For a more complete catalogue see e.g. the handbook on
statistical distributions by Christian Walck from
www.fysik.su.se/~walck/suf9601.pdf

G. Cowan Statistical Data Analysis / Stat 1 41


Some distributions
Distribution/pdf Example use in HEP
Binomial Branching ratio
Multinomial Histogram with fixed N
Poisson Number of events found
Uniform Monte Carlo method
Exponential Decay time
Gaussian Measurement error
Chi-square Goodness-of-fit
Cauchy Mass of resonance
Landau Ionization energy loss
Beta Prior pdf for efficiency
Gamma Sum of exponential variables
Student’s t Resolution function with adjustable tails
G. Cowan Statistical Data Analysis / Stat 1 42
Binomial distribution
Consider N independent experiments (Bernoulli trials):
outcome of each is ‘success’ or ‘failure’,
probability of success on any given trial is p.
Define discrete r.v. n = number of successes (0 ≤ n ≤ N).

Probability of a specific outcome (in order), e.g. ‘ssfsf’ is

But order is not important; there are N!/(n!(N − n)!) ways
(permutations) to get n successes in N trials, so the total
probability for n is the sum of the probabilities for each permutation.

G. Cowan Statistical Data Analysis / Stat 1 43


Binomial distribution (2)
The binomial distribution is therefore

(n = random variable; N, p = parameters)
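
That is (the standard binomial pmf):

f(n; N, p) = [ N! / (n! (N − n)!) ] p^n (1 − p)^(N−n),   n = 0, 1, ..., N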

For the expectation value and variance we find:
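
These are the standard results:

E[n] = Np,   V[n] = Np(1 − p)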

G. Cowan Statistical Data Analysis / Stat 1 44


Binomial distribution (3)
Binomial distribution for several values of the parameters:

Example: observe N decays of W±, the number n of which are
W → µν is a binomial r.v., p = branching ratio.
G. Cowan Statistical Data Analysis / Stat 1 45
Multinomial distribution
Like binomial but now m outcomes instead of two, probabilities are

For N trials we want the probability to obtain:
n1 of outcome 1,
n2 of outcome 2,
...,
nm of outcome m.
This is the multinomial distribution for
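
That is,

f(n1, ..., nm; N, p1, ..., pm) = [ N! / (n1! n2! ··· nm!) ] p1^n1 p2^n2 ··· pm^nm,
with Σ_i ni = N and Σ_i pi = 1.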

G. Cowan Statistical Data Analysis / Stat 1 46


Multinomial distribution (2)
Now consider outcome i as ‘success’, all others as ‘failure’.
→ all ni individually binomial with parameters N, pi

for all i

One can also find the covariance to be
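
Explicitly:

E[ni] = N pi,   V[ni] = N pi (1 − pi)   for all i
cov[ni, nj] = −N pi pj   for i ≠ j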

Example: represents a histogram with m bins, N total entries,
all entries independent.

G. Cowan Statistical Data Analysis / Stat 1 47


Poisson distribution
Consider binomial n in the limit

→ n follows the Poisson distribution:

Example: number of scattering events n with cross section σ
found for a fixed integrated luminosity, with
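
That is, in the limit N → ∞, p → 0 with ν = Np constant:

f(n; ν) = (ν^n / n!) e^(−ν),   n = 0, 1, 2, ...;   E[n] = ν,   V[n] = ν

For the scattering example, ν = σ L_int, with L_int the integrated luminosity.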

G. Cowan Statistical Data Analysis / Stat 1 48


Uniform distribution
Consider a continuous r.v. x with -∞ < x < ∞ . Uniform pdf is:
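
The standard form:

f(x; α, β) = 1/(β − α)  for α ≤ x ≤ β,  0 otherwise

E[x] = (α + β)/2,   V[x] = (β − α)²/12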

N.B. For any r.v. x with cumulative distribution F(x),
y = F(x) is uniform in [0,1].

Example: for π0 → γγ, Eγ is uniform in [Emin, Emax], with

G. Cowan Statistical Data Analysis / Stat 1 49


Exponential distribution
The exponential pdf for the continuous r.v. x is defined by:

Example: proper decay time t of an unstable particle


(τ = mean lifetime)

Lack of memory (unique to exponential):
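
The standard form:

f(x; ξ) = (1/ξ) e^(−x/ξ)  for x ≥ 0;   E[x] = ξ,   V[x] = ξ²
Decay-time example: f(t; τ) = (1/τ) e^(−t/τ)
Lack of memory: P(t > t1 + t2 | t > t1) = P(t > t2)

A minimal C++ sketch (plain standard library, no ROOT; the lifetime value and
seed are arbitrary illustrations) generating exponential decay times by the
inverse-transform method, using the previous slide’s fact that y = F(x) is
uniform in [0,1]:

#include <cmath>
#include <cstdio>
#include <random>

int main() {
  const double tau = 2.2;                        // assumed mean lifetime, for illustration
  std::mt19937 rng(12345);                       // fixed seed for reproducibility
  std::uniform_real_distribution<double> uni(0.0, 1.0);

  const int n = 100000;
  double sum = 0.0;
  for (int i = 0; i < n; ++i) {
    double r = uni(rng);                         // r ~ Uniform[0,1)
    double t = -tau * std::log(1.0 - r);         // invert F(t) = 1 - exp(-t/tau)
    sum += t;
  }
  std::printf("sample mean = %.3f, expected tau = %.3f\n", sum / n, tau);
  return 0;
}

The sample mean should come out close to τ, as a quick check of the method.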


G. Cowan Statistical Data Analysis / Stat 1 50
Gaussian distribution
The Gaussian (normal) pdf for a continuous r.v. x is defined by:
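
The standard form:

f(x; µ, σ²) = 1/√(2πσ²) · exp[ −(x − µ)² / (2σ²) ];   E[x] = µ,   V[x] = σ²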

(N.B. often µ, σ² denote the mean and variance of any
r.v., not only Gaussian.)

Special case: µ = 0, σ² = 1 (‘standard Gaussian’):

If y ~ Gaussian with µ, σ², then x = (y − µ)/σ follows φ(x).

G. Cowan Statistical Data Analysis / Stat 1 51


Gaussian pdf and the Central Limit Theorem
The Gaussian pdf is so useful because almost any random
variable that is a sum of a large number of small contributions
follows it. This follows from the Central Limit Theorem:

For n independent r.v.s xi with finite variances σi², otherwise
arbitrary pdfs, consider the sum

In the limit n → ∞, y is a Gaussian r.v. with
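
That is, for y = Σ_i x_i:

E[y] = Σ_i µ_i,   V[y] = Σ_i σ_i²   (and y becomes Gaussian as n → ∞)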

Measurement errors are often the sum of many contributions, so
frequently measured values can be treated as Gaussian r.v.s.
G. Cowan Statistical Data Analysis / Stat 1 52
Central Limit Theorem (2)
The CLT can be proved using characteristic functions (Fourier
transforms), see, e.g., SDA Chapter 10.
For finite n, the theorem is approximately valid to the
extent that the fluctuation of the sum is not dominated by
one (or few) terms.

Beware of measurement errors with non-Gaussian tails.

Good example: velocity component vx of air molecules.


OK example: total deflection due to multiple Coulomb scattering.
(Rare large angle deflections give non-Gaussian tail.)
Bad example: energy loss of charged particle traversing thin
gas layer. (Rare collisions make up large fraction of energy loss,
cf. Landau pdf.)
G. Cowan Statistical Data Analysis / Stat 1 53
Multivariate Gaussian distribution
Multivariate Gaussian pdf for the vector

x, µ are column vectors; xᵀ, µᵀ are transpose (row) vectors.

For n = 2 this is

where ρ = cov[x1, x2]/(σ1σ2) is the correlation coefficient.
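
The standard forms:

f(x; µ, V) = (2π)^(−n/2) |V|^(−1/2) exp[ −½ (x − µ)ᵀ V⁻¹ (x − µ) ]

For n = 2:
f(x1, x2) = 1 / (2π σ1 σ2 √(1 − ρ²)) ×
  exp{ −1/(2(1 − ρ²)) [ (x1 − µ1)²/σ1² + (x2 − µ2)²/σ2² − 2ρ (x1 − µ1)(x2 − µ2)/(σ1 σ2) ] }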


G. Cowan Statistical Data Analysis / Stat 1 54
Chi-square (χ²) distribution
The chi-square pdf for the continuous r.v. z (z ≥ 0) is defined by
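
The standard form:

f(z; n) = 1 / (2^(n/2) Γ(n/2)) · z^(n/2 − 1) e^(−z/2),   z ≥ 0;   E[z] = n,   V[z] = 2n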

n = 1, 2, ... = number of ‘degrees of freedom’ (dof)

For independent Gaussian xi, i = 1, ..., n, with means µi and variances σi²,
the sum z = Σi (xi − µi)²/σi² follows the χ² pdf with n dof.

Example: goodness-of-fit test variable, especially in conjunction
with the method of least squares.
G. Cowan Statistical Data Analysis / Stat 1 55
Cauchy (Breit-Wigner) distribution
The Breit-Wigner pdf for the continuous r.v. x is defined by

(Γ = 2, x0 = 0 is the Cauchy pdf.)
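
The standard form:

f(x; Γ, x0) = (1/π) · (Γ/2) / [ (Γ/2)² + (x − x0)² ]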

E[x] not well defined, V[x] →∞.


x0 = mode (most probable value)
Γ = full width at half maximum

Example: mass of resonance particle, e.g. ρ, K*, φ0, ...


Γ = decay rate (inverse of mean lifetime)
G. Cowan Statistical Data Analysis / Stat 1 56
Landau distribution
For a charged particle with β = v/c traversing a layer of matter
of thickness d, the energy loss Δ follows the Landau pdf:

L. Landau, J. Phys. USSR 8 (1944) 201; see also


W. Allison and J. Cobb, Ann. Rev. Nucl. Part. Sci. 30 (1980) 253.
G. Cowan Statistical Data Analysis / Stat 1 57
Landau distribution (2)

Long ‘Landau tail’
→ all moments ∞

Mode (most probable value) sensitive to β
→ particle i.d.

G. Cowan Statistical Data Analysis / Stat 1 58


Beta distribution

Often used to represent pdf of continuous r.v. nonzero only
between finite limits.
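
The standard form, for 0 ≤ x ≤ 1:

f(x; α, β) = [ Γ(α + β) / (Γ(α) Γ(β)) ] x^(α−1) (1 − x)^(β−1)
E[x] = α/(α + β),   V[x] = αβ / [ (α + β)² (α + β + 1) ]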

G. Cowan Statistical Data Analysis / Stat 1 59


Gamma distribution

Often used to represent pdf of continuous r.v. nonzero only in [0, ∞).
Also e.g. sum of n exponential r.v.s or time until nth event
in a Poisson process ~ Gamma.
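
One common parametrization (assumed here), for x ≥ 0:

f(x; α, β) = [ 1 / (Γ(α) β^α) ] x^(α−1) e^(−x/β)
E[x] = αβ,   V[x] = αβ²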

G. Cowan Statistical Data Analysis / Stat 1 60


Student's t distribution

ν = number of degrees of freedom (not necessarily integer)
ν = 1 gives Cauchy,
ν → ∞ gives Gaussian.
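
The standard form:

f(t; ν) = [ Γ((ν + 1)/2) / ( √(νπ) Γ(ν/2) ) ] (1 + t²/ν)^(−(ν+1)/2)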

G. Cowan Statistical Data Analysis / Stat 1 61


Student's t distribution (2)
If x ~ Gaussian with µ = 0, σ² = 1, and
z ~ χ² with n degrees of freedom, then
t = x / (z/n)^{1/2} follows Student's t with ν = n.
This arises in problems where one forms the ratio of a sample
mean to the sample standard deviation of Gaussian r.v.s.
The Student's t provides a bell-shaped pdf with adjustable
tails, ranging from those of a Gaussian, which fall off very
quickly (ν → ∞, but in fact already very Gauss-like for
ν = two dozen), to the very long-tailed Cauchy (ν = 1).
Developed in 1908 by William Gosset, who worked under
the pseudonym "Student" for the Guinness Brewery.

G. Cowan Statistical Data Analysis / Stat 1 62


Extra slides

G. Cowan Statistical Data Analysis / Stat 1 63


Theory ↔ Statistics ↔ Experiment
Theory (model, hypothesis)  ↔  Experiment:
+ data selection
+ simulation of detector and cuts

G. Cowan Statistical Data Analysis / Stat 1 64


Data analysis in particle physics
Observe events (e.g., pp collisions) and for each, measure
a set of characteristics:
particle momenta, number of muons, energy of jets,...
Compare observed distributions of these characteristics to
predictions of theory. From this, we want to:
Estimate the free parameters of the theory:

Quantify the uncertainty in the estimates:

Assess how well a given theory stands in agreement
with the observed data:

To do this we need a clear definition of PROBABILITY


G. Cowan Statistical Data Analysis / Stat 1 65
Data analysis in particle physics:
testing hypotheses
Test the extent to which a given model agrees with the data:
ALEPH, Phys. Rept. 294 (1998) 1-165

(Figure: data compared with a spin-1/2 quark model, “good”, and a spin-0 quark model, “bad”.)

In general need tests
with well-defined properties
and quantitative results.

G. Cowan Statistical Data Analysis / Stat 1 66
