


Probabilities and Statistics - Lecture 8

Olariu E. Florentin

April 6, 2020


Table of contents

Continuous Random Variables
    Continuous Random Variables
    Remarkable Continuous Distributions

The fundamental laws
    The law of large numbers
        Markov and Tchebychev inequalities revisited
        Tchebychev's theorem
        The Law of Large Numbers
    The central limit theorem
        Normal approximation to the binomial distribution

Computer simulation
    Simulation of random variables
    Illustrations of LLN and CLT

Bibliography


Random Events

- When |Ω| > |R| (i.e., Ω has at least the cardinality of the continuum), random events are defined in a different manner.
- The most notable difference is that there may exist subsets A ⊆ Ω that are not random events: the family of random events forms a σ-algebra A ⊆ P(Ω):
    - ∅, Ω ∈ A;
    - if A1, A2 ∈ A, then A1 ∩ A2 ∈ A;
    - if (An)n≥1 ⊆ A, then ∪n≥1 An ∈ A.
- The probability function is defined only on A (satisfying the known axioms):

    P : A → [0, 1].
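For intuition in the finite case, the closure properties listed above can be checked mechanically (with Ω finite, countable unions reduce to finite ones, so pairwise checks suffice). This is an illustrative sketch, not part of the lecture; the function name and the toy families are invented:

```python
# Toy check of the closure properties above for a finite Omega.
# (Illustrative only: with Omega finite, countable unions reduce to
# finite unions, so pairwise checks suffice.)
def is_event_family(omega, family):
    fam = {frozenset(s) for s in family}
    if frozenset() not in fam or frozenset(omega) not in fam:
        return False                  # must contain the empty set and Omega
    for s in fam:
        for t in fam:
            if s & t not in fam or s | t not in fam:
                return False          # closed under intersection and union
    return True

omega = {1, 2, 3, 4}
print(is_event_family(omega, [set(), {1, 2}, {3, 4}, omega]))  # True
print(is_event_family(omega, [set(), {1}, {2}, omega]))        # False: {1} | {2} is missing
```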


Continuous Random Variables

- A function X : Ω → R is called a random variable if

    for every J ⊆ R, X⁻¹(J) ∈ A.

- A random variable X : Ω → R is called continuous if its distribution function is continuous (sometimes this definition refers to the situation when X(Ω) has the cardinality of the continuum).
- The distribution of such a variable is given by its distribution function:

    F : R → [0, 1], F(a) = P(X ≤ a),

- or by its probability density function, f : R → [0, +∞), such that F can be written as

    F(a) = P(X ≤ a) = ∫_{-∞}^{a} f(t) dt.
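To make the relation between f and F concrete, the integral can be approximated numerically. A minimal sketch, assuming the density f(t) = e^(−t) for t ≥ 0 (my choice, not from the slide), for which the exact distribution function is F(a) = 1 − e^(−a):

```python
import math

# Midpoint-rule approximation of F(a), the integral of f from -inf to a,
# for the assumed density f(t) = exp(-t) on t >= 0 (zero below 0).
def f(t):
    return math.exp(-t) if t >= 0 else 0.0

def F(a, dt=1e-4):
    if a <= 0:
        return 0.0
    n = round(a / dt)
    return sum(f((i + 0.5) * dt) * dt for i in range(n))

print(round(F(1.0), 4))  # exact value is 1 - 1/e, about 0.6321
```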


Continuous Random Variables

- Any function f : R → [0, +∞) such that ∫_{-∞}^{+∞} f(t) dt = 1 is the density function of some continuous random variable (or, simply, a continuous distribution).
- Using the probability density function we can compute (if the integrals exist) the expectation and the variance:

    E[X] = ∫_{-∞}^{+∞} t f(t) dt  and  Var[X] = ∫_{-∞}^{+∞} (t − E[X])² f(t) dt.

- If h : R → R is a real (say, continuous) function and X is a random variable with density f, then h(X) is a random variable with expected value

    E[h(X)] = ∫_{-∞}^{+∞} h(t) f(t) dt.
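The expectation and variance integrals above can be approximated by Riemann sums. A sketch for an assumed density f(t) = 3/t⁴ on t ≥ 1 (the same density appears as an example later in the lecture); the exact values are E[X] = 3/2 and Var[X] = 3/4:

```python
# Riemann-sum sketch of the expectation/variance formulas for the
# assumed density f(t) = 3/t**4 on t >= 1 (exact: E[X] = 3/2, Var[X] = 3/4).
def f(t):
    return 3.0 / t**4

dt = 0.01
mids = [1.0 + (i + 0.5) * dt for i in range(100_000)]  # midpoints covering [1, 1001]
mean = sum(t * f(t) * dt for t in mids)
second = sum(t * t * f(t) * dt for t in mids)
print(round(mean, 3), round(second - mean**2, 3))  # should be close to 1.5 and 0.75
```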


Continuous Random Variables

- The associated probabilities are computed as

    P(a < X ≤ b) = F(b) − F(a) = ∫_{a}^{b} f(t) dt,

  which is the area under the graph of f between t = a and t = b.
- If F is continuous, then P(X = a) = 0 and

    P(a ≤ X < b) = P(a < X ≤ b) = P(a < X < b).

- For a given random variable X : Ω → R, standardization consists in the following transformation of X:

    Y = (X − E[X]) / StDev[X].

- The new variable is "standard", that is,

    E[Y] = 0 and Var[Y] = 1.
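Standardization can be illustrated on a sample: subtracting the sample mean and dividing by the sample standard deviation yields mean 0 and variance 1. A sketch (the exponential rate 2.0 is an arbitrary assumption):

```python
import random
import statistics

# Standardization sketch: (x - mean) / stdev has mean ~0 and variance ~1.
random.seed(1)
xs = [random.expovariate(2.0) for _ in range(100_000)]
m, s = statistics.fmean(xs), statistics.pstdev(xs)
ys = [(x - m) / s for x in xs]
print(round(statistics.fmean(ys), 6), round(statistics.pvariance(ys), 6))  # mean ~0, variance ~1
```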


Continuous Random Variables - Examples

Example 1. The lifetime, in years, of some electronic component is a continuous random variable with the density

    f(x) = { k/x⁴,  x ≥ 1
           { 0,     x < 1

Find k, its distribution function, and the probability for the lifetime to exceed 2 years.

Solution. We must have f(t) ≥ 0 for all t ∈ R and ∫_{-∞}^{+∞} f(t) dt = 1, therefore k ≥ 0 and

    1 = ∫_{1}^{∞} k/t⁴ dt = [−k/(3t³)]_{1}^{∞} = k/3,

which gives k = 3.
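As a cross-check (a sketch, not part of the lecture), the answer can be verified by Monte Carlo simulation via inverse-transform sampling: since F(x) = 1 − x⁻³ for x ≥ 1, setting X = (1 − U)^(−1/3) with U ~ U(0, 1) samples this distribution, and the fraction of samples above 2 should come out near 1/8:

```python
import random

# Monte Carlo cross-check of Example 1: inverse-transform sampling from
# F(x) = 1 - x**-3 gives X = (1 - U)**(-1/3); estimate P(X > 2).
random.seed(0)
n = 200_000
samples = [(1.0 - random.random()) ** (-1.0 / 3.0) for _ in range(n)]
p = sum(x > 2 for x in samples) / n
print(round(p, 3))  # close to 1/8 = 0.125
```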


Continuous Random Variables - Examples

The distribution function is F(x) = ∫_{-∞}^{x} f(t) dt, therefore

    F(x) = { 1 − 1/x³,  x ≥ 1
           { 0,         x < 1

Let X be the lifetime of this electronic component; the probability that the lifetime exceeds 2 years is

    P(X > 2) = 1 − P(X < 2) = 1 − F(2) = 1/8

(because F is continuous).

Example 2. Let X be a continuous random variable with the following density function

    f(x) = { λx,  0 ≤ x ≤ 2
           { 0,   otherwise

Find λ, its distribution function, the expectation, and the variance of X.
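One way to check candidate answers for Example 2 numerically (a sketch, and a spoiler for the exercise): λ is fixed by the normalization constraint, and E[X], Var[X] follow from the integral formulas, here approximated by midpoint Riemann sums over [0, 2]:

```python
# Numeric check for Example 2: the integral of lam*x over [0, 2] must be 1,
# which forces lam = 1/2; then E[X] = 4/3 and Var[X] = 2/9.
dt = 1e-4
mids = [(i + 0.5) * dt for i in range(round(2 / dt))]
lam = 1.0 / sum(t * dt for t in mids)                    # integral of x over [0, 2] is 2
mean = sum(t * lam * t * dt for t in mids)
var = sum(t * t * lam * t * dt for t in mids) - mean**2
print(round(lam, 4), round(mean, 4), round(var, 4))  # near 0.5, 4/3, 2/9
```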


Continuous Random Variables - Examples

Example 3. The time, in minutes, it takes to reboot a certain system is a continuous random variable with the density

    f(x) = { C(10 − x)²,  0 < x < 10
           { 0,           otherwise

Compute C and the probability that it takes between 1 and 2 minutes to reboot.

Example 4. The lifetime, in years, of a certain HD is a continuous random variable with the density

    f(x) = { K − x/50,  0 < x < 10
           { 0,         otherwise

Find K, the probability of a failure within the first 5 years, and the expectation of the lifetime.
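A generic way to sanity-check exercises like Examples 3 and 4 is to estimate the normalization constant numerically (a sketch; the function name is invented). Shown for Example 3's shape g(x) = (10 − x)² on (0, 10), whose exact integral is 1000/3, so C = 3/1000:

```python
# Generic sketch: 1 / (integral of g over [a, b]) is the constant that
# rescales g into a density; midpoint rule with n subintervals.
def normalizing_constant(g, a, b, n=100_000):
    dt = (b - a) / n
    return 1.0 / sum(g(a + (i + 0.5) * dt) * dt for i in range(n))

C = normalizing_constant(lambda x: (10.0 - x) ** 2, 0.0, 10.0)
print(round(C, 6))  # close to 3/1000 = 0.003
```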


Remarkable Continuous Distributions

Uniform distribution. It is denoted by U(a, b) and has the density function

    f(x) = { 0,          x < a
           { 1/(b − a),  x ∈ [a, b]
           { 0,          x > b

If X : U(a, b), then E[X] = (a + b)/2 and Var[X] = (b − a)²/12.

U(0, 1) is called the standard uniform distribution.
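The stated mean and variance can be illustrated by simulation (a sketch; a = 2 and b = 5 are arbitrarily chosen):

```python
import random
import statistics

# Simulation sketch for U(2, 5): the sample mean should approach
# (a + b)/2 = 3.5 and the sample variance (b - a)**2 / 12 = 0.75.
random.seed(3)
xs = [random.uniform(2.0, 5.0) for _ in range(200_000)]
print(round(statistics.fmean(xs), 2), round(statistics.pvariance(xs), 2))
```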


Remarkable Continuous Distributions

Exponential distribution. It is abbreviated Exp(λ) and has the density function (λ > 0 is the rate parameter)

    f(x) = { 0,          x < 0
           { λe^(−λx),   x ≥ 0

If X : Exp(λ), then E[X] = 1/λ and Var[X] = 1/λ².

The exponential distribution is used to model waiting time, inter-arrival time, hardware lifetime, failure time; in a sequence of rare events, the time between events is exponentially distributed.

The exponential distribution is memoryless (the x minutes already waited are forgotten): regardless of the event X > x, when the total waiting time exceeds x, the remaining waiting time still has an exponential distribution:

    P(X > x + Δx | X > x) = P(X > Δx)

(why?).
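The memoryless property can be checked empirically (a sketch; λ = 0.5, x = 1, Δx = 2 are arbitrary choices). Both probabilities should land near e^(−λΔx):

```python
import math
import random

# Empirical sketch of memorylessness for Exp(0.5):
# P(X > x + dx | X > x) and P(X > dx) should nearly coincide.
random.seed(2)
lam, x, dx = 0.5, 1.0, 2.0
xs = [random.expovariate(lam) for _ in range(400_000)]
tail = [v for v in xs if v > x]
p_cond = sum(v > x + dx for v in tail) / len(tail)
p_plain = sum(v > dx for v in xs) / len(xs)
print(round(p_cond, 2), round(p_plain, 2), round(math.exp(-lam * dx), 2))
```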


Remarkable Continuous Distributions

Gaussian (normal) distribution. It is denoted by N(μ, σ²), with the density function

    f(t) = 1/(σ√(2π)) · e^(−(t − μ)²/(2σ²)).

If X : N(μ, σ²), then E[X] = μ and Var[X] = σ².

The distribution N(0, 1) is called the standard normal distribution.

The values of a normally distributed variable spread symmetrically around the mean: 68% fall within the interval [μ − σ, μ + σ], 95% within [μ − 2σ, μ + 2σ], and 99.7% within [μ − 3σ, μ + 3σ].
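The 68-95-99.7 rule above can be reproduced by simulation (a sketch; μ = 10 and σ = 2 are arbitrarily chosen):

```python
import random

# Simulation sketch of the 68-95-99.7 rule for N(10, 2**2): count the
# fraction of samples within k standard deviations of the mean.
random.seed(4)
mu, sigma = 10.0, 2.0
xs = [random.gauss(mu, sigma) for _ in range(300_000)]
fracs = [sum(abs(x - mu) <= k * sigma for x in xs) / len(xs) for k in (1, 2, 3)]
print([round(p, 3) for p in fracs])  # near [0.683, 0.954, 0.997]
```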


Remarkable Continuous Distributions

• The Normal distribution has a prominent role in Probability and Statistics for at least two reasons.

• As a consequence of the Central Limit Theorem (CLT, see below), sums and/or averages of identically distributed independent random variables have approximately a Normal distribution.

• The Normal distribution was found to be a good model for variables like temperature, weight, height, or even student grades.

• The Normal distribution was tacitly used by de Moivre as an approximation to the binomial distribution, and was later used by Laplace and Gauss.

Remarkable Continuous Distributions

Student (or $t$) distribution. It is denoted by $t(r)$, with density function
\[
f(x) = \frac{\Gamma\!\left(\frac{r+1}{2}\right)}{\sqrt{r\pi}\;\Gamma\!\left(\frac{r}{2}\right)} \left(1 + \frac{x^2}{r}\right)^{-\frac{r+1}{2}}, \qquad x \in \mathbb{R},
\]
where $\Gamma(y) = \int_0^{+\infty} x^{y-1} e^{-x}\, dx$. For a random variable $X : t(r)$, we have $E[X] = 0$ (for $r > 1$) and $\mathrm{Var}[X] = \dfrac{r}{r-2}$ (for $r > 2$).

The larger the number of degrees of freedom, the more the distribution looks like the standard normal distribution.
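As an illustrative check of the last remark, the density above can be coded with `math.lgamma` (the log-gamma function, which avoids overflow for large $r$) and compared with the standard normal density; the function names below are our own:

```python
import math

def t_density(x: float, r: int) -> float:
    """Density of the Student t distribution with r degrees of freedom."""
    # log-gamma keeps the Gamma ratio finite even for large r
    coef = math.exp(math.lgamma((r + 1) / 2) - math.lgamma(r / 2))
    coef /= math.sqrt(r * math.pi)
    return coef * (1 + x * x / r) ** (-(r + 1) / 2)

def normal_density(x: float) -> float:
    """Density of the standard normal distribution N(0, 1)."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

# As r grows, t(r) approaches N(0, 1): the gap at x = 0 shrinks
for r in (1, 5, 30, 1000):
    print(r, abs(t_density(0.0, r) - normal_density(0.0)))
```

For $r = 1$ the value at 0 is $1/\pi \approx 0.318$, well below the normal peak $1/\sqrt{2\pi} \approx 0.399$; for $r = 1000$ the gap is of order $10^{-4}$.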

Remarkable Continuous Distributions

Gamma distribution. It is denoted by $\Gamma(\alpha, \lambda)$, with density function
\[
f(x) =
\begin{cases}
\dfrac{\lambda^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha - 1} e^{-\lambda x}, & x > 0 \\[4pt]
0, & x \le 0,
\end{cases}
\]
where $\Gamma(t) = \int_0^{+\infty} x^{t-1} e^{-x}\, dx$. Here $\alpha$ is the shape parameter and $\lambda$ is the rate (or frequency) parameter. For a random variable $X : \Gamma(\alpha, \lambda)$, we have $E[X] = \dfrac{\alpha}{\lambda}$ and $\mathrm{Var}[X] = \dfrac{\alpha}{\lambda^2}$.

Suppose that we have a process consisting of $\alpha$ independent steps, and each step takes an $\mathrm{Exp}(\lambda)$ amount of time; then the total time has a Gamma distribution.

That is, for integer $\alpha$, the Gamma distribution is the distribution of a sum of $\alpha$ independent Exponential variables.
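This characterization suggests a simple simulation check: summing $\alpha$ independent $\mathrm{Exp}(\lambda)$ draws should give samples whose mean and variance are close to $\alpha/\lambda$ and $\alpha/\lambda^2$. A sketch with arbitrary illustrative parameters:

```python
import random
from statistics import mean, variance

def gamma_sample(alpha: int, lam: float, rng: random.Random) -> float:
    """One Gamma(alpha, lam) draw as a sum of alpha independent Exp(lam) times."""
    return sum(rng.expovariate(lam) for _ in range(alpha))

rng = random.Random(0)
alpha, lam = 4, 2.0
draws = [gamma_sample(alpha, lam, rng) for _ in range(100_000)]

print(mean(draws))      # close to alpha/lam = 2.0
print(variance(draws))  # close to alpha/lam**2 = 1.0
```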

Markov and Tchebychev inequalities

Proposition 1

Let $X \ge 0$ be a non-negative random variable. If $a > 0$, then
\[
P(X \ge a) \le \frac{E[X]}{a}.
\]

Proof:
\[
E[X] = \int_0^{+\infty} t f(t)\, dt = \int_0^{a} t f(t)\, dt + \int_a^{+\infty} t f(t)\, dt \ge \int_a^{+\infty} t f(t)\, dt \ge a \int_a^{+\infty} f(t)\, dt = a\, P(X \ge a).
\]
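The bound can be illustrated numerically, here for $X \sim \mathrm{Exp}(1)$ (so $E[X] = 1$); the sample size and thresholds below are arbitrary choices:

```python
import random

rng = random.Random(1)
samples = [rng.expovariate(1.0) for _ in range(100_000)]
e_x = sum(samples) / len(samples)  # empirical E[X], close to 1

for a in (1.0, 2.0, 4.0):
    # empirical tail probability vs. the Markov bound E[X]/a
    p = sum(x >= a for x in samples) / len(samples)
    print(f"P(X >= {a}) = {p:.4f} <= {e_x / a:.4f}")
```

For the exponential distribution the exact tail is $e^{-a}$, so the Markov bound is quite loose here; it is valuable precisely because it needs nothing beyond $E[X]$.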

Tchebychev's inequality

Proposition 2

Let $X$ be a random variable having expectation $\mu$ and variance $\sigma^2$. Then
\[
P(|X - \mu| \ge k) \le \frac{\sigma^2}{k^2}.
\]

Proof: Consider the variable $Y = (X - \mu)^2$ and $a = k^2$ in Markov's inequality:
\[
P(|X - \mu| \ge k) = P\!\left[ (X - \mu)^2 \ge k^2 \right] \le \frac{E\!\left[ (X - \mu)^2 \right]}{k^2} = \frac{\sigma^2}{k^2}.
\]
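As a quick illustration of how conservative this universal bound is, one can compare it with the exact two-sided tail of a normal variable, where for $k' = k/\sigma$ standard deviations the bound reads $1/k'^2$. A sketch using the standard library:

```python
from statistics import NormalDist

# For X ~ N(mu, sigma^2), compare the exact tail P(|X - mu| >= k*sigma)
# with the Chebyshev bound sigma^2 / (k*sigma)^2 = 1/k^2.
Z = NormalDist()
for k in (1.5, 2.0, 3.0):
    exact = 2 * (1 - Z.cdf(k))
    bound = 1 / k**2
    print(f"k = {k}: exact tail {exact:.4f} <= Chebyshev bound {bound:.4f}")
```

The bound holds for every distribution with finite variance, which is why it is so much weaker than the exact normal tail.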

Tchebychev's theorem

Theorem 1.1

Let $(X_n)_{n \ge 1}$ be a sequence of independent random variables having finite variances, uniformly bounded, that is, $\mathrm{Var}[X_n] \le c$ for every $n \ge 1$. Then, for every $\varepsilon > 0$,
\[
\lim_{n \to \infty} P\!\left( \left| \frac{1}{n} \sum_{i=1}^{n} X_i - \frac{1}{n} \sum_{i=1}^{n} E[X_i] \right| < \varepsilon \right) = 1.
\]

Proof: We know that
\[
E\!\left[ \frac{1}{n} \sum_{i=1}^{n} X_i \right] = \frac{1}{n} \sum_{i=1}^{n} E[X_i]
\quad \text{and} \quad
\mathrm{Var}\!\left[ \frac{1}{n} \sum_{i=1}^{n} X_i \right] = \frac{1}{n^2} \sum_{i=1}^{n} \mathrm{Var}[X_i] \le \frac{c}{n}.
\]

Tchebychev's theorem

Applying Tchebychev's inequality to the variable $\dfrac{1}{n} \displaystyle\sum_{i=1}^{n} X_i$ we get
\[
1 \ge P\!\left( \left| \frac{1}{n} \sum_{i=1}^{n} X_i - \frac{1}{n} \sum_{i=1}^{n} E[X_i] \right| < \varepsilon \right) \ge 1 - \frac{\mathrm{Var}\!\left[ \frac{1}{n} \sum_{i=1}^{n} X_i \right]}{\varepsilon^2} \ge 1 - \frac{c}{n \varepsilon^2}.
\]

Passing to the limit we obtain
\[
\lim_{n \to \infty} P\!\left( \left| \frac{1}{n} \sum_{i=1}^{n} X_i - \frac{1}{n} \sum_{i=1}^{n} E[X_i] \right| < \varepsilon \right) = 1. \qquad \square
\]

The weak law of large numbers

• The laws of large numbers say that as the number of identically distributed, randomly generated variables increases, their sample mean approaches their theoretical mean.

Theorem 1.2

(The weak law of large numbers, Khintchine's law) Let $(X_n)_{n \ge 1}$ be a sequence of identically distributed independent random variables having mean $\mu$ and variance $\sigma^2$. Then, for every $\varepsilon > 0$,
\[
\lim_{n \to \infty} P\!\left( \left| \frac{1}{n} \sum_{i=1}^{n} X_i - \mu \right| < \varepsilon \right) = 1
\quad \text{or} \quad
\lim_{n \to \infty} P\!\left( \left| \frac{1}{n} \sum_{i=1}^{n} X_i - \mu \right| \ge \varepsilon \right) = 0.
\]

Proof: It is a consequence of the previous theorem, since $\dfrac{1}{n} \displaystyle\sum_{i=1}^{n} E[X_i] = \mu$. $\square$
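A simulation sketch of the statement, using $\mathrm{Uniform}(0, 1)$ variables (so $\mu = 1/2$); the distribution and seed are arbitrary illustrative choices:

```python
import random

# The deviation |sample mean - mu| of n i.i.d. Uniform(0, 1) draws
# shrinks as n grows, illustrating the weak law of large numbers.
rng = random.Random(2)
mu = 0.5
for n in (100, 10_000, 1_000_000):
    mean_n = sum(rng.random() for _ in range(n)) / n
    print(n, abs(mean_n - mu))
```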

The strong law of large numbers

Theorem 1.3

(The strong law of large numbers) Let $(X_n)_{n \ge 1}$ be a sequence of identically distributed independent random variables having mean $\mu$ and variance $\sigma^2$. Then
\[
P\!\left( \lim_{n \to \infty} \frac{1}{n} \sum_{i=1}^{n} X_i = \mu \right) = 1.
\]

Proof: It is more complex and will be omitted. $\square$

An example - frequencies

• Bernoulli, who is credited with the first proof of the weak law of large numbers, formulated a version that addresses the Bernoulli distribution.

• Suppose that we have a random experiment and a related random event $A$ with $P(A) = p$.

• We repeat the experiment independently, and consider the following sequence of random variables: $X_i = 1$ if $A$ occurs at the $i$-th trial, and $X_i = 0$ otherwise.

• The variables are independent and identically Bernoulli-distributed, with expectation $p$.

An example - frequencies

• The law of large numbers says that, with probability 1,
\[
\frac{1}{n} \sum_{i=1}^{n} X_i \to p.
\]

• $\displaystyle\sum_{i=1}^{n} X_i$ is the number of occurrences of $A$ after $n$ trials.

• In other words, the law of large numbers says that $A$ occurs with frequency $p$.
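The convergence of relative frequencies can be simulated directly; the value of $p$ and the seed below are arbitrary choices:

```python
import random

# Relative frequency of an event A with P(A) = p over n independent trials;
# by the law of large numbers it approaches p as n grows.
rng = random.Random(3)
p = 0.3
for n in (100, 10_000, 1_000_000):
    hits = sum(rng.random() < p for _ in range(n))
    print(n, hits / n)
```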

Some History

• James Bernoulli proved the weak law of large numbers around 1700; Poisson generalized his result around 1800.

• Tchebychev discovered his inequality in 1866, and Markov extended Bernoulli's theorem to dependent random variables.

• In 1909 Émile Borel proved what today is known as the strong law of large numbers, which further generalizes Bernoulli's theorem.

• In 1926 Kolmogorov derived a more general condition that is sufficient for a set of mutually independent random variables to obey the law of large numbers. This condition is
\[
\sum_{n \ge 1} \frac{\mathrm{Var}[X_n]}{n^2} < +\infty.
\]

The central limit theorem

Theorem 2.1

(The central limit theorem, Lindeberg-Lévy) Let $(X_n)_{n \ge 1}$ be a sequence of identically distributed independent random variables having mean $\mu$ and variance $\sigma^2$. Then
\[
\frac{\dfrac{1}{n} \displaystyle\sum_{i=1}^{n} X_i - \mu}{\sigma / \sqrt{n}} \to N(0, 1)
\quad \text{or} \quad
\lim_{n \to \infty} P\!\left( \frac{\displaystyle\sum_{i=1}^{n} X_i - n\mu}{\sigma \sqrt{n}} \le a \right) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{a} e^{-t^2/2}\, dt.
\]

The central limit theorem

- The central limit theorem allows us to estimate probabilities for sums of independent variables.

- On the other hand, the theorem explains why so many processes (from the social sciences, biology, psychology, etc.) follow the normal law.

- Essentially, the central limit theorem says that, for large samples ($n \geq 30$), the variable

$$\frac{\sum_{i=1}^{n} X_i - n\mu}{\sigma\sqrt{n}}$$

follows a standard normal law, $N(0,1)$.

- The central limit theorem holds even for dependent variables, if their correlation is very small.
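The behaviour above can be checked numerically. The sketch below uses Python's standard library as a stand-in for R: it draws many standardized sums of Exp(1) variables (which have $\mu = \sigma = 1$) and checks that they look like $N(0,1)$ samples. The function name, sample sizes, and seed are illustrative choices, not part of the lecture.

```python
import math
import random

random.seed(0)

def standardized_sum(n, mu=1.0, sigma=1.0):
    """Sum n iid Exp(1) variables (mean 1, variance 1) and standardize."""
    s = sum(random.expovariate(1.0) for _ in range(n))
    return (s - n * mu) / (sigma * math.sqrt(n))

# Draw many standardized sums; by the CLT they should be roughly N(0, 1).
samples = [standardized_sum(100) for _ in range(5000)]
mean = sum(samples) / len(samples)
var = sum((z - mean) ** 2 for z in samples) / len(samples)
# P(Z <= 1) for a N(0, 1) variable is about 0.8413.
frac_le_1 = sum(z <= 1 for z in samples) / len(samples)
```

The empirical mean, variance, and tail fraction should be close to $0$, $1$, and $\Phi(1) \approx 0.8413$ respectively.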


Bernoulli's theorem

Proposition 3

Let $\nu_n$ be the number of occurrences of an event $A$ in $n$ independent trials of a random experiment. If $f_n = \nu_n / n$ is the relative frequency of occurrence of $A$, then the sequence $(f_n)_{n \geq 1}$ converges in probability to $p$, the probability of $A$.

Proof: $\nu_n = n f_n$ is a binomially distributed variable, hence $E[\nu_n] = np$ and $\mathrm{Var}[\nu_n] = np(1-p)$. Moreover, by Chebyshev's inequality,

$$P(|f_n - p| < \varepsilon) = P(|\nu_n - np| < n\varepsilon) = P(|\nu_n - E[\nu_n]| < n\varepsilon) \geq 1 - \frac{p(1-p)}{n\varepsilon^2}.$$

Obviously, $\lim_{n \to \infty} P(|f_n - p| < \varepsilon) = 1$, for every $\varepsilon > 0$. $\blacksquare$
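Bernoulli's theorem can be illustrated with a short simulation: the relative frequency of successes in Bernoulli($p$) trials should approach $p$ as the number of trials grows. This is a minimal Python sketch (the helper name, $p = 0.3$, and the seed are arbitrary choices).

```python
import random

random.seed(1)

def relative_frequency(n, p):
    """Relative frequency of successes in n independent Bernoulli(p) trials."""
    successes = sum(random.random() < p for _ in range(n))
    return successes / n

p = 0.3
# As n grows, f_n should get close to p (convergence in probability).
f_small = relative_frequency(100, p)
f_large = relative_frequency(100_000, p)
```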


Normal approximation to the binomial distribution

- Let $(X_n)$ be a sequence of independent Bernoulli($p$) variables.

- $X = \sum_{i=1}^{n} X_i$ is a binomially distributed variable, $B(n, p)$.

- Using the central limit theorem we get the de Moivre–Laplace theorem, which says that for large values of $n$ the variable

$$Y = \frac{X - E[X]}{\sqrt{\mathrm{Var}[X]}} = \frac{X - np}{\sqrt{np(1-p)}}$$

is a standard normally distributed variable ($N(0,1)$).

- The approximation is good for $np(1-p) > 10$.
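A quick sanity check of the approximation, using only Python's `math` module: for $B(100, 0.5)$ we have $np(1-p) = 25 > 10$, and $P(Y \leq 1) = P(X \leq 55)$ should be close to $\Phi(1)$. The parameter values are illustrative.

```python
import math

def binom_cdf(k, n, p):
    """Exact P(X <= k) for X ~ B(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def phi(z):
    """CDF of the standard normal N(0, 1), via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

n, p = 100, 0.5                     # np(1-p) = 25 > 10, approximation applies
mu, sd = n * p, math.sqrt(n * p * (1 - p))

exact = binom_cdf(55, n, p)          # P(X <= 55) = P(Y <= 1)
approx = phi((55 - mu) / sd)         # normal approximation, Phi(1)
```

The two values agree to about two decimal places; the continuity correction discussed on the next slides shrinks the gap further.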


Normal approximation to the binomial distribution

Theorem 2.2

(de Moivre–Laplace theorem) When $k$ is around $np$, as $n$ grows large we have

$$\binom{n}{k} p^k (1-p)^{n-k} \approx \frac{\exp\left( -\dfrac{(k-np)^2}{2np(1-p)} \right)}{\sqrt{2\pi np(1-p)}}.$$

- Consider the following example: let $X$ be the number of tail occurrences in 40 flips of a fair coin.

- What is $P(X = 20)$?

$$P(X = 20) = P(19.5 \leq X \leq 20.5) = P\left( \frac{19.5 - 20}{\sqrt{10}} \leq \frac{X - 20}{\sqrt{10}} \leq \frac{20.5 - 20}{\sqrt{10}} \right) =$$


The continuity correction

$$P\left( -0.16 \leq \frac{X - 20}{\sqrt{10}} \leq 0.16 \right) \approx \Phi(0.16) - \Phi(-0.16) = 0.1272,$$

where $\Phi(\cdot)$ is the distribution function of $N(0,1)$.

- Continuity correction is an adjustment that is made whenever a discrete distribution is approximated by a continuous one.

- $P(X = 10) = P(9.5 \leq X \leq 10.5)$, $P(X > 15) = P(X \geq 15.5)$, $P(X < 13) = P(X \leq 12.5)$.
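The coin-flip example can be verified numerically. The sketch below computes $P(X = 20)$ for $X \sim B(40, 1/2)$ three ways: exactly, with the continuity-corrected normal approximation, and with the local de Moivre–Laplace formula. (The slide's value 0.1272 comes from rounding the bound $0.5/\sqrt{10} \approx 0.158$ up to $0.16$; the unrounded approximation is slightly closer to the exact probability.)

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

n, p, k = 40, 0.5, 20
mu = n * p                      # 20
var = n * p * (1 - p)           # 10

# Exact binomial probability P(X = 20).
exact = math.comb(n, k) * p**k * (1 - p)**(n - k)

# Continuity correction: P(X = 20) ~ P(19.5 <= X <= 20.5) under N(mu, var).
cc = phi((k + 0.5 - mu) / math.sqrt(var)) - phi((k - 0.5 - mu) / math.sqrt(var))

# Local de Moivre-Laplace formula for the individual probability.
local = math.exp(-(k - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)
```

All three values are close to 0.125.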


Generate uniform random numbers

- When we talk about random numbers we often mean values of a uniform random variable.

- There are two types of uniform random variables: discrete and continuous.

- For example, in order to choose uniformly at random an integer between 1 and $n$ (sometimes between 0 and $n-1$) we have to generate a value of a discrete random variable $U_n$.

- On the other hand, for choosing uniformly at random a number in $[0,1]$ we have to generate a value of a continuous random variable $U_{[0,1]}$.

- Generally speaking, to simulate a certain random variable means to generate values that follow its distribution.
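Both kinds of uniform generation are one-liners in most languages. This Python sketch mirrors what the lecture later does in R with sample() and runif(); $n = 6$ and the seed are arbitrary.

```python
import random

random.seed(2)

n = 6
# Discrete uniform on {1, ..., n} (analogous to R's sample(1:n, 1)).
discrete_draws = [random.randint(1, n) for _ in range(10_000)]

# Continuous uniform on [0, 1] (analogous to R's runif(1)).
continuous_draws = [random.random() for _ in range(10_000)]

mean_d = sum(discrete_draws) / len(discrete_draws)      # expected (n+1)/2 = 3.5
mean_c = sum(continuous_draws) / len(continuous_draws)  # expected 0.5
```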


Generate random numbers

- Almost every programming language contains random number generators of both types; we will use the random number generators from R.

- We review the R commands for commonly employed discrete and continuous distributions.

- Functions whose names start with p, q, d and r give, respectively, the (cumulative) distribution function (CDF), the inverse CDF (quantile function), the probability density function (PDF), and (a value of) a random variable having the specified distribution.

- For generating discrete uniform random numbers one can use the sample() function.


Generate random numbers

Distribution   CDF (p)    inverse CDF (q)   PDF (d)    random (r)

Binomial       pbinom()   qbinom()          dbinom()   rbinom()
Geometric      pgeom()    qgeom()           dgeom()    rgeom()
Poisson        ppois()    qpois()           dpois()    rpois()
Uniform        punif()    qunif()           dunif()    runif()
Exponential    pexp()     qexp()            dexp()     rexp()
Normal         pnorm()    qnorm()           dnorm()    rnorm()
Student        pt()       qt()              dt()       rt()
Gamma          pgamma()   qgamma()          dgamma()   rgamma()

- You can find details about all these functions using help(name) in R or RStudio.
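To make the p/q/d/r convention concrete, here is a minimal Python sketch of the four functions for the standard normal distribution, named after their R counterparts. The bisection-based quantile function is a simple illustrative choice, not how R implements qnorm.

```python
import math
import random

def dnorm(x):
    """PDF of N(0, 1) (analogue of R's dnorm)."""
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def pnorm(x):
    """CDF of N(0, 1) (analogue of R's pnorm)."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def qnorm(u, lo=-10.0, hi=10.0):
    """Inverse CDF of N(0, 1) by bisection (analogue of R's qnorm)."""
    for _ in range(80):              # pnorm is increasing, so bisection works
        mid = (lo + hi) / 2.0
        if pnorm(mid) < u:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def rnorm():
    """One N(0, 1) draw (analogue of R's rnorm)."""
    return random.gauss(0.0, 1.0)
```

By construction, qnorm(pnorm(x)) recovers x, matching the CDF / inverse-CDF pairing of the table.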


Generate random numbers

- In order to simulate a discrete random variable all we need to know is its distribution:

$$X : \begin{pmatrix} x_1 & x_2 & \ldots & x_k & \ldots \\ p_1 & p_2 & \ldots & p_k & \ldots \end{pmatrix}$$

- We simulate $X$ as follows: we generate a uniform random number $U$ and return $x_i$ if

$$\sum_{j=1}^{i-1} p_j \leq U < \sum_{j=1}^{i} p_j.$$
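This inverse-transform rule translates directly into code. A minimal Python sketch (the function name, the example distribution, and the seed are illustrative):

```python
import random

random.seed(3)

def simulate_discrete(values, probs):
    """Return x_i when U falls in [p_1+...+p_{i-1}, p_1+...+p_i)."""
    u = random.random()
    cumulative = 0.0
    for x, p in zip(values, probs):
        cumulative += p
        if u < cumulative:
            return x
    return values[-1]   # guard against floating-point round-off

values = [1, 2, 3]
probs = [0.2, 0.5, 0.3]
draws = [simulate_discrete(values, probs) for _ in range(20_000)]
freq2 = draws.count(2) / len(draws)   # should be near p_2 = 0.5
```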


Illustrations of LLN and CLT

Example 1. (LLN – Buffon's needle problem) The problem (stated in 1733 and first solved in 1777 by the French naturalist and mathematician Comte de Buffon) asks to find the probability that a needle of length $l$ will cross a line, given a flat surface ruled with equally spaced parallel lines at distance $2d$.

Suppose that the needle length is less than the distance between the lines (the easiest situation to analyse); there are two variables that determine the position of the needle relative to the closest line: the angle $x \in [0, \pi]$ at which the needle falls, and the distance $y$ from the middle of the needle to this line.

The needle crosses the closest line if and only if $y \leq (l/2) \sin x$.


Illustrations of LLN and CLT

All the cases are completely described by the pairs $(x, y) \in [0, \pi] \times [0, d]$, and the favorable cases are the pairs belonging to the area under the graph of the function $f : [0, \pi] \to \mathbb{R}$, $f(x) = (l/2) \sin x$. Thus, the probability is

$$\frac{\displaystyle\int_0^{\pi} f(x)\, dx}{\pi \cdot d} = \frac{1}{\pi d} \int_0^{\pi} \frac{l}{2} \sin x \, dx = \frac{l}{2\pi d} \left[ -\cos x \right]_0^{\pi} = \frac{l}{\pi d}.$$

For $l = d = 1$, that is, when the needle length is half of the distance between the lines, the probability is $1/\pi$.


Illustrations of LLN and CLT

Introduce the random experiment of launching the needle and define a Bernoulli variable $X$ with value 1 if and only if the needle crosses a line; the probability of success, and hence the expectation of $X$, is $1/\pi$ (for $l = d = 1$). If we independently repeat this experiment $n$ times we get a sample of size $n$, $(X_i)_{i=1,\ldots,n}$. By the Law of Large Numbers $\bar{x}_n \to 1/\pi$; thus, for large enough values of $n$,

$$\bar{x}_n = \frac{\text{number of successes}}{n} \approx \frac{1}{\pi}.$$

This kind of relation can be used to obtain an experimental approximation of $\pi$. Several needle casters have already performed this experiment.
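Instead of casting a physical needle, the experiment can be run as a Monte Carlo simulation. The sketch below follows the slide's geometry for $l = d = 1$ (angle $x$ uniform on $[0, \pi]$, midpoint distance $y$ uniform on $[0, d]$, crossing iff $y \leq (l/2)\sin x$); the function name, $n$, and the seed are arbitrary.

```python
import math
import random

random.seed(4)

def buffon_crosses(l=1.0, d=1.0):
    """One needle throw: lines 2*d apart, needle length l."""
    x = random.uniform(0.0, math.pi)   # angle of the needle
    y = random.uniform(0.0, d)         # distance from needle midpoint to closest line
    return y <= (l / 2.0) * math.sin(x)

n = 200_000
hits = sum(buffon_crosses() for _ in range(n))
pi_estimate = n / hits                 # relative frequency ~ 1/pi, so n/hits ~ pi
```

With a couple hundred thousand throws the estimate typically lands within a few hundredths of $\pi$.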


Illustrations of LLN and CLT

Example 2. (Verifying LLN) Consider a given probability distribution, X, having mean μ and variance σ², and a sequence of n independent, identically distributed random variables X_i, i = 1,…,n. The Law of Large Numbers says that, in a certain probabilistic sense, the sample mean converges to the known mean:

x̄_n → μ as n → ∞.

Let us verify this law using the Poisson distribution with different λ parameters (for a Poisson distribution, μ = λ).

  λ      2      3      4      6      8      12      15
  x̄_n    1.955  2.977  4.003  6.027  8.018  12.093  14.925

We observe that the resulting statistics (n = 5000) are very close to the corresponding known expectations. (The samples are obtained using rpois(n, λ).)
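The same check can be re-run outside R; a minimal Python sketch, with numpy's rng.poisson playing the role of rpois (seed arbitrary):

```python
# LLN check for Poisson: sample means vs. the known mean mu = lambda.
import numpy as np

rng = np.random.default_rng(42)  # arbitrary seed
n = 5000

results = {lam: rng.poisson(lam, n).mean() for lam in [2, 3, 4, 6, 8, 12, 15]}
for lam, xbar in results.items():
    print(lam, round(xbar, 3))   # each xbar lands close to lam
```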


Illustrations of LLN and CLT

If we repeat the former test with the Gamma distribution for different pairs of (α, λ) parameters (the expectation is μ = α/λ) we get

  α      2      2      3      4      6      6      6      12
  λ      1.5    2      2      3      5      4      8      4
  x̄_n    1.361  1.009  1.489  1.345  1.204  1.501  0.752  2.973
  μ      1.333  1.000  1.500  1.333  1.200  1.500  0.750  3.000

The resulting sample means (n = 5000) are very close to the corresponding expectations. (The samples are obtained using rgamma(n, α, λ).)
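The Gamma run translates the same way; one detail worth flagging is that numpy parameterises the Gamma by shape and scale = 1/rate, while the slide's (α, λ) uses the rate λ. A sketch (seed arbitrary):

```python
# LLN check for Gamma(alpha, lambda): sample means vs. mu = alpha/lambda.
# Note: numpy uses scale = 1/rate, so scale = 1/lam below.
import numpy as np

rng = np.random.default_rng(7)  # arbitrary seed
n = 5000

pairs = [(2, 1.5), (2, 2), (3, 2), (4, 3), (6, 5), (6, 4), (6, 8), (12, 4)]
means = {(a, lam): rng.gamma(shape=a, scale=1 / lam, size=n).mean()
         for a, lam in pairs}
for (a, lam), xbar in means.items():
    print(a, lam, round(xbar, 3), round(a / lam, 3))
```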


Illustrations of LLN and CLT

Example 3. (CLT – de Moivre–Laplace) The ideal size of a first-year class at a particular college is 150 students. The college, knowing from past experience that, on average, only 30% of those accepted for admission will actually attend, uses a policy of approving the applications of 450 students. Compute the probability that at least 151 first-year students attend this college.

Solution. Let X be the number of students that attend; we can assume that each accepted applicant attends independently. Thus X : B(450, 0.3), with np = 135 and np(1 − p) = 450 · 0.3 · 0.7 = 94.5, and, with the continuity correction,

P(X > 150) = P(X > 150.5) = P( (X − np)/√(np(1 − p)) > (150.5 − np)/√(np(1 − p)) )
           = P( (X − 135)/√94.5 > 15.5/√94.5 ) ≈ P(Z > 1.594),

where Z : N(0, 1). Hence P(X > 150) ≈ 1 − pnorm(1.594) ≈ 0.055.
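The quality of this approximation can be checked against the exact binomial tail; a stdlib-only Python sketch (math.comb for the exact sum, math.erf standing in for pnorm):

```python
# Exact binomial tail P(X >= 151) for B(450, 0.3) vs. the de Moivre-Laplace
# approximation with continuity correction.
from math import comb, erf, sqrt

n, p = 450, 0.3
mu, sigma = n * p, sqrt(n * p * (1 - p))   # 135 and sqrt(94.5)

exact = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(151, n + 1))

def Phi(x):  # standard normal CDF, i.e. pnorm
    return 0.5 * (1 + erf(x / sqrt(2)))

approx = 1 - Phi((150.5 - mu) / sigma)

print(round(exact, 4), round(approx, 4))   # the two values agree closely
```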


Illustrations of LLN and CLT

Example 4. (CLT) The weights of a population of workers have mean 167 and standard deviation 27. If a sample of 36 workers is chosen, what is the probability that the sample mean of their weights lies between 163 and 170?

Solution. Let us denote by x̄_n the sample mean; by the CLT, (x̄_n − μ)/(σ/√n) approximately follows a standard normal distribution, where here σ/√n = 27/√36 = 4.5. Therefore

P(163 ≤ x̄_n ≤ 170) = P( (163 − 167)/4.5 ≤ (x̄_n − 167)/4.5 ≤ (170 − 167)/4.5 )
                    = P( −0.889 ≤ (x̄_n − 167)/4.5 ≤ 0.667 ) ≈ P(−0.889 ≤ Z ≤ 0.667)
                    = pnorm(0.667) − pnorm(−0.889) ≈ 0.7475 − 0.1870 = 0.5605.
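The arithmetic is easy to reproduce without statistical tables; a small Python sketch (math.erf standing in for pnorm). Note the two z-scores are (163 − 167)/4.5 ≈ −0.889 and (170 − 167)/4.5 ≈ 0.667:

```python
# P(163 <= xbar <= 170) with xbar approximately N(167, (27/6)^2), via the CLT.
from math import erf, sqrt

mu, sigma, n = 167, 27, 36
se = sigma / sqrt(n)            # 4.5

def Phi(x):  # standard normal CDF, i.e. pnorm
    return 0.5 * (1 + erf(x / sqrt(2)))

lo = (163 - mu) / se            # about -0.889
hi = (170 - mu) / se            # about 0.667
prob = Phi(hi) - Phi(lo)
print(round(prob, 4))
```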


Illustrations of LLN and CLT

Example 5. (Verifying CLT) Consider a given probability dis-tribution, X , with mean � and variance �2, and a sequence of nindependent identical distributed random variables Xi , i = 1;n .According to CLT, for large n , the sample mean, xn , has a nor-mal distribution, N (�; �2=n).We want to verify this assertion and take N such sample meansand build a histogram. For our examples we used the geometricdistribution G(0:35) and the Exponential distribution Exp(5)(n = 50, N = 10000).
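A numerical version of this experiment (Python sketch, seed arbitrary): instead of drawing the histogram, we compare the empirical mean and standard deviation of the N sample means with the CLT prediction N(μ, σ²/n). One caveat: numpy's geometric counts trials until the first success (mean 1/p), whereas R's rgeom counts failures (mean (1 − p)/p); the trial-counting convention is assumed here.

```python
# N sample means of size-n samples; their spread should match sigma/sqrt(n).
import numpy as np

rng = np.random.default_rng(1)  # arbitrary seed
n, N = 50, 10_000

# Exp(5): mu = sigma = 1/5
means_exp = rng.exponential(scale=1 / 5, size=(N, n)).mean(axis=1)
print(means_exp.mean(), means_exp.std())   # near 0.2 and 0.2/sqrt(50)

# G(0.35), trials-until-success convention: mu = 1/p, sigma = sqrt(1-p)/p
p = 0.35
means_geo = rng.geometric(p, size=(N, n)).mean(axis=1)
print(means_geo.mean(), means_geo.std())   # near 1/p and sqrt(1-p)/(p*sqrt(n))
```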


Illustrations of LLN and CLT

[Figures omitted in this transcript: histograms of the N = 10000 sample means for G(0.35) and for Exp(5), each close to the approximating normal density N(μ, σ²/n).]

Illustrations of LLN and CLT

Example 6. (Verifying CLT) Consider a given probability distribution, X, having mean μ and variance σ², and a sequence of n independent, identically distributed random variables X_i, i = 1,…,n. This sequence can always be viewed as a sample; if x̄_n is the sample mean, the CLT says that

lim_{n→∞} P( (x̄_n − μ)/(σ/√n) ≤ z ) = P(Z ≤ z),

where Z : N(0, 1). Usually, for large values of n, we can make the following approximation:

P_n(z) = P( (x̄_n − μ)/(σ/√n) ≤ z ) ≈ P(Z ≤ z).

A method to verify whether this approximation is a good one: choose independently N samples (sequences) (X_i^k), i = 1,…,n, k = 1,…,N, and compute

P_N = |{k : x̄_n^k ≤ z·σ/√n + μ}| / N.


Illustrations of LLN and CLT

That is, P_N is the number of samples (out of those N) satisfying the inequality (x̄_n − μ)/(σ/√n) ≤ z, divided by the total number of samples. This statistic should approximate P(Z ≤ z). For the Exponential distribution with λ = 2, n = 50, and N = 2000 the results are tabulated below (a sample of size n can be obtained with rexp(n, λ)).

  z          −1.5   −1.0   −0.5   0      0.5    1.0    1.5
  P_N(z)     0.055  0.154  0.313  0.509  0.723  0.831  0.931
  Rel. err   16%    2.5%   1.6%   1.8%   4.6%   1.8%   0.2%
  pnorm(z)   0.066  0.158  0.308  0.5    0.691  0.847  0.933

The reported error is the relative error, |P(Z ≤ z) − P_N(z)| / P(Z ≤ z).


Illustrations of LLN and CLT

For computing P_N(z) we used the following algorithm:

  μ ← 1/λ; σ ← 1/λ;          // why? (for Exp(λ), mean and standard deviation are both 1/λ)
  c ← z · σ/√n + μ;
  j ← 0;
  for (i = 1, N)
      if (mean(rexp(n, λ)) ≤ c)
          j++;
  return j/N;
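A direct Python transcription of this algorithm (a sketch: numpy's rng.exponential(scale = 1/λ) in place of rexp(n, λ); parameter defaults mirror the table above, seed arbitrary):

```python
# Empirical CDF P_N(z) of the standardised sample mean, for Exp(lambda) data.
import numpy as np
from math import erf, sqrt

def Phi(z):  # standard normal CDF, i.e. pnorm
    return 0.5 * (1 + erf(z / sqrt(2)))

def P_N(z, lam=2.0, n=50, N=2000, seed=0):
    rng = np.random.default_rng(seed)
    mu = sigma = 1 / lam                    # for Exp(lambda), mean = sd = 1/lambda
    c = z * sigma / sqrt(n) + mu
    j = 0
    for _ in range(N):                      # count samples with xbar <= c
        if rng.exponential(scale=1 / lam, size=n).mean() <= c:
            j += 1
    return j / N

for z in [-1.5, -1.0, -0.5, 0, 0.5, 1.0, 1.5]:
    print(z, P_N(z), round(Phi(z), 3))      # the two columns stay close
```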


End

