STATISTICAL INFERENCE PART II: POINT ESTIMATION


TRANSCRIPT

Page 1: STATISTICAL INFERENCE PART II POINT ESTIMATION


STATISTICAL INFERENCE PART II

POINT ESTIMATION

Page 2: STATISTICAL INFERENCE PART II POINT ESTIMATION


SUFFICIENT STATISTICS

• Let X be a rv with pdf or pmf f(x; θ), θ ∈ Ω.

• Let X1, X2,…,Xn be a sample of rvs.

• Y = U(X1, X2,…,Xn) is a statistic.

• A sufficient statistic, Y, is a statistic which contains all the information for the estimation of θ.

Page 3: STATISTICAL INFERENCE PART II POINT ESTIMATION


SUFFICIENT STATISTICS

• Given the value of Y, the sample contains no further information for the estimation of θ.

• Y is a sufficient statistic (ss) for θ if the conditional distribution h(x1,x2,…,xn|y) does not depend on θ for every given Y=y.

• A ss for θ is not unique.

• If Y is a ss for θ, then a 1-1 transformation of Y, say Y1=fn(Y), is also a ss for θ.

Page 4: STATISTICAL INFERENCE PART II POINT ESTIMATION


SUFFICIENT STATISTICS

• The conditional distribution of the sample rvs, given the value y of Y, is defined as

h(x1, x2,…,xn | y) = f(x1, x2,…,xn, y; θ) / g(y; θ) = L(x1, x2,…,xn; θ) / g(y; θ),

where g(y; θ) is the pdf or pmf of Y.

• If Y is a ss for θ, then

h(x1, x2,…,xn | y) = L(x1, x2,…,xn; θ) / g(y; θ) = H(x1, x2,…,xn),

which does not depend on θ for every given y; H may include y or constants, but not θ.

• Also, the conditional range of Xi given y does not depend on θ.

Page 5: STATISTICAL INFERENCE PART II POINT ESTIMATION


SUFFICIENT STATISTICS

EXAMPLE: X~Ber(p). For a r.s. of size n, show that Y = X1 + X2 + … + Xn is a ss for p.
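
As a worked check via the conditional-distribution definition (a standard derivation, sketched here for completeness): with Y = X1 + … + Xn ~ Bin(n, p),

```latex
% Joint pmf of the Bernoulli sample: f(x_1,...,x_n;p) = p^{\sum x_i}(1-p)^{n-\sum x_i}
% pmf of Y = \sum_i X_i:            g(y;p) = \binom{n}{y} p^{y}(1-p)^{n-y}
\[
h(x_1,\dots,x_n \mid y)
  = \frac{p^{y}(1-p)^{n-y}}{\binom{n}{y}\,p^{y}(1-p)^{n-y}}
  = \frac{1}{\binom{n}{y}},
\]
% which is free of p, so Y is a ss for p.
```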

Page 6: STATISTICAL INFERENCE PART II POINT ESTIMATION


SUFFICIENT STATISTICS

• Neyman’s Factorization Theorem: Y is a ss for θ iff

L(x1, x2,…,xn; θ) = k1(y; θ) k2(x1, x2,…,xn)

where k1 and k2 are non-negative functions and k2 does not depend on θ or y.

Here L is the likelihood function; k1 depends on the sample only through y (it contains no other function of the xi), and k2 does not depend on θ for every given y (the same must hold over the conditional range of the xi).

Page 7: STATISTICAL INFERENCE PART II POINT ESTIMATION


EXAMPLES

1. X~Ber(p). For a r.s. of size n, find a ss for p, if one exists.
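
A sketch of the factorization argument (standard computation): with y = x1 + … + xn,

```latex
\[
L(x_1,\dots,x_n;p) = \prod_{i=1}^{n} p^{x_i}(1-p)^{1-x_i}
  = \underbrace{p^{y}(1-p)^{n-y}}_{k_1(y;\,p)}
    \cdot \underbrace{1}_{k_2(x_1,\dots,x_n)},
\qquad y=\sum_{i=1}^{n} x_i,
\]
% so Y = \sum_i X_i is a ss for p, with k_2 constant.
```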

Page 8: STATISTICAL INFERENCE PART II POINT ESTIMATION


EXAMPLES

2. X~Beta(θ,2). For a r.s. of size n, find a ss for θ.
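
A sketch via factorization, using the Beta(θ, 2) density f(x; θ) = θ(θ+1) x^(θ−1)(1−x) for 0 < x < 1 (since 1/B(θ,2) = Γ(θ+2)/(Γ(θ)Γ(2)) = θ(θ+1)):

```latex
\[
L(\theta) = \prod_{i=1}^{n} \theta(\theta+1)\,x_i^{\theta-1}(1-x_i)
  = \underbrace{\big[\theta(\theta+1)\big]^{n}\Big(\prod_{i} x_i\Big)^{\theta-1}}_{k_1\big(\prod_i x_i;\,\theta\big)}
    \cdot \underbrace{\prod_{i}(1-x_i)}_{k_2},
\]
% so Y = \prod_i X_i (equivalently \sum_i \log X_i) is a ss for \theta.
```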

Page 9: STATISTICAL INFERENCE PART II POINT ESTIMATION


SUFFICIENT STATISTICS

• A ss may not exist.

• Jointly ss Y1, Y2,…,Yk may be needed. Example: Example 10.2.5 in Bain and Engelhardt (page 342 in 2nd edition): X(1) and X(n) are jointly ss for θ.

• If the MLE of θ exists and is unique, and if a ss for θ exists, then the MLE is a function of a ss for θ.

Page 10: STATISTICAL INFERENCE PART II POINT ESTIMATION


EXAMPLE

X~N(μ, σ²). For a r.s. of size n, find jss for μ and σ².
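
A sketch via factorization (standard normal-likelihood algebra): the likelihood depends on the sample only through (Σ xi, Σ xi²),

```latex
\[
L(\mu,\sigma^2)
 = (2\pi\sigma^2)^{-n/2}\exp\!\Big(-\tfrac{1}{2\sigma^2}\sum_{i}(x_i-\mu)^2\Big)
 = (2\pi\sigma^2)^{-n/2}\exp\!\Big(-\tfrac{\sum_i x_i^2 - 2\mu\sum_i x_i + n\mu^2}{2\sigma^2}\Big),
\]
% so (\sum_i X_i, \sum_i X_i^2) -- equivalently (\bar{X}, S^2) -- are jss for (\mu, \sigma^2), with k_2 = 1.
```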

Page 11: STATISTICAL INFERENCE PART II POINT ESTIMATION

MINIMAL SUFFICIENT STATISTICS

• If S(x) = (s1(x),…,sk(x)) is a ss for θ, then S*(x) = (s0(x), s1(x),…,sk(x)) is also a ss for θ. But the first one does a better job in data reduction. A minimal ss achieves the greatest possible reduction.

Page 12: STATISTICAL INFERENCE PART II POINT ESTIMATION


MINIMAL SUFFICIENT STATISTICS

• A ss T(X) is called a minimal ss if, for any other ss T’(X), T(x) is a function of T’(x).

• THEOREM: Let f(x; θ) be the pmf or pdf of a sample X1, X2,…,Xn. Suppose there exists a function T(x) such that, for any two sample points x1, x2,…,xn and y1, y2,…,yn, the ratio

f(x1, x2,…,xn; θ) / f(y1, y2,…,yn; θ)

is constant as a function of θ iff T(x) = T(y). Then, T(X) is a minimal sufficient statistic for θ.

Page 13: STATISTICAL INFERENCE PART II POINT ESTIMATION


EXAMPLE

• X~N(μ, σ²) where σ² is known. For a r.s. of size n, find a minimal ss for μ.

Note: A minimal ss is also not unique. Any 1-to-1 function of a minimal ss is also a minimal ss.
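
A sketch of the ratio criterion for this case (standard computation, σ² known): for two sample points x and y,

```latex
\[
\frac{f(x_1,\dots,x_n;\mu)}{f(y_1,\dots,y_n;\mu)}
 = \exp\!\Big(\frac{\mu}{\sigma^{2}}\Big(\sum_i x_i-\sum_i y_i\Big)
   -\frac{\sum_i x_i^{2}-\sum_i y_i^{2}}{2\sigma^{2}}\Big),
\]
% constant in \mu iff \sum_i x_i = \sum_i y_i,
% so T(X) = \sum_i X_i (or \bar{X}) is a minimal ss for \mu.
```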

Page 14: STATISTICAL INFERENCE PART II POINT ESTIMATION


RAO-BLACKWELL THEOREM

• Let X1, X2,…,Xn have joint pdf or pmf f(x1,x2,…,xn; θ) and let S=(S1,S2,…,Sk) be a vector of jss for θ. If T is an UE of τ(θ) and φ(T)=E(T|S), then

i) φ(T) is an UE of τ(θ).
ii) φ(T) is a fn of S and, because S is sufficient, it does not depend on θ, so it is a valid estimator.
iii) Var(φ(T)) ≤ Var(T) for all θ. φ(T) is a uniformly better unbiased estimator

of τ(θ).

Page 15: STATISTICAL INFERENCE PART II POINT ESTIMATION

RAO-BLACKWELL THEOREM

• Notes: φ(T)=E(T|S) is at least as good as T.

• For finding the best UE, it is enough to consider UEs that are functions of a ss, because all such estimators are at least as good as the rest of the UEs.
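
A minimal Python sketch (my illustration, not from the slides) of the Rao-Blackwell effect for Bernoulli data: T = X1 is a crude UE of p, and φ(T) = E(X1 | ΣXi) = X̄; the sample size, p, and seed below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(42)          # seed chosen arbitrarily
n, p, reps = 10, 0.3, 200_000            # sample size, true p, Monte Carlo reps

x = rng.binomial(1, p, size=(reps, n))   # reps independent Bernoulli(p) samples

t = x[:, 0].astype(float)                # crude unbiased estimator: first observation
phi = x.mean(axis=1)                     # E(T | sum X_i) = X-bar (Rao-Blackwellized)

print("Var(T)   ~", t.var(), "  theory:", p * (1 - p))       # ~0.21
print("Var(phi) ~", phi.var(), " theory:", p * (1 - p) / n)  # ~0.021
```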


Page 16: STATISTICAL INFERENCE PART II POINT ESTIMATION

Example

• Hogg & Craig, Exercise 10.10

• X1, X2 ~ Exp(θ)

• Find the joint p.d.f. of the ss Y1 = X1 + X2 for θ and Y2 = X2.

• Show that Y2 is an UE of θ with variance θ².

• Find φ(y1) = E(Y2|Y1 = y1) and the variance of φ(Y1).
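
A sketch of the computation, assuming the mean-θ parameterization f(x; θ) = (1/θ)e^(−x/θ), x > 0 (the one consistent with Var(Y2) = θ²):

```latex
% Transformation (x_1, x_2) -> (y_1, y_2) = (x_1 + x_2, x_2) has Jacobian 1:
\[
f_{Y_1,Y_2}(y_1,y_2) = \frac{1}{\theta^{2}}\,e^{-y_1/\theta},
\qquad 0 < y_2 < y_1 < \infty .
\]
% Given Y_1 = y_1, Y_2 is Uniform(0, y_1), so
\[
\varphi(y_1) = E(Y_2 \mid Y_1 = y_1) = \frac{y_1}{2},
\qquad
\mathrm{Var}\big(\varphi(Y_1)\big) = \frac{\mathrm{Var}(Y_1)}{4}
 = \frac{2\theta^{2}}{4} = \frac{\theta^{2}}{2} < \theta^{2} = \mathrm{Var}(Y_2),
\]
% using Y_1 ~ Gamma(2, theta).
```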


Page 17: STATISTICAL INFERENCE PART II POINT ESTIMATION


ANCILLARY STATISTIC

• A statistic S(X) whose distribution does not depend on the parameter θ is called an ancillary statistic.

• An ancillary statistic contains no information about θ.

Page 18: STATISTICAL INFERENCE PART II POINT ESTIMATION

Example

• Example 6.1.8 in Casella & Berger, page 257:

Let Xi~Unif(θ,θ+1) for i=1,2,…,n

Then, range R=X(n)-X(1) is an ancillary statistic because its pdf does not depend on θ.
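
A one-line justification (the standard location-family argument): write Xi = θ + Ui with Ui ~ Unif(0,1); then

```latex
\[
R = X_{(n)} - X_{(1)} = \big(\theta + U_{(n)}\big) - \big(\theta + U_{(1)}\big)
  = U_{(n)} - U_{(1)},
\]
% a function of the U_i alone, so the distribution of R is free of \theta.
```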


Page 19: STATISTICAL INFERENCE PART II POINT ESTIMATION


COMPLETENESS

• Let {f(x; θ), θ ∈ Ω} be a family of pdfs (or pmfs) and let U(x) be an arbitrary function of x not depending on θ. If

E[U(X)] = 0 for all θ ∈ Ω

requires that U(x) = 0 for all possible values of x, then we say that this family is a complete family of pdfs (or pmfs). In short:

E[U(X)] = 0 for all θ ∈ Ω implies U(x) = 0 for all x.

Page 20: STATISTICAL INFERENCE PART II POINT ESTIMATION


EXAMPLES

1. Show that the family {Bin(n=2, θ); 0<θ<1} is complete.
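
A sketch of the standard argument: E[U(X)] is a polynomial in θ on 0 < θ < 1, so all of its coefficients must vanish.

```latex
\[
E_{\theta}[U(X)] = U(0)(1-\theta)^{2} + 2\,U(1)\,\theta(1-\theta) + U(2)\,\theta^{2} = 0
\quad \text{for all } 0<\theta<1 .
\]
% Matching coefficients of 1, \theta, \theta^2 gives U(0) = U(1) = U(2) = 0,
% so the family {Bin(2, \theta)} is complete.
```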

Page 21: STATISTICAL INFERENCE PART II POINT ESTIMATION


EXAMPLES

2. X~Uniform(−θ, θ). Show that the family {f(x; θ), θ>0} is not complete.
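
A sketch, assuming the family intended is the symmetric uniform on (−θ, θ) (the usual counterexample consistent with θ > 0): take U(x) = x.

```latex
\[
E_{\theta}(X) = \int_{-\theta}^{\theta} \frac{x}{2\theta}\,dx = 0
\quad \text{for all } \theta > 0,
\]
% yet U(x) = x is not identically 0 on (-\theta, \theta), so the family is not complete.
```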

Page 22: STATISTICAL INFERENCE PART II POINT ESTIMATION


BASU THEOREM

• If T(X) is a complete and minimal sufficient statistic, then T(X) is independent of every ancillary statistic.

• Example: X~N(μ, σ²).

X̄ is the mss for μ. X̄ ~ N(μ, σ²/n), and the family of N(μ, σ²/n) distributions is a complete family, so X̄ is a complete statistic.

(n−1)S²/σ² ~ χ²(n−1), which does not depend on μ; S² is an ancillary statistic for μ.

By Basu theorem, X̄ and S² are independent.
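
A small Monte Carlo illustration (my sketch, not from the slides): across repeated normal samples, the sample correlation between X̄ and S² should be near zero; μ, σ, n, and the seed below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)             # arbitrary seed
mu, sigma, n, reps = 5.0, 2.0, 10, 100_000

x = rng.normal(mu, sigma, size=(reps, n))  # reps normal samples of size n
xbar = x.mean(axis=1)                      # complete (minimal) ss for mu
s2 = x.var(axis=1, ddof=1)                 # ancillary for mu

# Independence implies zero correlation; the Monte Carlo estimate should be ~0.
print(np.corrcoef(xbar, s2)[0, 1])
```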

Page 23: STATISTICAL INFERENCE PART II POINT ESTIMATION


COMPLETE AND SUFFICIENT STATISTICS (css)

• Y is a complete and sufficient statistic (css) for θ if Y is a ss for θ and the family

{g(y; θ); θ ∈ Ω}, where g(y; θ) is the pdf of Y,

is complete. That is:

1) Y is a ss for θ.

2) For an arbitrary function u(Y) of Y, E(u(Y)) = 0 for all θ implies that u(y) = 0 for all possible values Y = y.

Page 24: STATISTICAL INFERENCE PART II POINT ESTIMATION

THE MINIMUM VARIANCE UNBIASED ESTIMATOR

• Rao-Blackwell Theorem: If T is an unbiased estimator of θ, and S is a ss for θ, then φ(T)=E(T|S) is
– an UE of θ, i.e., E[φ(T)]=E[E(T|S)]=θ, and

– when S is also complete, the MVUE of θ.


Page 25: STATISTICAL INFERENCE PART II POINT ESTIMATION


LEHMANN-SCHEFFE THEOREM

• Let Y be a css for θ. If there is a function of Y which is an UE of θ, then this function is the unique Minimum Variance Unbiased Estimator (UMVUE) of θ.

• Y is a css for θ.

• T(Y)=fn(Y) and E[T(Y)]=θ.

Then T(Y) is the UMVUE of θ. So, it is the best estimator of θ.

Page 26: STATISTICAL INFERENCE PART II POINT ESTIMATION


THE MINIMUM VARIANCE UNBIASED ESTIMATOR

• Let Y be a css for θ. Since Y is complete, there can be only a unique function of Y which is an UE of θ.

• Let U1(Y) and U2(Y) be two functions of Y that are UEs. Then E(U1(Y)−U2(Y))=0 for all θ, which implies W(Y)=U1(Y)−U2(Y)=0 for all possible values of Y. Therefore, U1(Y)=U2(Y) for all Y.

Page 27: STATISTICAL INFERENCE PART II POINT ESTIMATION

Example

• Let X1, X2,…,Xn ~ Poi(μ). Find the UMVUE of μ.

• Solution steps:

– Show that S = X1 + X2 + … + Xn is a css for μ.

– Find a statistic (say S*) that is an UE of μ and a function of S.

– Then, S* is the UMVUE of μ by the Lehmann-Scheffe Thm.
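
A sketch of the standard solution: S = ΣXi ~ Poi(nμ) is a css for μ (the Poisson family is a complete exponential family), and S* = X̄ = S/n is an UE of μ that is a function of S.

```latex
\[
S=\sum_{i=1}^{n} X_i \sim \mathrm{Poi}(n\mu),
\qquad
E\Big(\frac{S}{n}\Big) = \frac{n\mu}{n} = \mu,
\]
% so by Lehmann-Scheffe, \bar{X} = S/n is the UMVUE of \mu.
```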

Page 28: STATISTICAL INFERENCE PART II POINT ESTIMATION

Note

• The estimator found by Rao-Blackwell Thm may not be unique. But, the estimator found by Lehmann-Scheffe Thm is unique.
