
Page 1:

Lecture 16 – Thurs, Oct. 30

• Inference for Regression (Sections 7.3-7.4):
  – Hypothesis Tests and Confidence Intervals for Intercept and Slope
  – Confidence Intervals for mean response
  – Prediction Intervals

• Next time: Robustness of least squares inferences, graphical tools for model assessment (8.1-8.3)

Page 2:

Regression

• Goal of regression: Estimate the mean response $\mu\{Y|X\}$ for subpopulations $X = x$.

• Example: Y = neuron activity index, X = years playing stringed instrument

• Simple linear regression model: $\mu\{Y|X\} = \beta_0 + \beta_1 X$

• Estimate $\beta_0$ and $\beta_1$ by least squares – choose $\hat{\beta}_0, \hat{\beta}_1$ to minimize the sum of squared residuals (prediction errors); a numerical sketch follows below.
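A minimal Python sketch of the least squares computation. The x (years playing) and y (activity index) values here are made up for illustration, not the study's actual data:

```python
import numpy as np

# Hypothetical illustrative data (not the actual study data):
# x = years playing a stringed instrument, y = neuron activity index.
x = np.array([1.0, 3.0, 5.0, 8.0, 12.0, 15.0, 20.0])
y = np.array([9.2, 11.5, 13.1, 16.8, 20.4, 23.9, 28.1])

x_bar, y_bar = x.mean(), y.mean()

# Least squares estimates: the slope and intercept that minimize the
# sum of squared residuals for the model mu{Y|X} = beta0 + beta1 * X.
beta1_hat = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
beta0_hat = y_bar - beta1_hat * x_bar

print(beta0_hat, beta1_hat)
```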

Page 3:

Ideal Model

• Assumptions of the ideal simple linear regression model:
  – There is a normally distributed subpopulation of responses for each value of the explanatory variable.
  – The means of the subpopulations fall on a straight-line function of the explanatory variable.
  – The subpopulation standard deviations are all equal (to $\sigma$).
  – The selection of an observation from any of the subpopulations is independent of the selection of any other observation.

Page 4:

The standard deviation $\sigma$

• $\sigma$ is the standard deviation in each subpopulation.

• $\sigma$ measures the accuracy of predictions from the regression.

• If the simple linear regression model holds, then approximately
  – 68% of the observations will fall within $\sigma$ of the regression line
  – 95% of the observations will fall within $2\sigma$ of the regression line

Page 5:

Estimating $\sigma$

• Residuals provide the basis for an estimate of $\sigma$:

  $\hat{\sigma} = \sqrt{\dfrac{\text{sum of all squared residuals}}{\text{degrees of freedom}}}$

• Degrees of freedom for simple linear regression = n – 2

• If the simple linear regression model holds, then approximately
  – 68% of the observations will fall within $\hat{\sigma}$ of the least squares line
  – 95% of the observations will fall within $2\hat{\sigma}$ of the least squares line
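A short sketch of this estimate, continuing with the same hypothetical data as the earlier example (np.polyfit with degree 1 returns the fitted slope and intercept):

```python
import numpy as np

# Hypothetical illustrative data (same as the earlier sketch).
x = np.array([1.0, 3.0, 5.0, 8.0, 12.0, 15.0, 20.0])
y = np.array([9.2, 11.5, 13.1, 16.8, 20.4, 23.9, 28.1])
n = len(x)

# Least squares fit (np.polyfit with degree 1 returns slope, intercept).
beta1_hat, beta0_hat = np.polyfit(x, y, 1)
residuals = y - (beta0_hat + beta1_hat * x)

# sigma_hat = sqrt(sum of squared residuals / degrees of freedom), df = n - 2.
sigma_hat = np.sqrt(np.sum(residuals ** 2) / (n - 2))
print(sigma_hat)
```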

Page 6:

Inference for Simple Linear Regression

• Inference is based on the ideal simple linear regression model holding.

• Inference is based on taking repeated random samples ($y_1, \ldots, y_n$) from the same subpopulations ($x_1, \ldots, x_n$) as in the observed data.

• Types of inference:
  – Hypothesis tests for intercept and slope
  – Confidence intervals for intercept and slope
  – Confidence interval for mean of Y at $X = X_0$
  – Prediction interval for future Y for which $X = X_0$

Page 7:

Hypothesis tests for $\beta_0$ and $\beta_1$

• Hypothesis test of $H_0: \beta_1 = 0$ vs. $H_a: \beta_1 \neq 0$
  – Based on the t-test statistic
    $t = \dfrac{|\text{Estimate}|}{SE(\text{Estimate})} = \dfrac{|\hat{\beta}_1|}{SE(\hat{\beta}_1)}$
  – The p-value has the usual interpretation: the probability under the null hypothesis that |t| would be at least as large as its observed value; a small p-value is evidence against the null hypothesis.

• The hypothesis test for $H_0: \beta_0 = 0$ vs. $H_a: \beta_0 \neq 0$ is based on an analogous test statistic.

• Test statistics and p-values can be found in the JMP output under Parameter Estimates, obtained by using Fit Line after Fit Y by X. A sketch of the slope test computed by hand follows below.
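A sketch of the slope test on the same hypothetical data as above; it computes the t statistic and a two-sided p-value on n – 2 degrees of freedom:

```python
import numpy as np
from scipy import stats

# Hypothetical illustrative data (not the actual study data).
x = np.array([1.0, 3.0, 5.0, 8.0, 12.0, 15.0, 20.0])
y = np.array([9.2, 11.5, 13.1, 16.8, 20.4, 23.9, 28.1])
n = len(x)

beta1_hat, beta0_hat = np.polyfit(x, y, 1)
residuals = y - (beta0_hat + beta1_hat * x)
sigma_hat = np.sqrt(np.sum(residuals ** 2) / (n - 2))

# Standard error of the slope estimate.
se_beta1 = sigma_hat / np.sqrt(np.sum((x - x.mean()) ** 2))

# t statistic for H0: beta1 = 0 and its two-sided p-value on n - 2 df.
t_stat = beta1_hat / se_beta1
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)
print(t_stat, p_value)
```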

Page 8:

JMP output for example

Bivariate Fit of Neuron activity index By Years playing

[Scatterplot of Neuron activity index (0–30) versus Years playing (0–20) with the fitted line]

Linear Fit
Neuron activity index = 7.9715909 + 1.0268308 Years playing

Summary of Fit
RSquare                       0.866986
RSquare Adj                   0.855902
Root Mean Square Error        3.025101
Mean of Response              15.89286
Observations (or Sum Wgts)    14

Parameter Estimates
Term             Estimate     Std Error   t Ratio   Prob>|t|
Intercept        7.9715909    1.206598    6.61      <.0001
Years playing    1.0268308    0.116105    8.84      <.0001

Page 9:

Confidence Intervals for $\beta_0$ and $\beta_1$

• Confidence intervals provide a range of plausible values for $\beta_0$ and $\beta_1$.

• 95% confidence intervals:
  $\hat{\beta}_0 \pm t_{n-2}(.975) \, SE(\hat{\beta}_0)$
  $\hat{\beta}_1 \pm t_{n-2}(.975) \, SE(\hat{\beta}_1)$

• Finding CIs in JMP: $\hat{\beta}_0$, $\hat{\beta}_1$, $SE(\hat{\beta}_0)$, $SE(\hat{\beta}_1)$ can be found under Parameter Estimates after fitting a line. $t_{n-2}(.975)$ can be found in Table A.2.

• For the brain activity study, the 95% CIs are
  $\beta_0$: $7.972 \pm 2.179 \times 1.207 = (5.34, 10.60)$
  $\beta_1$: $1.027 \pm 2.179 \times 0.116 = (0.77, 1.28)$
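A short sketch that reproduces these intervals from the JMP parameter estimates above (n = 14, so 12 degrees of freedom); only scipy's t quantile is used:

```python
from scipy import stats

# Reproduce the slide's 95% confidence intervals from the JMP output:
# estimates and standard errors for the brain activity study, n = 14.
n = 14
t_crit = stats.t.ppf(0.975, df=n - 2)   # t_{12}(.975), about 2.179

for term, est, se in [("Intercept", 7.9716, 1.2066),
                      ("Years playing", 1.0268, 0.1161)]:
    lo, hi = est - t_crit * se, est + t_crit * se
    print(f"{term}: ({lo:.2f}, {hi:.2f})")
```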

Page 10:

Confidence Intervals for Mean of Y at $X = X_0$

• What is a plausible range of values for $\mu\{Y|X_0\}$?

• 95% CI for $\mu\{Y|X_0\}$:
  $\hat{\mu}\{Y|X_0\} \pm t_{n-2}(.975) \, SE[\hat{\mu}\{Y|X_0\}]$
  where $\hat{\mu}\{Y|X_0\} = \hat{\beta}_0 + \hat{\beta}_1 X_0$ and
  $SE[\hat{\mu}\{Y|X_0\}] = \hat{\sigma} \sqrt{\dfrac{1}{n} + \dfrac{(X_0 - \bar{X})^2}{(n-1)s_X^2}}$

• Note about the formula:
  – Precision in estimating $\mu\{Y|X\}$ is not constant for all values of X. Precision decreases as $X_0$ gets farther away from the sample average of the X's.

• JMP implementation: Use the Confid Curves Fit command under the red triangle next to Linear Fit after using Fit Y by X, Fit Line. Use the crosshair tool to find the exact values of the confidence interval endpoints for a given $X_0$.
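A sketch of this interval computed by hand at an illustrative $X_0$, again using the hypothetical data from the earlier examples:

```python
import numpy as np
from scipy import stats

# Hypothetical illustrative data (not the actual study data).
x = np.array([1.0, 3.0, 5.0, 8.0, 12.0, 15.0, 20.0])
y = np.array([9.2, 11.5, 13.1, 16.8, 20.4, 23.9, 28.1])
n = len(x)
x0 = 10.0                                # illustrative value of X0

beta1_hat, beta0_hat = np.polyfit(x, y, 1)
sigma_hat = np.sqrt(np.sum((y - (beta0_hat + beta1_hat * x)) ** 2) / (n - 2))

# SE[mu_hat{Y|X0}] = sigma_hat * sqrt(1/n + (x0 - x_bar)^2 / ((n-1) s_x^2)).
se_mean = sigma_hat * np.sqrt(1 / n + (x0 - x.mean()) ** 2
                              / ((n - 1) * np.var(x, ddof=1)))

mu_hat = beta0_hat + beta1_hat * x0
t_crit = stats.t.ppf(0.975, df=n - 2)
print(mu_hat - t_crit * se_mean, mu_hat + t_crit * se_mean)
```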

Page 11:

Prediction Intervals

• What are likely values for a future value $Y_0$ at some specified value of X ($= X_0$)?

• The best single prediction of a future response at $X_0$ is the estimated mean response:
  $\text{Pred}\{Y|X_0\} = \hat{\mu}\{Y|X_0\} = \hat{\beta}_0 + \hat{\beta}_1 X_0$

• A prediction interval is an interval of likely values along with a measure of the likelihood that the interval will contain the response.

• 95% prediction interval at $X_0$: If repeated samples $(Y_1, \ldots, Y_n)$ are obtained from the subpopulations $(x_1, \ldots, x_n)$ and a prediction interval is formed, the prediction interval will contain the value of $Y_0$ for a future observation from the subpopulation $X_0$ 95% of the time.

Page 12:

Prediction Intervals Cont.

• A prediction interval must account for two sources of uncertainty:
  – Uncertainty about the location of the subpopulation mean $\mu\{Y|X_0\}$
  – Uncertainty about where the future value will be in relation to its mean

• Prediction Error = Random Sampling Error + Estimation Error:
  $Y_0 - \text{Pred}\{Y|X_0\} = Y_0 - \hat{\mu}\{Y|X_0\} = [Y_0 - \mu\{Y|X_0\}] + [\mu\{Y|X_0\} - \hat{\mu}\{Y|X_0\}]$

Page 13:

Prediction Interval Formula

• 95% prediction interval at $X_0$:
  $\hat{\mu}\{Y|X_0\} \pm t_{n-2}(.975) \sqrt{\hat{\sigma}^2 + SE[\hat{\mu}\{Y|X_0\}]^2}$

• Compare to the 95% CI for the mean at $X_0$:
  $\hat{\mu}\{Y|X_0\} \pm t_{n-2}(.975) \, SE[\hat{\mu}\{Y|X_0\}]$
  – The prediction interval is wider due to the random sampling error in the future response.
  – As the sample size n becomes large, the margin of error of the CI for the mean goes to zero, but the margin of error of the PI doesn't.

• JMP implementation: Use the Confid Curves Indiv command under the red triangle next to Linear Fit after using Fit Y by X, Fit Line. Use the crosshair tool to find the exact values of the prediction interval endpoints for a given $X_0$.
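A sketch of the prediction interval at the same illustrative $X_0$; it widens the CI for the mean by adding $\hat{\sigma}^2$ under the square root (hypothetical data as before):

```python
import numpy as np
from scipy import stats

# Hypothetical illustrative data (not the actual study data).
x = np.array([1.0, 3.0, 5.0, 8.0, 12.0, 15.0, 20.0])
y = np.array([9.2, 11.5, 13.1, 16.8, 20.4, 23.9, 28.1])
n = len(x)
x0 = 10.0

beta1_hat, beta0_hat = np.polyfit(x, y, 1)
sigma_hat = np.sqrt(np.sum((y - (beta0_hat + beta1_hat * x)) ** 2) / (n - 2))

se_mean = sigma_hat * np.sqrt(1 / n + (x0 - x.mean()) ** 2
                              / ((n - 1) * np.var(x, ddof=1)))
se_pred = np.sqrt(sigma_hat ** 2 + se_mean ** 2)   # always wider than se_mean

mu_hat = beta0_hat + beta1_hat * x0
t_crit = stats.t.ppf(0.975, df=n - 2)
print(mu_hat - t_crit * se_pred, mu_hat + t_crit * se_pred)
```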

Page 14:

Example

• A building maintenance company is planning to submit a bid on a contract to clean 40 corporate offices scattered throughout an office complex. The costs incurred by the maintenance company are proportional to the number of crews needed for this task. Currently the company has 11 crews. Will 11 crews be enough?

• Recent data are available for the number of rooms that were cleaned by varying numbers of crews. The data are in cleaning.jmp.

• Assuming a simple linear regression model holds, which is more relevant for answering the question of interest – a confidence interval for the mean number of rooms cleaned by 11 crews or a prediction interval for the number of rooms cleaned on a particular day by 11 crews? (A sketch comparing the two intervals follows below.)
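A sketch of how both intervals could be computed side by side with statsmodels so they can be compared. The cleaning.jmp data are not reproduced here, so the crews and rooms values below are made-up placeholders, not the actual study data:

```python
import numpy as np
import statsmodels.api as sm

# Made-up placeholder data (the real values live in cleaning.jmp).
crews = np.array([2, 4, 4, 6, 6, 8, 8, 10, 10, 12, 12, 16], dtype=float)
rooms = np.array([9, 16, 14, 24, 21, 30, 33, 37, 41, 47, 44, 61], dtype=float)

fit = sm.OLS(rooms, sm.add_constant(crews)).fit()

# Both intervals at X0 = 11 crews; the design row [1, 11] is built explicitly.
X0 = np.array([[1.0, 11.0]])
frame = fit.get_prediction(X0).summary_frame(alpha=0.05)

print(frame[["mean", "mean_ci_lower", "mean_ci_upper"]])   # CI for the mean
print(frame[["obs_ci_lower", "obs_ci_upper"]])             # PI for a single day
```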

Page 15:

Correlation

• Section 7.5.4

• Correlation is a measure of the degree of linear association between two variables X and Y. For each unit in the population, both X and Y are measured.

• Population correlation:
  $\rho = \dfrac{\text{Mean}\{(X - \mu_X)(Y - \mu_Y)\}}{\sigma_X \sigma_Y}$

• Correlation is between –1 and 1. A correlation of 0 indicates no linear association. Correlations near +1 indicate strong positive linear association; correlations near –1 indicate strong negative linear association.
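A sketch of the sample analogue, computed two ways on the hypothetical data used earlier: from the definition as an average of standardized cross-products, and via numpy:

```python
import numpy as np

# Hypothetical illustrative data (not from the lecture's studies).
x = np.array([1.0, 3.0, 5.0, 8.0, 12.0, 15.0, 20.0])
y = np.array([9.2, 11.5, 13.1, 16.8, 20.4, 23.9, 28.1])
n = len(x)

# Sample correlation from the definition: average product of standardized values.
zx = (x - x.mean()) / x.std(ddof=1)
zy = (y - y.mean()) / y.std(ddof=1)
r_by_hand = np.sum(zx * zy) / (n - 1)

r_numpy = np.corrcoef(x, y)[0, 1]
print(r_by_hand, r_numpy)   # equal; both lie between -1 and 1
```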

Page 16:

Correlation and Regression

• Features of correlation
  – Dimension-free. Units of X and Y don't matter.
  – Symmetric in X and Y. There is no "response" and "explanatory" variable.

• Correlation only measures the degree of linear association. It is possible for there to be an exact nonlinear relationship between X and Y and yet the sample correlation coefficient is zero.

• Correlation in JMP: Click Multivariate and put the variables in Y, Columns.

• Connection to regression: the test of the slope $H_0: \beta_1 = 0$ vs. $H_a: \beta_1 \neq 0$ is identical to the test of $H_0: \rho = 0$ vs. $H_a: \rho \neq 0$ (see the sketch below). The test of the correlation coefficient only makes sense if the pairs (X, Y) are randomly sampled from a population.
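A small sketch checking this equivalence numerically on the hypothetical data; scipy's pearsonr carries out the correlation test:

```python
import numpy as np
from scipy import stats

# Hypothetical illustrative data.
x = np.array([1.0, 3.0, 5.0, 8.0, 12.0, 15.0, 20.0])
y = np.array([9.2, 11.5, 13.1, 16.8, 20.4, 23.9, 28.1])
n = len(x)

# t statistic for the slope test H0: beta1 = 0.
b1, b0 = np.polyfit(x, y, 1)
sigma_hat = np.sqrt(np.sum((y - (b0 + b1 * x)) ** 2) / (n - 2))
t_slope = b1 / (sigma_hat / np.sqrt(np.sum((x - x.mean()) ** 2)))

# t statistic for the correlation test H0: rho = 0 (same value), and its p-value.
r, p_corr = stats.pearsonr(x, y)
t_corr = r * np.sqrt((n - 2) / (1 - r ** 2))

print(t_slope, t_corr, p_corr)   # t_slope and t_corr agree up to rounding
```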

Page 17:

Correlation in JMP