MULTIPLE REGRESSION ANALYSIS: INFERENCE

Huseyin Tastan

Yıldız Technical University, Department of Economics

These presentation notes are based on Introductory Econometrics: A Modern Approach (2nd ed.) by J. Wooldridge.

14 October 2012

Source: yildiz.edu.tr/~tastan/teaching/04 Multiple Regression Inference.pdf

Econometrics I: Multiple Regression: Inference - H. Tastan
Multiple Regression Analysis: Inference

In this class we will learn how to carry out hypothesis tests on population parameters.

Under the assumption that the population error term (u) is normally distributed (MLR.6), we will examine the sampling distributions of the OLS estimators.

First we will learn how to carry out hypothesis tests on a single population parameter.

Then, we will develop testing methods for multiple linear restrictions.

We will also learn how to decide whether a group of explanatory variables can be excluded from the model.
Sampling Distributions of OLS Estimators

To make statistical inference (hypothesis tests, confidence intervals), we need to know the sampling distributions of the β̂j in addition to their expected values and variances.

To do this we need to assume that the error term is normally distributed. Under the Gauss-Markov assumptions alone, the sampling distributions of the OLS estimators can have virtually any shape.

Assumption MLR.6 (Normality): the population error term u is independent of the explanatory variables and follows a normal distribution with mean 0 and variance σ²:

u ∼ N(0, σ²)

The normality assumption is stronger than the previous assumptions.

Assumption MLR.6 implies that MLR.3 (zero conditional mean) and MLR.5 (homoskedasticity) are also satisfied.
Sampling Distributions of OLS Estimators

Assumptions MLR.1 through MLR.6 are called the classical assumptions (the Gauss-Markov assumptions plus normality).

Under the classical assumptions, the OLS estimators β̂j are the best unbiased estimators not just among all linear estimators but among all estimators (including nonlinear ones).

The classical assumptions can be summarized as follows:

y|x ∼ N(β0 + β1x1 + β2x2 + … + βkxk, σ²)
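A quick way to see this sampling distribution in action is to simulate it. The sketch below is a minimal illustration (not from the slides; the model and variable names are my own): it draws many samples from a model satisfying the classical assumptions, re-estimates the slope by OLS each time, and checks that the estimates scatter around the true value with the theoretical standard deviation.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 200, 2000
beta0, beta1, sigma = 1.0, 0.5, 2.0

x = rng.uniform(0, 10, size=n)          # regressor held fixed across replications
X = np.column_stack([np.ones(n), x])

b1_hats = np.empty(reps)
for r in range(reps):
    u = rng.normal(0, sigma, size=n)    # MLR.6: u ~ N(0, sigma^2)
    y = beta0 + beta1 * x + u
    b_hat, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS fit
    b1_hats[r] = b_hat[1]

# Unbiasedness: the estimates should be centered on the true beta1 ...
print(b1_hats.mean())
# ... and their spread should match the theoretical sd of beta1_hat,
# sigma / sqrt(SST_x), from the simple-regression variance formula.
sd_theory = sigma / np.sqrt(((x - x.mean()) ** 2).sum())
print(b1_hats.std(), sd_theory)
```

Plotting a histogram of `b1_hats` against the N(β1, Var(β̂1)) density makes the normal shape visible directly.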
[Figure] Normality assumption in the simple regression model
How can we justify the normality assumption?

u is the sum of many different unobserved factors affecting y.

Therefore, we can invoke the Central Limit Theorem (CLT) to conclude that u has an approximate normal distribution.

The CLT argument assumes that the unobserved factors in u affect y in an additive fashion.

If u is a complicated function of the unobserved factors, then the CLT argument may not apply.

Normality is usually an empirical matter.

In some cases the normality assumption may be violated; for example, the distribution of wages may not be normal (only positive values, minimum wage laws, etc.). In practice, we assume that the conditional distribution is close to normal.

In some cases, transformations of variables (e.g., the natural log) may yield an approximately normal distribution.
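The additive-versus-non-additive point can be illustrated with a toy simulation (my own sketch, not from the slides): summing many independent, clearly non-normal factors produces an approximately normal u, while a non-additive combination of the same factors does not.

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(1)
n_factors, n_obs = 40, 5000

# Each unobserved factor is uniform on (-1, 1): non-normal on its own.
factors = rng.uniform(-1, 1, size=(n_obs, n_factors))

# Additive combination: the CLT applies, so u is approximately normal
# (symmetric, bell-shaped) even though no single factor is.
u = factors.sum(axis=1)
print(skew(u))      # near 0: roughly symmetric

# Non-additive combination: if the same factors entered multiplicatively,
# e.g. v = exp(sum of factors), the result is heavily right-skewed and the
# CLT justification for normality no longer applies.
v = np.exp(u)
print(skew(v))      # large positive: far from normal
```

This is also why taking logs can help: log(v) here is exactly the additive sum u, which is approximately normal.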
Sampling Distributions of OLS Estimators

Normal Sampling Distributions: under Assumptions MLR.1 through MLR.6, the OLS estimators follow a normal distribution (conditional on the xs):

β̂j ∼ N(βj, Var(β̂j))

Standardizing, we obtain:

(β̂j − βj) / sd(β̂j) ∼ N(0, 1)

Each OLS estimator can be written as a linear combination of the error terms. Recall that a linear combination of normally distributed random variables is also normally distributed.
Testing Hypotheses about a Single Population Parameter: The t Test

(β̂j − βj) / sd(β̂j) ∼ N(0, 1)

Replacing the standard deviation (sd) in the denominator by the standard error (se):

(β̂j − βj) / se(β̂j) ∼ t(n−k−1)

The t test is used in testing hypotheses about a single population parameter, as in H0: βj = βj*.
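Why does substituting se for sd change the reference distribution from N(0, 1) to t(n−k−1)? Because the estimated error variance adds extra randomness, which fattens the tails in small samples. A rough simulation sketch (my own illustration; a deliberately tiny n makes the effect visible):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, k, reps = 12, 1, 4000    # tiny sample: t(n-k-1) = t(10), visibly fatter-tailed than N(0,1)
beta0, beta1, sigma = 0.0, 1.0, 1.0

x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
XtX_inv = np.linalg.inv(X.T @ X)

t_stats = np.empty(reps)
for r in range(reps):
    y = beta0 + beta1 * x + rng.normal(0, sigma, size=n)
    b = XtX_inv @ X.T @ y                           # OLS via normal equations
    resid = y - X @ b
    sigma2_hat = resid @ resid / (n - k - 1)        # unbiased error-variance estimate
    se_b1 = np.sqrt(sigma2_hat * XtX_inv[1, 1])     # standard error of beta1_hat
    t_stats[r] = (b[1] - beta1) / se_b1             # centered at the TRUE beta1

# Mass beyond the t(10) two-sided 5% critical value (about 2.228) should be
# about 5% -- roughly double the normal tail mass 2*P(Z > 2.228) of about 2.6%.
tail = np.mean(np.abs(t_stats) > stats.t.ppf(0.975, df=n - k - 1))
print(tail)
```

Had we divided by the true sd instead of the se, the same histogram would match N(0, 1).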
The t Test

Testing Against One-Sided Alternatives (Right Tail)

H0: βj = 0
H1: βj > 0

The meaning of the null hypothesis: after controlling for the impacts of x1, x2, …, x(j−1), x(j+1), …, xk, xj has no effect on the expected value of y.

Test statistic:

t_β̂j = β̂j / se(β̂j) ∼ t(n−k−1)

Decision Rule: reject the null hypothesis if t_β̂j is larger than the critical value c of the t(n−k−1) distribution at the chosen significance level (100α%):

If t_β̂j > c, then REJECT H0; otherwise, fail to reject H0.
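The decision rule is mechanical once the critical value is available; scipy's t distribution supplies it. A small helper sketch (the function name is my own, not part of any library):

```python
from scipy import stats

def reject_right_tail(t_stat, n, k, alpha=0.05):
    """Right-tail t test of H0: beta_j = 0 against H1: beta_j > 0.

    Rejects H0 when the t statistic exceeds the upper-alpha critical value
    of the t distribution with n - k - 1 degrees of freedom.
    """
    c = stats.t.ppf(1 - alpha, df=n - k - 1)    # upper 100*alpha% critical value
    return t_stat > c

# Example: dof = 28 at the 5% level gives a critical value of about 1.701
print(stats.t.ppf(0.95, df=28))
print(reject_right_tail(2.1, n=30, k=1))    # 2.1 > 1.701 -> reject H0
print(reject_right_tail(1.5, n=30, k=1))    # 1.5 < 1.701 -> fail to reject
```

The same `stats.t.ppf` call with `1 - alpha/2` gives the two-sided critical value used later.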
[Figure] 5% decision rule for the right-tail test, dof = 28
The t Test

Testing Against One-Sided Alternatives (Left Tail)

H0: βj = 0
H1: βj < 0

The test statistic:

t_β̂j = β̂j / se(β̂j) ∼ t(n−k−1)

Decision Rule: if the calculated test statistic t_β̂j is smaller than the negative of the critical value at the chosen significance level, we reject the null hypothesis:

If t_β̂j < −c, then REJECT H0; otherwise, fail to reject H0.
[Figure] Decision rule for the left-tail test, dof = 18
The t Test

Testing Against Two-Sided Alternatives

H0: βj = 0
H1: βj ≠ 0

The test statistic:

t_β̂j = β̂j / se(β̂j) ∼ t(n−k−1)

Decision Rule: if the absolute value of the test statistic, |t_β̂j|, is larger than the critical value c = t(n−k−1, α/2), chosen so that each tail has probability α/2, we reject H0:

If |t_β̂j| > c, then REJECT H0; otherwise, fail to reject H0.
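Equivalently, the two-sided rule can be stated with a p-value: reject H0 when 2·P(T > |t|) < α. A short sketch (helper name is my own):

```python
from scipy import stats

def two_sided_p(t_stat, df):
    """Two-sided p-value for a t statistic: 2 * P(T_df > |t|)."""
    return 2 * stats.t.sf(abs(t_stat), df=df)

# dof = 25: the two-sided 5% critical value is t(25, 0.025), about 2.060,
# so a t statistic exactly at the critical value has a p-value of 0.05.
c = stats.t.ppf(0.975, df=25)
print(c)
print(two_sided_p(c, 25))
print(two_sided_p(1.2, 25) < 0.05)    # False: fail to reject at the 5% level
```

The critical-value rule and the p-value rule always agree; the p-value just reports the smallest α at which H0 would be rejected.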
[Figure] Decision rule for two-sided alternatives at the 5% significance level, dof = 25
The t Test: Examples

Log-level Wage Equation: wage1.gdt

log(wage) = 0.284 + 0.092 educ + 0.0041 exper + 0.022 tenure
           (0.104) (0.007)      (0.0017)       (0.003)

n = 526, R² = 0.316 (standard errors in parentheses)

Is exper statistically significant? Test H0: βexper = 0 against H1: βexper > 0.

The t statistic is t_β̂exper = 0.0041/0.0017 ≈ 2.41.

One-sided critical values: c0.05 = 1.645 at the 5% significance level and c0.01 = 2.326 at the 1% level (dof = 526 − 4 = 522, so standard normal values apply). Since t_β̂exper > 2.326, we reject H0: exper is statistically significant at the 1% level; βexper is statistically greater than zero at the 1% significance level.
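The test can be reproduced numerically; a sketch (not from the slides), assuming SciPy is installed and using the unrounded coefficient 0.0041:

```python
from scipy import stats

# One-sided test of H0: beta_exper = 0 vs H1: beta_exper > 0, wage1 estimates
beta_exper, se_exper = 0.0041, 0.0017
dof = 526 - 3 - 1                  # n - k - 1 = 522
t_stat = beta_exper / se_exper     # about 2.41
c01 = stats.t.ppf(0.99, dof)       # exact 1% one-sided critical value, about 2.33
print(t_stat > c01)                # True: reject H0 at the 1% level
```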
The t Test: Examples

Student Performance and School Size: meap93.gdt

math10 = 2.274 + 0.00046 totcomp + 0.048 staff − 0.0002 enroll
        (6.114) (0.0001)          (0.0398)      (0.00022)

n = 408, R² = 0.0541

math10: mathematics test results (a measure of student performance); totcomp: total compensation for teachers (a measure of teacher quality); staff: number of staff per 1000 students (a measure of how much attention students get); enroll: number of students (a measure of school size).

Test H0: βenroll = 0 against H1: βenroll < 0.
Calculated t statistic: t = −0.0002/0.00022 ≈ −0.91.
One-sided critical value at the 5% significance level: c0.05 = −1.645.
Since t > −1.645, we fail to reject H0: βenroll is statistically insignificant (not different from zero) at the 5% level.
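A left-tail version of the same computation (an illustrative sketch, SciPy assumed):

```python
from scipy import stats

# Left-tail test of H0: beta_enroll = 0 vs H1: beta_enroll < 0, meap93 estimates
t_stat = -0.0002 / 0.00022          # about -0.91
c = stats.t.ppf(0.05, 408 - 3 - 1)  # lower-tail 5% critical value, about -1.65
print(t_stat < c)                   # False: fail to reject H0
```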
The t Test: Examples

Student Performance and School Size: Level-Log Model

math10 = −207.67 + 21.16 log(totcomp) + 3.98 log(staff) − 1.27 log(enroll)
         (48.7)    (4.06)               (4.19)            (0.69)

n = 408, R² = 0.065

Test H0: βlog(enroll) = 0 against H1: βlog(enroll) < 0.
Calculated t statistic: t = −1.27/0.69 ≈ −1.84.
Critical value at the 5% level: c0.05 = −1.645.
Since t < −1.645, we reject H0 in favor of H1: βlog(enroll) is statistically significant (less than zero) at the 5% significance level.
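The same conclusion can also be reached through the p-value; a sketch for this left-tail test (not in the slides; SciPy assumed):

```python
from scipy import stats

# H0: beta_log(enroll) = 0 vs H1: beta_log(enroll) < 0, level-log estimates
t_stat = -1.27 / 0.69                        # about -1.84
p_value = stats.t.cdf(t_stat, 408 - 3 - 1)   # left-tail p-value, about 0.033
print(p_value < 0.05)                        # True: reject H0 at the 5% level
```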
The t Test: Examples

Determinants of College GPA: gpa1.gdt

colGPA = 1.389 + 0.412 hsGPA + 0.015 ACT − 0.083 skipped
        (0.331) (0.094)       (0.011)     (0.026)

n = 141, R² = 0.23

skipped: average number of lectures missed per week.

Which variables are statistically significant against a two-sided alternative? The two-sided critical value at the 5% significance level is c0.025 = 1.96; because dof = 141 − 4 = 137, we can use standard normal critical values.
t_hsGPA = 4.38: hsGPA is statistically significant.
t_ACT = 1.36: ACT is statistically insignificant.
t_skipped = −3.19: skipped is statistically significant even at the 1% level (c = 2.58).
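Checking all three slope coefficients at once can be sketched as follows (illustrative, SciPy assumed; the exact t critical value with 137 dof is about 1.98 rather than the normal 1.96):

```python
from scipy import stats

# Two-sided significance tests for the colGPA regression, gpa1 estimates
coefs = {"hsGPA": (0.412, 0.094), "ACT": (0.015, 0.011), "skipped": (-0.083, 0.026)}
c05 = stats.t.ppf(0.975, 141 - 3 - 1)   # 5% two-sided critical value
for name, (b, se) in coefs.items():
    t_stat = b / se
    verdict = "significant" if abs(t_stat) > c05 else "insignificant"
    print(name, round(t_stat, 2), verdict)
```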
Testing Other Hypotheses about βj

The t Test

The null hypothesis is H0: βj = aj.

The test statistic is

t = (β̂j − aj) / se(β̂j) ~ t_{n−k−1}

or, in words,

t = (estimate − hypothesized value) / standard error.

The t statistic measures how many estimated standard deviations β̂j is away from the hypothesized value.

Depending on the alternative hypothesis (left tail, right tail, two-sided), the decision rule is the same as before.
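The general rule can be collected into one small function; a minimal sketch (not from the slides; SciPy assumed):

```python
from scipy import stats

def t_test(estimate, hypothesized, se, dof, alternative="two-sided", alpha=0.05):
    """Test H0: beta_j = a_j against the chosen alternative.
    Returns (t statistic, reject H0?)."""
    t_stat = (estimate - hypothesized) / se
    if alternative == "two-sided":
        reject = abs(t_stat) > stats.t.ppf(1 - alpha / 2, dof)
    elif alternative == "greater":          # right tail: H1: beta_j > a_j
        reject = t_stat > stats.t.ppf(1 - alpha, dof)
    else:                                   # left tail: H1: beta_j < a_j
        reject = t_stat < stats.t.ppf(alpha, dof)
    return t_stat, reject
```

For instance, `t_test(1.27, 1, 0.11, 95, "greater")` carries out the campus-crime test of H0: β1 = 1 against H1: β1 > 1.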
Testing Other Hypotheses about βj: Example

Campus crime and university size: campus.gdt

crime = exp(β0) · enroll^β1 · exp(u)

Taking natural logs:

log(crime) = β0 + β1 log(enroll) + u

Data set: annual number of crimes and enrollment for 97 universities in the USA.

We want to test H0: β1 = 1 against H1: β1 > 1.
[Figure: Crime and enrollment — the curve crime = enroll^β1]
Testing Other Hypotheses about βj: Example

Campus crime and enrollment: campus.gdt

log(crime) = −6.63 + 1.27 log(enroll)
            (1.03)  (0.11)

n = 97, R² = 0.585

Test H0: β1 = 1 against H1: β1 > 1.

Calculated test statistic:

t = (1.27 − 1)/0.11 ≈ 2.45 ~ t95

The one-sided critical value at the 5% significance level is c = 1.66 (table value for dof = 120); thus we reject H0.

Can we say that we measured the ceteris paribus effect of university size? What other factors should we consider?
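Using the exact 95 degrees of freedom instead of the table row for dof = 120 gives the same verdict; an illustrative sketch (SciPy assumed):

```python
from scipy import stats

# H0: beta_1 = 1 vs H1: beta_1 > 1 for log(crime) on log(enroll)
t_stat = (1.27 - 1) / 0.11      # about 2.45
c = stats.t.ppf(0.95, 95)       # exact one-sided 5% critical value, about 1.66
print(t_stat > c)               # True: reject H0
```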
Testing Other Hypotheses about βj: Example
Housing Prices and Air Pollution: hprice2.gdt
Dependent variable: log of the median house price (log(price))

Explanatory variables:
log(nox): the amount of nitrogen oxide in the air in the community
log(dist): distance to employment centers
rooms: average number of rooms in houses in the community
stratio: average student–teacher ratio of schools in the community
Test: H0 : βlog(nox) = −1 against H1 : βlog(nox) ≠ −1
Estimated value: β̂log(nox) = −0.954, standard error = 0.117
Test statistic:
t = (−0.954 − (−1))/0.117 = (−0.954 + 1)/0.117 ≈ 0.39 ∼ t501 ≈ N(0, 1)
Two-sided critical value at the 5% significance level is c = 1.96. Thus, we fail to reject H0.
Econometrics I: Multiple Regression: Inference - H. Tastan 23
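The two-sided version differs only in using |t| and the 1 − α/2 quantile. A sketch with the slide's numbers (−0.954, 0.117, df = 501):

```python
from scipy import stats

beta_hat, se, null_value, df = -0.954, 0.117, -1.0, 501

t_stat = (beta_hat - null_value) / se        # about 0.39
c = stats.t.ppf(0.975, df)                   # two-sided 5% critical value, about 1.96

reject = abs(t_stat) > c                     # False: fail to reject H0
p_value = 2 * stats.t.sf(abs(t_stat), df)    # two-sided p-value P(|T| > |t|)
```

With 501 degrees of freedom the t distribution is essentially standard normal, which is why the slide's c = 1.96 matches.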
Computing p-values for t-tests
Instead of choosing a significance level (e.g. 1%, 5%, 10%), we can compute the smallest significance level at which the null hypothesis would be rejected.

This is called the p-value.

Standard regression software reports p-values for H0 : βj = 0 against the two-sided alternative.

In this case, the p-value is the probability of drawing a number from the t distribution that is larger in absolute value than the calculated t-statistic:

P(|T| > |t|)

The smaller the p-value, the greater the evidence against the null hypothesis.
Econometrics I: Multiple Regression: Inference - H. Tastan 24
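The quantity P(|T| > |t|) can be computed directly from the t distribution's survival function; a minimal helper:

```python
from scipy import stats

def two_sided_p_value(t_stat, df):
    """p-value for H0: beta_j = 0 against a two-sided alternative:
    P(|T| > |t|) where T ~ t_{df} with df = n - k - 1."""
    return 2 * stats.t.sf(abs(t_stat), df)

# e.g. t = 2.45 with 95 dof: significant at 5% but not at 1%
p = two_sided_p_value(2.45, 95)
```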
p-value: Example

(Figure: regression output with reported p-values; image not recoverable from the source.)
Large Standard Errors and Small t Statistics
As the sample size n gets bigger, the standard errors of the β̂j become smaller.

Therefore, as n becomes larger it is more appropriate to use small significance levels (such as 1%).

One reason for large standard errors in practice is high collinearity among the explanatory variables (multicollinearity).

If explanatory variables are highly correlated, it may be difficult to determine the partial effect of each variable.

In this case the best we can do is to collect more data.
Econometrics I: Multiple Regression: Inference - H. Tastan 26
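The link between collinearity and large standard errors can be made concrete with the variance inflation factor, which is not on the slide but follows from the standard Var(β̂j) formula: with two regressors whose sample correlation is r, VIF = 1/(1 − r²), so the standard error is inflated by √VIF relative to the uncorrelated case. The correlations below are illustrative numbers only:

```python
def vif_two_regressors(r):
    """Variance inflation factor for a slope coefficient in a model with
    two explanatory variables whose sample correlation is r."""
    return 1.0 / (1.0 - r**2)

# modest vs. severe collinearity (illustrative values, not from the slides)
low = vif_two_regressors(0.30)    # close to 1: little inflation
high = vif_two_regressors(0.95)   # variance inflated roughly tenfold
```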
Guidelines for Economic and Statistical Significance
Check for statistical significance: if significant, discuss the practical and economic significance using the magnitude of the coefficient.

If a variable is not statistically significant at the usual levels (1%, 5%, 10%), you may still discuss its economic significance, and its statistical significance, using p-values.

Coefficients with small t-statistics and wrong signs can be ignored in practice: they are statistically insignificant.

A significant variable that has the unexpected sign and a practically large effect is much more difficult to interpret. This may indicate a problem with the model specification and/or the data.
Econometrics I: Multiple Regression: Inference - H. Tastan 27
Confidence Intervals
We know that:
tβ̂j = (β̂j − βj) / se(β̂j) ∼ tn−k−1

Using this ratio we can construct the 100(1 − α)% confidence interval:

β̂j ± c · se(β̂j)

Lower and upper bounds of the confidence interval are:

β̲j ≡ β̂j − c · se(β̂j),  β̄j ≡ β̂j + c · se(β̂j)
Econometrics I: Multiple Regression: Inference - H. Tastan 28
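The interval β̂j ± c · se(β̂j) is easy to compute once c is taken from the t distribution; applied here to the nox elasticity from the housing example (−0.954, se 0.117, dof 501):

```python
from scipy import stats

def conf_interval(beta_hat, se, df, alpha=0.05):
    """100(1 - alpha)% confidence interval beta_hat ± c · se,
    where c is the 1 - alpha/2 quantile of t_{df}."""
    c = stats.t.ppf(1 - alpha / 2, df)
    return beta_hat - c * se, beta_hat + c * se

lo, hi = conf_interval(-0.954, 0.117, 501)   # interval contains -1
```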
Confidence Intervals
[β̂j − c · se(β̂j), β̂j + c · se(β̂j)]

How do we interpret confidence intervals?

If random samples were obtained over and over again and a confidence interval were computed for each sample, then the unknown population value βj would lie inside the confidence interval for 100(1 − α)% of the samples.

For example, with α = 0.05 we would say that 95 out of 100 confidence intervals would contain the true value. Note that α/2 = 0.025 in this case.

In practice, we only have one sample and thus only one confidence interval estimate. We do not know whether the estimated confidence interval contains the true value.
Econometrics I: Multiple Regression: Inference - H. Tastan 29
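The repeated-sampling interpretation can be checked by simulation. The sketch below estimates a population mean rather than a regression slope, purely to keep it short; the sample size, seed, and parameter values are arbitrary choices, not from the slides:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n, reps, alpha, true_mean = 30, 2000, 0.05, 5.0
c = stats.t.ppf(1 - alpha / 2, n - 1)

covered = 0
for _ in range(reps):
    x = rng.normal(true_mean, 2.0, n)            # one "random sample"
    m, se = x.mean(), x.std(ddof=1) / np.sqrt(n)
    if m - c * se <= true_mean <= m + c * se:    # does the CI contain the truth?
        covered += 1

coverage = covered / reps                        # should be close to 0.95
```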
Confidence Intervals
We need three quantities to calculate a confidence interval: the coefficient estimate, its standard error, and a critical value.

For example, with dof = 25 and a 95% confidence level, the confidence interval for a population parameter is

[β̂j − 2.06 · se(β̂j), β̂j + 2.06 · se(β̂j)]

If n − k − 1 > 50, the 95% confidence interval can easily be calculated as β̂j ± 2 · se(β̂j).

Suppose we want to test the following hypothesis:

H0 : βj = aj
H1 : βj ≠ aj

We reject H0 at the 5% significance level in favor of H1 if and only if the 95% confidence interval does not contain aj.
Econometrics I: Multiple Regression: Inference - H. Tastan 30
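The duality between the two-sided t test and the confidence interval can be sketched directly, again using the nox numbers from the housing example: −1 lies inside the 95% interval (H0: β = −1 not rejected), while 0 lies outside it (H0: β = 0 rejected).

```python
from scipy import stats

def reject_via_interval(beta_hat, se, df, a_j, alpha=0.05):
    """Reject H0: beta_j = a_j at level alpha iff a_j lies outside
    the 100(1 - alpha)% confidence interval."""
    c = stats.t.ppf(1 - alpha / 2, df)
    return not (beta_hat - c * se <= a_j <= beta_hat + c * se)

r_minus_one = reject_via_interval(-0.954, 0.117, 501, -1.0)  # not rejected
r_zero = reject_via_interval(-0.954, 0.117, 501, 0.0)        # rejected
```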
Example: Hedonic Price Model for Houses
A hedonic price model relates a product's price to its characteristics.

For example, in a hedonic price model for computers, the price is regressed on physical characteristics such as CPU power, RAM size, notebook/desktop, etc.

Similarly, the value of a house is determined by several characteristics: size, number of rooms, distance to employment centers, schools and parks, crime rate in the community, etc.

Dependent variable: log(price)

Explanatory variables: sqrft: square footage (size; 1 square foot = 0.09290304 m², so 100 m² ≈ 1076 sq ft); bdrms: number of bedrooms; bthrms: number of bathrooms.
Econometrics I: Multiple Regression: Inference - H. Tastan 31
![Page 108: MULTIPLE REGRESSION ANALYSIS: INFERENCEyildiz.edu.tr/~tastan/teaching/04 Multiple Regression Inference.pdf · MULTIPLE REGRESSION ANALYSIS: INFERENCE Huseyin Ta˘stan1 1Y ld z Technical](https://reader031.vdocuments.site/reader031/viewer/2022021821/5af5d4067f8b9a74448eb484/html5/thumbnails/108.jpg)
Example: Hedonic Price Model for Houses
A hedonic price model relates the price to the product’scharacteristics.
For example, in a hedonic price model for computers the priceof computers is regressed on the physical characteristics suchas CPU power, RAM size, notebook/desktop, etc.
Similarly, the value of a house is determined by severalcharacteristics: size, number of rooms, distance toemployment centers, schools and parks, crime rate in thecommunity, etc.
Dependent variable: log(price)
Explanatory variables: sqrft (square footage, size) (1 squarefoot = 0.09290304 m2, 100m2 ≈ 1076 ftsq; bdrms: numberof rooms, bthrms: number of bathrooms.
Example: Hedonic Price Model for Houses
Estimation Results
log(price) = 7.46 + 0.634 log(sqrft) − 0.066 bdrms + 0.158 bthrms
             (1.15) (0.184)            (0.059)       (0.075)

n = 19, R² = 0.806
Both price and sqrft are in logs, so the coefficient estimate is an elasticity: holding bdrms and bthrms fixed, if the size of the house increases by 1%, the value of the house is predicted to increase by 0.634%.
df = n − k − 1 = 19 − 3 − 1 = 15; the critical value for the t15 distribution at α = 0.05 is c = 2.131. Thus, the 95% confidence interval is
0.634 ± 2.131 · 0.184 ⇒ [0.242, 1.026]
Because this interval does not contain zero, we reject the null hypothesis that the population parameter is zero, i.e., that size has no effect.
The coefficient estimate on the number of bedrooms is negative. Why?
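The interval calculation above is simple enough to check directly; a minimal Python sketch using the slide's numbers (the t15 critical value 2.131 is taken from standard tables) is:

```python
# 95% confidence interval for the log(sqrft) elasticity.
beta_hat = 0.634   # coefficient on log(sqrft), from the estimation results
se = 0.184         # its standard error
c = 2.131          # t_{15} critical value at alpha = 0.05 (two-sided)

lower = beta_hat - c * se
upper = beta_hat + c * se
print(f"95% CI: [{lower:.3f}, {upper:.3f}]")  # [0.242, 1.026]
```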
The 95% confidence interval for βbdrms is −0.066 ± 2.131 · 0.059 = [−0.192, 0.060]. This interval contains 0; thus, the effect of bdrms is statistically insignificant.
Interpretation of the coefficient estimate on bthrms: ceteris paribus, if the number of bathrooms increases by 1, house prices are predicted to increase by approximately 100 · (0.158)% = 15.8% on average.
The 95% confidence interval is [−0.002, 0.318]. This interval barely contains zero: the lower confidence limit is very close to zero. It is better to compute the p-value.
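As a sketch of how that p-value could be computed without tables, the two-sided p-value for the bthrms coefficient (t = 0.158/0.075 with 15 df) can be obtained by numerically integrating the Student-t density; only the regression numbers come from the slide, the integration scheme is illustrative:

```python
import math

def t_pdf(x, df):
    # Student-t density with df degrees of freedom
    c = math.gamma((df + 1) / 2) / (math.sqrt(df * math.pi) * math.gamma(df / 2))
    return c * (1 + x * x / df) ** (-(df + 1) / 2)

def t_sf(t, df, upper=60.0, n=20000):
    # One-sided tail probability P(T > t) by Simpson's rule on [t, upper]
    h = (upper - t) / n
    s = t_pdf(t, df) + t_pdf(upper, df)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * t_pdf(t + i * h, df)
    return s * h / 3

t_stat = 0.158 / 0.075            # about 2.107
p_two_sided = 2 * t_sf(t_stat, 15)
print(round(t_stat, 3), round(p_two_sided, 3))  # p is just above 0.05
```

The result (roughly 0.05) matches the confidence-interval finding: the effect of bthrms is borderline significant at the 5% level.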
Testing Hypotheses about a Single Linear Combination
Is one year at a junior college (2-year higher education) worth one year at a university (4-year)?

log(wage) = β0 + β1 jc + β2 univ + β3 exper + u

jc: number of years attending a junior college; univ: number of years at a 4-year college; exper: experience (years)
Null hypothesis:
H0 : β1 = β2 ⇔ β1 − β2 = 0
Alternative hypothesis:

H1 : β1 < β2 ⇔ β1 − β2 < 0
Testing Hypotheses about a Single Linear Combination
Since the null hypothesis involves a single linear combination of the parameters, we can use a t test:

t = (β̂1 − β̂2) / se(β̂1 − β̂2)

The standard error is given by:

se(β̂1 − β̂2) = √Var(β̂1 − β̂2)

where

Var(β̂1 − β̂2) = Var(β̂1) + Var(β̂2) − 2 Cov(β̂1, β̂2)
To compute this we need to know the covariance between the OLS estimates.
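To make the formula concrete, here is a small sketch using the slide's standard errors for β̂1 and β̂2 (0.031 and 0.035) together with a hypothetical covariance, chosen so the result is consistent with the se(β̂1 − β̂2) = 0.018 reported for this example; the covariance value itself is not from the slides:

```python
# Estimated variances from the reported standard errors (se^2)
var_b1 = 0.031 ** 2        # Var(beta1_hat) = 0.000961
var_b2 = 0.035 ** 2        # Var(beta2_hat) = 0.001225
cov_b1_b2 = 0.000931       # HYPOTHETICAL Cov(beta1_hat, beta2_hat)

var_diff = var_b1 + var_b2 - 2 * cov_b1_b2
se_diff = var_diff ** 0.5  # se(beta1_hat - beta2_hat) = 0.018

theta_hat = 0.098 - 0.124  # beta1_hat - beta2_hat = -0.026
t_stat = theta_hat / se_diff
print(round(se_diff, 3), round(t_stat, 2))  # 0.018, -1.44
```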
Testing Hypotheses about a Single Linear Combination
An alternative method to compute se(β̂1 − β̂2) is to estimate a re-arranged regression.

Let θ = β1 − β2. Now the null and alternative hypotheses are:

H0 : θ = 0,   H1 : θ < 0

Substituting β1 = θ + β2 into the model, we obtain:

y = β0 + (θ + β2)x1 + β2x2 + β3x3 + u
  = β0 + θx1 + β2(x1 + x2) + β3x3 + u
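The algebra above implies that θ̂ from the re-arranged regression equals β̂1 − β̂2 from the original regression exactly. A sketch on synthetic data confirms this (the OLS solver is hand-rolled only for self-containment; all numbers are made up):

```python
import random

def ols(X, y):
    # Solve the normal equations (X'X) b = X'y by Gaussian elimination
    n, k = len(y), len(X[0])
    A = [[sum(X[i][p] * X[i][q] for i in range(n)) for q in range(k)]
         for p in range(k)]
    v = [sum(X[i][p] * y[i] for i in range(n)) for p in range(k)]
    for p in range(k):                 # forward elimination, partial pivoting
        piv = max(range(p, k), key=lambda r: abs(A[r][p]))
        A[p], A[piv] = A[piv], A[p]
        v[p], v[piv] = v[piv], v[p]
        for r in range(p + 1, k):
            f = A[r][p] / A[p][p]
            for q in range(p, k):
                A[r][q] -= f * A[p][q]
            v[r] -= f * v[p]
    coef = [0.0] * k
    for p in range(k - 1, -1, -1):     # back substitution
        coef[p] = (v[p] - sum(A[p][q] * coef[q] for q in range(p + 1, k))) / A[p][p]
    return coef

random.seed(1)
n = 200
x1 = [random.gauss(2, 1) for _ in range(n)]
x2 = [random.gauss(4, 1) for _ in range(n)]
x3 = [random.gauss(10, 2) for _ in range(n)]
y = [1.0 + 0.10 * x1[i] + 0.12 * x2[i] + 0.02 * x3[i] + random.gauss(0, 0.3)
     for i in range(n)]

beta_orig = ols([[1.0, x1[i], x2[i], x3[i]] for i in range(n)], y)
beta_re = ols([[1.0, x1[i], x1[i] + x2[i], x3[i]] for i in range(n)], y)

# Coefficient on x1 in the re-arranged model is theta_hat = b1_hat - b2_hat
theta_hat = beta_re[1]
```

The coefficient (and its standard error) on x1 in the re-arranged regression is then read directly off the standard OLS output.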
Example: twoyear.gdt
Estimation Results
log(wage) = 1.43 + 0.098 jc + 0.124 univ + 0.019 exper
            (0.27) (0.031)    (0.035)      (0.008)

n = 285, R² = 0.243
Computing the se from the re-arranged regression:

log(wage) = 1.43 − 0.026 jc + 0.124 totcoll + 0.019 exper
            (0.27) (0.018)    (0.035)         (0.008)

n = 285, R² = 0.243
Note: totcoll = jc + univ, so se(θ̂) = se(β̂1 − β̂2) = 0.018.
t statistic: t = −0.026/0.018 = −1.44, p-value = 0.075.
There is some, but not strong, evidence against H0: the point estimate suggests the return to an additional year at a 4-year college is larger than the return to an additional year at a 2-year college, but the difference is not significant at the 5% level (it is at the 10% level).
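Since n − k − 1 = 285 − 3 − 1 = 281 here, the t distribution is nearly standard normal, so the one-sided p-value can be approximated with the normal CDF; a minimal sketch:

```python
import math

t_stat = -0.026 / 0.018            # about -1.44
# With 281 df, t is close to N(0,1): p = P(Z < t) via the error function
p_one_sided = 0.5 * (1 + math.erf(t_stat / math.sqrt(2)))
print(round(t_stat, 2), round(p_one_sided, 3))  # about -1.44 and 0.074
```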
Testing Multiple Linear Restrictions: the F Test
The t statistic can be used to test whether an unknown population parameter is equal to a given constant.
It can also be used to test a single linear combination of population parameters, as we just saw.
In practice, we often want to test multiple hypotheses about the population parameters jointly.
We will use the F test for this purpose.
Exclusion Restrictions
We want to test whether a group of variables has no effect on the dependent variable.
For example, in the following model

y = β0 + β1x1 + β2x2 + β3x3 + β4x4 + β5x5 + u

we want to test

H0 : β3 = 0, β4 = 0, β5 = 0
H1 : H0 is not true

The null hypothesis states that x3, x4 and x5 together have no effect on y after controlling for x1 and x2.
H0 places 3 exclusion restrictions on the model.
The alternative holds if at least one of β3, β4 or β5 is different from zero.
Exclusion Restrictions
Unrestricted model:

y = β0 + β1x1 + β2x2 + β3x3 + β4x4 + β5x5 + u,   with SSRur and R²ur

Restricted model:

y = β0 + β1x1 + β2x2 + u,   with SSRr and R²r

The restricted model is obtained by imposing H0.
We can estimate both models separately and compare their SSRs using the F statistic.
Testing Multiple Linear Restrictions
The F-test statistic:

F = [(SSRr − SSRur)/q] / [SSRur/(n − k − 1)] ∼ F(q, n − k − 1)

SSRr: restricted model's SSR; SSRur: unrestricted model's SSR.
q = dfr − dfur: the number of restrictions, which is also the numerator degrees of freedom.
The denominator degrees of freedom, dfur = n − k − 1, come from the unrestricted model.
Decision rule: if F > c, reject H0. The critical value c is obtained from the F(q, n − k − 1) distribution at the 100α% significance level.
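A sketch of the decision rule with hypothetical SSR values (not from the slides); the 5% critical value for F(3, 60) is roughly 2.76 from standard tables:

```python
# Hypothetical example: q = 3 exclusion restrictions, df_ur = n - k - 1 = 60
ssr_r = 200.0    # restricted model SSR (made-up number)
ssr_ur = 180.0   # unrestricted model SSR (made-up number)
q = 3
df_ur = 60

F = ((ssr_r - ssr_ur) / q) / (ssr_ur / df_ur)
print(round(F, 3))  # 2.222
# F = 2.222 < 2.76 (the approximate 5% critical value of F(3,60)),
# so H0 would not be rejected at the 5% level in this example.
```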
Econometrics I: Multiple Regression: Inference - H. Tastan 41
Figure: 5% rejection region for the F(3, 60) distribution.
The F Test

The F test for exclusion restrictions is useful when the variables in the group are highly correlated.

For example, suppose we want to test whether firm performance affects the salaries of CEOs. Since there are several measures of firm performance, using all of these variables in the model may lead to a multicollinearity problem.

In this case individual t tests may not be helpful: the standard errors will be inflated by the multicollinearity.

But the F test can be used to determine whether the firm performance variables, as a group, affect salary.
Relationship between t and F Statistics

Conducting an F test on a single parameter gives the same result as the corresponding two-sided t test.

For the two-sided test of H0: βj = 0, the F statistic has q = 1 numerator degree of freedom and the relationship t² = F holds. For two-sided alternatives:

t²_{n−k−1} ~ F(1, n − k − 1)

But for testing hypotheses about a single parameter, the t test is easier and more flexible, and it also allows one-sided alternatives.
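The identity can be checked numerically; a sketch assuming SciPy is available, using the bwght degrees of freedom: the squared two-sided t critical value equals the F(1, df) critical value at the same significance level.

```python
from scipy.stats import t, f

df = 1185       # denominator degrees of freedom (n - k - 1)
alpha = 0.05

t_crit = t.ppf(1 - alpha / 2, df)   # two-sided t critical value
f_crit = f.ppf(1 - alpha, 1, df)    # F(1, df) critical value

# t^2 = F: the two critical values agree up to numerical precision.
print(abs(t_crit**2 - f_crit) < 1e-6)
```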
R² Form of the F Statistic

The F test statistic can be written in terms of the R²s from the restricted and unrestricted models instead of the SSRs.

Recall that

SSR_r = SST(1 − R²_r),  SSR_ur = SST(1 − R²_ur)

Substituting into the F statistic and rearranging, we obtain:

F = [(R²_ur − R²_r)/q] / [(1 − R²_ur)/(n − k − 1)]

R²_ur: coefficient of determination from the unrestricted model; R²_r: coefficient of determination from the restricted model. Note that R²_ur ≥ R²_r.
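A minimal sketch of the R² form, again using the birth-weight R² values from later slides as assumed inputs:

```python
def f_stat_r2(r2_ur, r2_r, q, n, k):
    """F statistic for q linear restrictions, R-squared form.

    r2_ur / r2_r: R-squared of the unrestricted and restricted
    models; the two forms are algebraically equivalent because
    SSR = SST * (1 - R^2).
    """
    return ((r2_ur - r2_r) / q) / ((1 - r2_ur) / (n - k - 1))

# Birth-weight example: R2_ur = 0.038748, R2_r = 0.036416,
# q = 2, n = 1191, k = 5 (so n - k - 1 = 1185).
F = f_stat_r2(0.038748, 0.036416, q=2, n=1191, k=5)
print(round(F, 2))  # about 1.44, agreeing with the SSR form
```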
F Test: Example

Parents' education in a birth-weight model (data: bwght.gdt):

y = β0 + β1 x1 + β2 x2 + β3 x3 + β4 x4 + β5 x5 + u

Dependent variable: y = birth weight of newborn babies, in ounces.

Explanatory variables:

x1: average number of cigarettes the mother smoked per day during pregnancy;
x2: the birth order of this child;
x3: annual family income;
x4: years of schooling for the mother;
x5: years of schooling for the father.

We want to test H0: β4 = 0, β5 = 0 — parents' education has no effect on birth weight, ceteris paribus.
Unrestricted Model: bwght.gdt

Model 1: OLS, using observations 1–1388 (n = 1191)
Missing or incomplete observations dropped: 197
Dependent variable: bwght

| Variable | Coefficient | Std. Error | t-ratio | p-value |
|----------|-------------|------------|---------|---------|
| const    | 114.524     | 3.72845    | 30.7163 | 0.0000  |
| cigs     | −0.595936   | 0.110348   | −5.4005 | 0.0000  |
| parity   | 1.78760     | 0.659406   | 2.7109  | 0.0068  |
| faminc   | 0.0560414   | 0.0365616  | 1.5328  | 0.1256  |
| motheduc | −0.370450   | 0.319855   | −1.1582 | 0.2470  |
| fatheduc | 0.472394    | 0.282643   | 1.6713  | 0.0949  |

Mean dependent var: 119.5298; S.D. dependent var: 20.14124
Sum squared resid (SSR_ur): 464041.1; S.E. of regression: 19.78878
R²_ur = 0.038748; Adjusted R² = 0.034692
Restricted Model: bwght.gdt

Model 2: OLS, using observations 1–1191
Dependent variable: bwght

| Variable | Coefficient | Std. Error | t-ratio | p-value |
|----------|-------------|------------|---------|---------|
| const    | 115.470     | 1.65590    | 69.7325 | 0.0000  |
| cigs     | −0.597852   | 0.108770   | −5.4965 | 0.0000  |
| parity   | 1.83227     | 0.657540   | 2.7866  | 0.0054  |
| faminc   | 0.0670618   | 0.0323938  | 2.0702  | 0.0386  |

Mean dependent var: 119.5298; S.D. dependent var: 20.14124
Sum squared resid (SSR_r): 465166.8; S.E. of regression: 19.79607
R²_r = 0.036416; Adjusted R² = 0.033981
Example: continued

F statistic in SSR form:

F = [(SSR_r − SSR_ur)/q] / [SSR_ur/(n − k − 1)] = [(465167 − 464041)/2] / [464041/(1191 − 5 − 1)] = 1.4377

F statistic in R² form:

F = [(R²_ur − R²_r)/q] / [(1 − R²_ur)/(n − k − 1)] = [(0.0387 − 0.0364)/2] / [(1 − 0.0387)/1185] = 1.4376

Critical values from the F(2, 1185) distribution: c ≈ 3.00 at the 5% level, c ≈ 2.30 at the 10% level.

Decision: since F < c, we fail to reject H0 at these significance levels. motheduc and fatheduc are jointly insignificant: there is no evidence that parents' education affects birth weight, ceteris paribus.
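The critical values and the p-value of this test can be recovered from the F distribution; a sketch assuming SciPy is available:

```python
from scipy.stats import f

q, df_denom = 2, 1185   # numerator and denominator degrees of freedom
F = 1.4377              # F statistic from the birth-weight example

c05 = f.ppf(0.95, q, df_denom)   # 5% critical value, about 3.00
c10 = f.ppf(0.90, q, df_denom)   # 10% critical value, about 2.30
p_value = f.sf(F, q, df_denom)   # upper-tail p-value, about 0.24

print(round(c05, 2), round(c10, 2))
print(F < c10)  # True: fail to reject even at the 10% level
```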
Overall Significance of a Regression

We want to test the following hypothesis:

H0: β1 = β2 = … = βk = 0

Under the null, none of the explanatory variables has an effect on y; in other words, they are jointly insignificant. The alternative hypothesis states that at least one of them is different from zero.

According to the null, the model has no explanatory power; under H0 it reduces to

y = β0 + u

This hypothesis can be tested using the F statistic.
![Page 185: MULTIPLE REGRESSION ANALYSIS: INFERENCEyildiz.edu.tr/~tastan/teaching/04 Multiple Regression Inference.pdf · MULTIPLE REGRESSION ANALYSIS: INFERENCE Huseyin Ta˘stan1 1Y ld z Technical](https://reader031.vdocuments.site/reader031/viewer/2022021821/5af5d4067f8b9a74448eb484/html5/thumbnails/185.jpg)
Overall Significance of a Regression

The F test statistic is

F = (R²/k) / [(1 − R²)/(n − k − 1)] ∼ F(k, n − k − 1)

The R² is just the usual coefficient of determination from the unrestricted model.

Standard econometrics software packages routinely compute and report this statistic.

In the previous example:

F(5, 1185) = 9.5535, p-value < 0.00001

The p-value is very small: if we reject H0, the probability of a Type I error is very small. Thus, the null is rejected very strongly.

There is strong evidence against the null hypothesis that the variables are jointly insignificant. The regression is overall significant.
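The overall-significance F statistic is a direct function of R², k, and n, so it can be computed by hand. A minimal Python sketch (the R², k, and n values below are illustrative, not taken from the example above):

```python
def overall_f(r2, k, n):
    """Overall-significance F statistic:
    F = (R^2 / k) / ((1 - R^2) / (n - k - 1))."""
    return (r2 / k) / ((1.0 - r2) / (n - k - 1))

# Illustrative values: R^2 = 0.5, k = 4 regressors, n = 105 observations
f = overall_f(0.5, 4, 105)
print(round(f, 2))  # 25.0
```

Comparing this statistic against the F(k, n − k − 1) critical value, or computing its p-value, is what a regression package does automatically in its "F statistic" line.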
Testing General Linear Restrictions

Example: Rationality of housing valuations: hprice1.gdt

y = β0 + β1x1 + β2x2 + β3x3 + β4x4 + u

Dependent variable: y = log(price)

Explanatory variables:

x1: log(assess), the assessed housing value (before the house was sold)
x2: log(lotsize), size of the lot, in square feet
x3: log(sqrft), size of the house, in square feet
x4: bdrms, number of bedrooms

We are interested in testing H0 : β1 = 1, β2 = 0, β3 = 0, β4 = 0.

The null hypothesis states that the additional characteristics do not explain house prices once we control for the assessed valuation.
Example: Rationality of housing valuations

Unrestricted model:

y = β0 + β1x1 + β2x2 + β3x3 + β4x4 + u

Restricted model under H0 : β1 = 1, β2 = 0, β3 = 0, β4 = 0:

y = β0 + x1 + u

The restricted model can be estimated by regressing y − x1 on a constant:

y − x1 = β0 + u

The steps of the F test are the same.
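Regressing z = y − x1 on a constant alone fits the sample mean of z, so the restricted SSR is simply the sum of squared deviations of y − x1 from its mean. A small sketch with made-up data (the y and x1 values are purely illustrative):

```python
def restricted_ssr(y, x1):
    """SSR from regressing z = y - x1 on a constant only.
    OLS on a constant fits zbar, so SSR = sum((z_i - zbar)^2)."""
    z = [yi - xi for yi, xi in zip(y, x1)]
    zbar = sum(z) / len(z)
    return sum((zi - zbar) ** 2 for zi in z)

# Made-up log-price and log-assessment values, for illustration only
y  = [5.0, 5.2, 4.8, 5.1]
x1 = [5.0, 5.0, 5.0, 5.0]
print(restricted_ssr(y, x1))
```

This SSR plays the role of SSR_r in the restricted/unrestricted F statistic on the next slide.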
Example: Rationality of housing valuations

Test statistic:

F = [(1.880 − 1.822)/1.822] × (83/4) = 0.661

The critical value at the 5% significance level for the F(4, 83) distribution: c = 2.5.

We fail to reject H0.

There is no evidence against the null hypothesis that the housing valuations are rational.
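The test statistic above can be reproduced numerically from the reported sums of squared residuals, with SSR_r = 1.880 (restricted), SSR_ur = 1.822 (unrestricted), q = 4 restrictions, and n − k − 1 = 83 degrees of freedom:

```python
def f_stat(ssr_r, ssr_ur, q, df_ur):
    """Restricted/unrestricted F statistic:
    F = ((SSR_r - SSR_ur) / q) / (SSR_ur / df_ur),
    where df_ur = n - k - 1 for the unrestricted model."""
    return ((ssr_r - ssr_ur) / q) / (ssr_ur / df_ur)

f = f_stat(1.880, 1.822, 4, 83)
print(round(f, 3))  # 0.661
```

Since 0.661 is well below the 5% critical value of about 2.5, the computation confirms the failure to reject H0.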
Reporting Regression Results

The estimated OLS coefficients should always be reported. The key coefficient estimates should be interpreted taking into account the functional forms and units of measurement.

Individual t statistics and the F statistic for the overall significance of the regression should also be reported.

Standard errors for the coefficient estimates can be given along with the estimates. This allows us to conduct t tests for hypothesized values other than zero and to compute confidence intervals.

R² and n should always be reported. One may also consider reporting the SSR and the standard error of the regression (σ̂).