Wooldridge Econometrics Lecture Notes 7
To perform our test we first need to form the t statistic for bˆj:

t(bˆj) ≡ bˆj / se(bˆj)

We will then use our t statistic along with a rejection rule to determine whether or not to reject the null hypothesis, H0

Note this is a t distribution (vs. normal) because we have to estimate s2 with sˆ2

Note the degrees of freedom: n – k – 1
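As an aside that is not part of the lecture itself, the sketch below shows how this t statistic can be formed by hand, assuming simulated data and numpy; all variable names and the data-generating process are made up for illustration.

```python
import numpy as np

# Simulated data satisfying the CLM assumptions (illustrative values only).
rng = np.random.default_rng(0)
n, k = 200, 2
x = rng.normal(size=(n, k))
u = rng.normal(scale=1.0, size=n)              # u ~ Normal(0, s2), independent of the x's
y = 1.0 + 0.5 * x[:, 0] + 0.0 * x[:, 1] + u    # true b1 = 0.5, b2 = 0

# OLS by hand: b_hat = (X'X)^{-1} X'y
X = np.column_stack([np.ones(n), x])
XtX_inv = np.linalg.inv(X.T @ X)
b_hat = XtX_inv @ (X.T @ y)

# Estimate s2 with the residual variance and form se(b_hat_j)
resid = y - X @ b_hat
df = n - k - 1                                 # degrees of freedom: n - k - 1
s2_hat = resid @ resid / df
se = np.sqrt(s2_hat * np.diag(XtX_inv))

j = 1                                          # coefficient on the first regressor
t_stat = b_hat[j] / se[j]                      # t statistic for H0: b_j = 0
print(df, round(t_stat, 3))
```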
The t Test (cont)
• Knowing the sampling distribution for the standardized estimator allows us to carry out hypothesis tests
t Test: One-Sided Alternatives
• Besides our null, H0, we need an alternative hypothesis, H1, and a significance level
• H1 may be one-sided, or two-sided
• We can summarize the population assumptions of CLM as follows
• y|x ~ Normal(b0 + b1x1 +…+ bkxk, s2)
• While for now we just assume normality, it is clear that this is sometimes not the case
• Assume that u is independent of x1, x2,…, xk and u is normally distributed with zero
mean and variance s2: u ~ Normal(0,s2)
CLM Assumptions (cont)
• Under CLM, OLS is not only BLUE, but is the minimum variance unbiased estimator
• We can reject the null hypothesis if the t statistic is greater than the critical value
• If the t statistic is less than the critical value then we fail to reject the null
Assumptions of the Classical Linear Model (CLM)
• So far, we know that given the Gauss-Markov assumptions, OLS is BLUE
• In order to do classical hypothesis testing, we need to add another assumption (beyond the Gauss-Markov assumptions)
(bˆj – bj) / sd(bˆj) ~ Normal(0, 1)

bˆj is distributed normally because it is a linear combination of the errors
The t Test
Under the CLM assumptions,

(bˆj – bj) / se(bˆj) ~ t(n – k – 1)
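To connect this to standard software output, the sketch below uses the statsmodels package (an assumption about tooling; the lecture does not prescribe any software) to show that the t values it reports are exactly bˆj / se(bˆj), with n – k – 1 residual degrees of freedom.

```python
import numpy as np
import statsmodels.api as sm

# Illustrative only: simulated data with two regressors plus an intercept.
rng = np.random.default_rng(1)
n, k = 200, 2
x = rng.normal(size=(n, k))
y = 1.0 + 0.5 * x[:, 0] + rng.normal(size=n)

X = sm.add_constant(x)          # adds the intercept column
res = sm.OLS(y, X).fit()

print(res.df_resid)             # n - k - 1 = 197
print(res.params / res.bse)     # equals res.tvalues: t = b_hat_j / se(b_hat_j)
print(res.tvalues)
```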
One-Sided Alternatives (cont)
yi = b0 + b1xi1 + … + bkxik + ui
One-Sided Alternatives (cont)
• Having picked a significance level, a, we look up the (1 – a)th percentile in a t distribution with n – k – 1 df and call this c, the critical value
• H1: bj > 0 and H1: bj < 0 are one-sided alternatives
• H1: bj ≠ 0 is a two-sided alternative
• If we want to have only a 5% probability of rejecting H0 if it is really true, then we say our significance level is 5%
• Start with a null hypothesis
• For example, H0: bj=0
• If we fail to reject the null, we conclude that xj has no effect on y, controlling for the other x's
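Putting the significance level, critical value, and rejection rule together, the sketch below (illustrative numbers only, assuming scipy is available) looks up c as the (1 – a)th percentile of a t distribution with n – k – 1 df and applies the one-sided rule for H1: bj > 0.

```python
from scipy import stats

t_stat = 2.10     # hypothetical t statistic for b_hat_j
df = 197          # hypothetical n - k - 1

alpha = 0.05                        # 5% significance level
c = stats.t.ppf(1 - alpha, df)      # critical value: (1 - alpha)th percentile of t(df)

# One-sided rejection rule for H1: b_j > 0
if t_stat > c:
    print(f"t = {t_stat:.2f} > c = {c:.2f}: reject H0 at the 5% level")
else:
    print(f"t = {t_stat:.2f} <= c = {c:.2f}: fail to reject H0")
```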
The t Test (cont)
• Large samples will let us drop the normality assumption
[Figure: The homoskedastic normal distribution with a single explanatory variable — identical normal densities f(y|x) centered on the line E(y|x) = b0 + b1x, shown at two values x1 and x2]
Normal Sampling Distributions
Under the CLM assumptions, conditional on the sample values of the independent variables
bˆj ~ Normal(bj, Var(bˆj)), so that (bˆj – bj) / sd(bˆj) ~ Normal(0, 1)
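As a final illustration (a simulation written for these notes, not part of the original slides), the sketch below holds the regressor values fixed, redraws normal errors many times, and checks that bˆj is centered at bj with variance close to Var(bˆj).

```python
import numpy as np

# Monte Carlo sketch: hold the x's fixed, redraw u ~ Normal(0, s2) many times,
# and look at the sampling distribution of b_hat_1. All numbers are illustrative.
rng = np.random.default_rng(2)
n, reps = 100, 5000
b0, b1, sigma = 1.0, 0.5, 1.0

x = rng.normal(size=n)                      # regressor values, held fixed across draws
X = np.column_stack([np.ones(n), x])
XtX_inv = np.linalg.inv(X.T @ X)

b1_hats = np.empty(reps)
for r in range(reps):
    u = rng.normal(scale=sigma, size=n)     # errors drawn fresh each replication
    y = b0 + b1 * x + u
    b1_hats[r] = (XtX_inv @ X.T @ y)[1]     # OLS slope estimate for this sample

# Mean should be close to b1; variance close to sigma^2 * [(X'X)^-1]_{11}
print(b1_hats.mean(), b1_hats.var(), sigma**2 * XtX_inv[1, 1])
```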
"the"t statistic for bˆj :tbˆ j bˆ j se bˆ j
We will then use our t statistic along with a rejection rule to determine whether to accept the null hypothesis, H0
Note this is a t distributi on (vs normal)
because we have to estimate s 2by sˆ 2
Note the degrees of freedom : n k 1
The t Test (cont)
• Knowing the sampling distribution for the standardized estimator allows us to carry out hypothesis tests
t Test: One-Sided Alternatives
• Besides our null, H0, wBaidu Nhomakorabea need an alternative hypothesis, H1, and a significance level
• H1 may be one-sided, or two-sided
• We can summarize the population assumptions of CLM as follows
• y|x ~ Normal(b0 + b1x1 +…+ bkxk, s2)
• While for now we just assume normality, clear that sometimes not the case
• Assume that u is independent of x1, x2,…, xk and u is normally distributed with zero
mean and variance s2: u ~ Normal(0,s2)
CLM Assumptions (cont)
• Under CLM, OLS is not only BLUE, but is the minimum variance unbiased estimator
• We can reject the null hypothesis if the t statistic is greater than the critical value
• If the t statistic is less than the critical value then we fail to reject the null
Assumptions of the Classical Linear Model (CLM)
• So far, we know that given the GaussMarkov assumptions, OLS is BLUE,
• In order to do classical hypothesis testing, we need to add another assumption (beyond the Gauss-Markov assumptions)
bˆ j b j sd bˆ j ~ Normal0,1
bˆj is distributed normally because it
is a linear combination of the errors
The t Test
Under the CLM assumption s
bˆ j b j se bˆ j ~ tnk1
One-Sided Alternatives (cont)
yi = b0 + b1xi1 + … + bkxik + ui
One-Sided Alternatives (cont)
• Having picked a significance level, a, we look up the (1 – a)th percentile in a t distribution with n – k – 1 df and call this c, the critical value
• H1: bj > 0 and H1: bj < 0 are one-sided • H1: bj 0 is a two-sided alternative
• If we want to have only a 5% probability of rejecting H0 if it is really true, then we say our significance level is 5%
• Start with a null hypothesis
• For example, H0: bj=0
• If accept null, then accept that xj has no effect on y, controlling for other x’s
The t Test (cont)
• Large samples will let us drop normality
The homoskedastic normal distribution with a single explanatory variable
y
f(y|x)
.
.
E(y|x) = b0 + b1x
Normal distributions
x1
x2
Normal Sampling Distributions
Under the CLM assumptions, conditional on the sample values of the independent variables
bˆ j ~ Normal b j ,Var bˆ j , so that