ch03




Omitted Variable Bias (cont)

Consider the regression of x2 on x1: x̃2 = δ̃0 + δ̃1x1. Then δ̃1 = Σ(xi1 − x̄1)xi2 / Σ(xi1 − x̄1)², so E(b̃1) = b1 + b2δ̃1
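
As a quick numerical check (not part of the original slides), the NumPy sketch below simulates one data set with made-up coefficients and verifies the in-sample counterpart of this result: the simple-regression slope equals b̂1 + b̂2δ̃1 exactly. All names and numbers are illustrative.

    import numpy as np

    # Simulated example: the short-regression slope equals b1_hat + b2_hat * delta1_tilde.
    rng = np.random.default_rng(0)
    n = 1000
    x1 = rng.normal(size=n)
    x2 = 0.5 * x1 + rng.normal(size=n)           # x2 is correlated with x1
    y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

    def ols(X, y):
        """OLS coefficients with an intercept prepended."""
        X = np.column_stack([np.ones(len(y)), X])
        return np.linalg.lstsq(X, y, rcond=None)[0]

    b_hat = ols(np.column_stack([x1, x2]), y)    # multiple regression of y on x1, x2
    b_tilde = ols(x1, y)                         # simple regression of y on x1 only
    delta_tilde = ols(x1, x2)                    # auxiliary regression of x2 on x1

    print(b_tilde[1])                            # b1_tilde
    print(b_hat[1] + b_hat[2] * delta_tilde[1])  # the same number, up to rounding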

More about R-squared

R2 can never decrease when another independent variable is added to a regression, and usually will increase

Because R2 will usually increase with the number of independent variables, it is not a good way to compare models
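
A minimal illustration (assumed setup, not from the slides): fitting the same y with and without a pure-noise regressor shows that R2 cannot fall when the extra variable is added. Data and seed are arbitrary.

    import numpy as np

    # R-squared with x1 alone versus x1 plus an irrelevant noise regressor.
    rng = np.random.default_rng(1)
    n = 200
    x1 = rng.normal(size=n)
    y = 1.0 + 2.0 * x1 + rng.normal(size=n)
    noise = rng.normal(size=n)                   # has nothing to do with y

    def r_squared(X, y):
        X = np.column_stack([np.ones(len(y)), X])
        coef = np.linalg.lstsq(X, y, rcond=None)[0]
        resid = y - X @ coef
        return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

    print(r_squared(x1, y))                              # R2 with x1 only
    print(r_squared(np.column_stack([x1, noise]), y))    # never smaller than the line above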

Goodness-of-Fit

We can think of each observation as being made up of an explained part and an unexplained part, yi = ŷi + ûi. We then define the following:

Σ(yi − ȳ)² is the total sum of squares (SST)
Σ(ŷi − ȳ)² is the explained sum of squares (SSE)
Σûi² is the residual sum of squares (SSR)

Then SST = SSE + SSR

Goodness-of-Fit (continued)

How do we think about how well our sample regression line fits our sample data?

Can compute the fraction of the total sum of squares (SST) that is explained by the model; call this the R-squared of the regression: R2 = SSE/SST = 1 – SSR/SST
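
The sketch below (made-up data) computes SST, SSE, and SSR for one OLS fit and confirms that SST = SSE + SSR and that both expressions for R2 agree; it also checks the squared-correlation interpretation that a later slide mentions.

    import numpy as np

    # Decompose the variation in y for a single OLS fit.
    rng = np.random.default_rng(2)
    n = 200
    x1, x2 = rng.normal(size=n), rng.normal(size=n)
    y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)

    X = np.column_stack([np.ones(n), x1, x2])
    coef = np.linalg.lstsq(X, y, rcond=None)[0]
    y_hat = X @ coef
    u_hat = y - y_hat

    SST = np.sum((y - y.mean()) ** 2)
    SSE = np.sum((y_hat - y.mean()) ** 2)    # with an intercept, mean(y_hat) = mean(y)
    SSR = np.sum(u_hat ** 2)

    print(SST, SSE + SSR)                    # equal up to rounding
    print(SSE / SST, 1 - SSR / SST)          # two equivalent ways to compute R2
    print(np.corrcoef(y, y_hat)[0, 1] ** 2)  # the squared correlation of y and y_hat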

Interpreting Multiple Regression

ŷ = b̂0 + b̂1x1 + b̂2x2 + … + b̂kxk, so Δŷ = b̂1Δx1 + b̂2Δx2 + … + b̂kΔxk

So holding x2, …, xk fixed implies that Δŷ = b̂1Δx1, that is, each b̂ has a ceteris paribus interpretation
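
A small check of the ceteris paribus reading (illustrative names and numbers only): moving x1 by some Δ while holding x2 fixed changes the fitted value by exactly b̂1Δ.

    import numpy as np

    # Change only x1 at a fixed x2 and compare the change in fitted value to b1_hat * delta.
    rng = np.random.default_rng(3)
    n = 200
    x1, x2 = rng.normal(size=n), rng.normal(size=n)
    y = 1.0 + 2.0 * x1 + 0.5 * x2 + rng.normal(size=n)

    X = np.column_stack([np.ones(n), x1, x2])
    coef = np.linalg.lstsq(X, y, rcond=None)[0]      # [b0_hat, b1_hat, b2_hat]

    delta = 1.5
    point = np.array([1.0, 0.2, -0.7])               # [const, x1, x2], arbitrary values
    shifted = point + np.array([0.0, delta, 0.0])    # only x1 moves
    print(shifted @ coef - point @ coef)             # change in the fitted value
    print(coef[1] * delta)                           # b1_hat * delta: the same number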

Omitted Variable Bias Summary

Two cases where the bias is equal to zero:
b2 = 0, that is, x2 doesn’t really belong in the model
x1 and x2 are uncorrelated in the sample

If the correlations of x2 with x1 and of x2 with y have the same sign, the bias will be positive; if they have opposite signs, the bias will be negative

Simple vs Multiple Reg Estimate

Compare the simple regression ỹ = b̃0 + b̃1x1 with the multiple regression ŷ = b̂0 + b̂1x1 + b̂2x2. Generally, b̃1 ≠ b̂1 unless:
b̂2 = 0 (i.e. no partial effect of x2), OR
x1 and x2 are uncorrelated in the sample

Omitted Variable Bias

Suppose the true model is given as y = b0 + b1x1 + b2x2 + u, but we estimate ỹ = b̃0 + b̃1x1 + ũ. Then

b̃1 = Σ(xi1 − x̄1)yi / Σ(xi1 − x̄1)²

Omitted Variable Bias (cont)

Recall the true model, so that yi = b0 + b1xi1 + b2xi2 + ui, and the numerator becomes

Σ(xi1 − x̄1)(b0 + b1xi1 + b2xi2 + ui) = b1Σ(xi1 − x̄1)² + b2Σ(xi1 − x̄1)xi2 + Σ(xi1 − x̄1)ui
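
A quick numerical check of this expansion (simulated data in which b0, b1, b2 and u are known by construction, so every term can be computed directly; all numbers are illustrative):

    import numpy as np

    # Verify that sum((xi1 - x1bar) * yi) equals its three-term expansion.
    rng = np.random.default_rng(9)
    b0, b1, b2 = 1.0, 2.0, 3.0
    n = 500
    x1 = rng.normal(size=n)
    x2 = 0.4 * x1 + rng.normal(size=n)
    u = rng.normal(size=n)
    y = b0 + b1 * x1 + b2 * x2 + u

    d1 = x1 - x1.mean()
    lhs = d1 @ y                                      # the numerator of b1_tilde
    rhs = b1 * (d1 @ d1) + b2 * (d1 @ x2) + d1 @ u    # its expansion from the slide
    print(lhs, rhs)                                   # equal up to rounding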

Omitted Variable Bias (cont)

b̃1 = b1 + b2 Σ(xi1 − x̄1)xi2 / Σ(xi1 − x̄1)² + Σ(xi1 − x̄1)ui / Σ(xi1 − x̄1)²

Since E(ui) = 0, taking expectations we have E(b̃1) = b1 + b2 Σ(xi1 − x̄1)xi2 / Σ(xi1 − x̄1)²

Summary of Direction of Bias

            Corr(x1, x2) > 0    Corr(x1, x2) < 0
b2 > 0      Positive bias       Negative bias
b2 < 0      Negative bias       Positive bias
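
A hypothetical Monte Carlo matching the top-left cell of the table (b2 > 0 and Corr(x1, x2) > 0): across repeated samples, the short regression’s slope on x1 averages well above the true b1, i.e. a positive bias. Parameters and sample sizes are made up.

    import numpy as np

    # Average the omitted-variable estimate of b1 over many simulated samples.
    rng = np.random.default_rng(4)
    b1, b2 = 1.0, 2.0
    n, reps = 200, 2000
    b1_tilde = np.empty(reps)
    for r in range(reps):
        x1 = rng.normal(size=n)
        x2 = 0.6 * x1 + rng.normal(size=n)        # positively correlated with x1
        y = b1 * x1 + b2 * x2 + rng.normal(size=n)
        X = np.column_stack([np.ones(n), x1])     # x2 omitted
        b1_tilde[r] = np.linalg.lstsq(X, y, rcond=None)[0][1]

    print(b1_tilde.mean())    # noticeably above the true b1 = 1.0: positive bias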

Goodness-of-Fit (continued)

We can also think of R2 as being equal to the squared correlation coefficient between the actual yi and the fitted values ŷi:

R2 = [Σ(yi − ȳ)(ŷi − ŷ̄)]² / [Σ(yi − ȳ)² Σ(ŷi − ŷ̄)²], where ŷ̄ is the sample average of the ŷi

Assumptions for Unbiasedness

Population model is linear in parameters: y = b0 + b1x1 + b2x2 + … + bkxk + u
We can use a random sample of size n, {(xi1, xi2, …, xik, yi): i = 1, 2, …, n}, from the population model, so that the sample model is yi = b0 + b1xi1 + b2xi2 + … + bkxik + ui
E(u|x1, x2, …, xk) = 0, implying that all of the explanatory variables are exogenous
None of the x’s is constant, and there are no exact linear relationships among them
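
To illustrate the last assumption (no exact linear relationships among the x’s), the sketch below builds a regressor that is an exact linear function of another and shows that the design matrix loses rank, so the individual coefficients are not identified. Data and numbers are invented.

    import numpy as np

    # Perfect collinearity: x2 is an exact linear function of x1.
    rng = np.random.default_rng(5)
    n = 100
    x1 = rng.normal(size=n)
    x2 = 3.0 * x1 - 2.0
    X = np.column_stack([np.ones(n), x1, x2])

    print(np.linalg.matrix_rank(X))     # 2 rather than 3: the columns are linearly dependent
    print(np.linalg.cond(X.T @ X))      # enormous condition number for the normal equations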

Multiple Regression Analysis

y = b0 + b1x1 + b2x2 + … + bkxk + u

1. Estimation

Economics 20 - Prof. Anderson

Parallels with Simple Regression

b0 is still the intercept; b1 to bk are all called slope parameters

u is still the error term (or disturbance)
Still need to make a zero conditional mean assumption, so now assume that E(u|x1, x2, …, xk) = 0
Still minimizing the sum of squared residuals, so have k + 1 first order conditions
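
A minimal sketch of the k + 1 first order conditions mentioned above (assumed data and arbitrary coefficients): solve the normal equations (X'X)b = X'y and check that the residuals are orthogonal to the constant and to every regressor.

    import numpy as np

    # Solve the k+1 normal equations and verify the k+1 orthogonality conditions.
    rng = np.random.default_rng(6)
    n, k = 300, 3
    X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])   # constant plus k regressors
    y = X @ np.array([1.0, 0.5, -2.0, 0.8]) + rng.normal(size=n)

    b = np.linalg.solve(X.T @ X, X.T @ y)
    u_hat = y - X @ b

    print(b)              # the k+1 OLS coefficients
    print(X.T @ u_hat)    # k+1 values, all essentially zero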

A “Partialling Out” Interpretation

Consider the case where k = 2, i.e. ŷ = b̂0 + b̂1x1 + b̂2x2. Then b̂1 = Σr̂i1yi / Σr̂i1², where the r̂i1 are the residuals from the estimated regression of x1 on x2, x̂1 = δ̂0 + δ̂2x2

“Partialling Out” continued

The previous equation implies that regressing y on x1 and x2 gives the same effect of x1 as regressing y on the residuals from a regression of x1 on x2. This means only the part of xi1 that is uncorrelated with xi2 is being related to yi, so we’re estimating the effect of x1 on y after x2 has been “partialled out”
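
The following NumPy sketch (made-up data) verifies the partialling-out result: the coefficient on x1 from the multiple regression equals Σr̂i1yi / Σr̂i1², where r̂i1 are the residuals from regressing x1 on x2.

    import numpy as np

    # Frisch-Waugh style check of the partialling-out formula.
    rng = np.random.default_rng(7)
    n = 500
    x2 = rng.normal(size=n)
    x1 = 0.8 * x2 + rng.normal(size=n)
    y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(size=n)

    def ols(X, y):
        """OLS coefficients with an intercept prepended."""
        X = np.column_stack([np.ones(len(y)), X])
        return np.linalg.lstsq(X, y, rcond=None)[0]

    b_hat = ols(np.column_stack([x1, x2]), y)                   # multiple regression
    r1 = x1 - np.column_stack([np.ones(n), x2]) @ ols(x2, x1)   # residuals of x1 on x2
    b1_partial = (r1 @ y) / (r1 @ r1)                           # sum(r1*y) / sum(r1^2)

    print(b_hat[1])       # b1_hat from the multiple regression
    print(b1_partial)     # the same number, up to rounding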

Too Many or Too Few Variables

What happens if we include variables in our specification that don’t belong? There is no effect on our parameter estimates, and OLS remains unbiased. What if we exclude a variable from our specification that does belong? OLS will usually be biased
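
A hypothetical simulation contrasting the two cases (setup and parameters are invented): an irrelevant extra regressor leaves the estimate of b1 centered on the truth, while omitting a relevant regressor that is correlated with x1 does not.

    import numpy as np

    # Compare "too many" and "too few" specifications over repeated samples.
    rng = np.random.default_rng(8)
    b1, b2 = 1.0, 1.0
    n, reps = 200, 2000
    too_many, too_few = np.empty(reps), np.empty(reps)
    for r in range(reps):
        x1 = rng.normal(size=n)
        x2 = 0.5 * x1 + rng.normal(size=n)      # relevant and correlated with x1
        junk = rng.normal(size=n)               # irrelevant regressor
        y = b1 * x1 + b2 * x2 + rng.normal(size=n)
        Xa = np.column_stack([np.ones(n), x1, x2, junk])   # includes the irrelevant variable
        Xb = np.column_stack([np.ones(n), x1])             # omits the relevant x2
        too_many[r] = np.linalg.lstsq(Xa, y, rcond=None)[0][1]
        too_few[r] = np.linalg.lstsq(Xb, y, rcond=None)[0][1]

    print(too_many.mean())   # close to the true b1 = 1.0
    print(too_few.mean())    # pulled away from 1.0 by the omitted x2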