Wooldridge Econometrics Lecture Notes 8
Multiple Regression Analysis
y = b0 + b1x1 + b2x2 + … + bkxk + u
1. Estimation

Parallels with Simple Regression
b0 is still the intercept; b1 through bk are all called slope parameters.
u is still the error term (or disturbance).
We still need to make a zero conditional mean assumption, so now assume that E(u|x1, x2, …, xk) = 0.
We are still minimizing the sum of squared residuals, so there are k + 1 first order conditions.
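To make the estimation step concrete, here is a minimal sketch (not from the original slides; the data are simulated and the names n, k, X, y are illustrative) of solving those k + 1 first order conditions, which are the normal equations X′Xb = X′y:

```python
# A minimal sketch: the k+1 first order conditions of OLS are the
# normal equations X'X b = X'y. Data are simulated for illustration.
import numpy as np

rng = np.random.default_rng(0)
n, k = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])  # intercept + k regressors
beta_true = np.array([1.0, 0.5, -2.0, 3.0])
y = X @ beta_true + rng.normal(size=n)                      # y = Xb + u

# Minimizing the sum of squared residuals gives X'(y - Xb) = 0,
# i.e. the k+1 normal equations X'X b = X'y.
b_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(b_hat)  # estimates of b0, b1, ..., bk
```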
Interpreting Multiple Regression
ŷ = b̂0 + b̂1x1 + b̂2x2 + … + b̂kxk, so Δŷ = b̂1Δx1 + b̂2Δx2 + … + b̂kΔxk,
so holding x2, …, xk fixed implies that Δŷ = b̂1Δx1; that is, each b has a ceteris paribus interpretation.
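As a tiny numerical illustration (the coefficient values below are made up for the example, not taken from the slides), raising x1 by one unit while holding the other regressors fixed moves ŷ by exactly b̂1:

```python
# Ceteris paribus check with illustrative fitted coefficients.
import numpy as np

b_hat = np.array([1.2, 0.8, -0.3, 2.1])  # b̂0, b̂1, b̂2, b̂3 (illustrative)
x = np.array([1.0, 5.0, 2.0, 7.0])       # constant 1 plus x1, x2, x3

x_new = x.copy()
x_new[1] += 1.0                          # Δx1 = 1, x2 and x3 held fixed
print(x_new @ b_hat - x @ b_hat)         # Δŷ equals b̂1 = 0.8
```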
A “Partialling Out” Interpretation
Consider the case where k = 2, i.e. ŷ = b̂0 + b̂1x1 + b̂2x2. Then
b̂1 = Σ r̂i1 yi / Σ r̂i1²,
where the r̂i1 are the residuals from the estimated regression x̂1 = γ̂0 + γ̂2x2.

“Partialling Out” continued
The previous equation implies that regressing y on x1 and x2 gives the same effect of x1 as regressing y on the residuals from a regression of x1 on x2. This means only the part of xi1 that is uncorrelated with xi2 is being related to yi, so we are estimating the effect of x1 on y after x2 has been “partialled out”.
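The following sketch checks this equivalence on simulated data (all names and numbers are illustrative): the coefficient on x1 from the full multiple regression matches Σ r̂i1 yi / Σ r̂i1² computed from the residuals of x1 on x2:

```python
# "Partialling out" (Frisch-Waugh) demonstration for the k = 2 case.
import numpy as np

rng = np.random.default_rng(1)
n = 500
x2 = rng.normal(size=n)
x1 = 0.6 * x2 + rng.normal(size=n)               # x1 correlated with x2
y = 2.0 + 1.5 * x1 - 0.7 * x2 + rng.normal(size=n)

# Multiple regression of y on (1, x1, x2): coefficient on x1.
X = np.column_stack([np.ones(n), x1, x2])
b_multiple = np.linalg.solve(X.T @ X, X.T @ y)[1]

# Partialling out: regress x1 on (1, x2), keep residuals r̂1,
# then b̂1 = Σ r̂i1 yi / Σ r̂i1².
Z = np.column_stack([np.ones(n), x2])
g_hat = np.linalg.solve(Z.T @ Z, Z.T @ x1)
r1 = x1 - Z @ g_hat
b_partial = (r1 @ y) / (r1 @ r1)

print(b_multiple, b_partial)                     # identical up to rounding
```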
Simple vs Multiple Reg Estimate
Compare the simple regression ỹ = b̃0 + b̃1x1 with the multiple regression ŷ = b̂0 + b̂1x1 + b̂2x2. Generally, b̃1 ≠ b̂1 unless b̂2 = 0 (i.e. there is no partial effect of x2) OR x1 and x2 are uncorrelated in the sample.
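A quick simulated illustration of this point (names and numbers are made up): when x1 and x2 are correlated in the sample and x2 has a partial effect, the simple regression slope b̃1 differs from the multiple regression slope b̂1:

```python
# Simple vs multiple regression slope on the same simulated sample.
import numpy as np

rng = np.random.default_rng(2)
n = 1000
x2 = rng.normal(size=n)
x1 = 0.8 * x2 + rng.normal(size=n)       # x1 and x2 correlated in the sample
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

X_simple = np.column_stack([np.ones(n), x1])
b1_simple = np.linalg.lstsq(X_simple, y, rcond=None)[0][1]   # b̃1

X_mult = np.column_stack([np.ones(n), x1, x2])
b1_mult = np.linalg.lstsq(X_mult, y, rcond=None)[0][1]       # b̂1

print(b1_simple, b1_mult)   # b̃1 picks up part of x2's effect, so it differs
```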
Goodness-of-Fit
We can think of each observation as being made up of an explained part and an unexplained part, yi = ŷi + ûi. We then define the following:
Σ(yi − ȳ)² is the total sum of squares (SST)
Σ(ŷi − ȳ)² is the explained sum of squares (SSE)
Σûi² is the residual sum of squares (SSR)
Then SST = SSE + SSR.
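A minimal numerical check of this decomposition on simulated data (illustrative names throughout):

```python
# Verify SST = SSE + SSR for an OLS fit that includes an intercept.
import numpy as np

rng = np.random.default_rng(3)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(size=n)

b_hat = np.linalg.lstsq(X, y, rcond=None)[0]
y_hat = X @ b_hat
u_hat = y - y_hat

SST = np.sum((y - y.mean()) ** 2)       # total sum of squares
SSE = np.sum((y_hat - y.mean()) ** 2)   # explained sum of squares
SSR = np.sum(u_hat ** 2)                # residual sum of squares
print(SST, SSE + SSR)                   # equal up to floating point error
```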
Goodness-of-Fit (continued)
How do we think about how well our sample regression line fits our sample data? We can compute the fraction of the total sum of squares (SST) that is explained by the model; call this the R-squared of the regression:
R² = SSE/SST = 1 − SSR/SST
Goodness-of-Fit (continued)
We can also think of R² as being equal to the squared correlation coefficient between the actual yi and the fitted values ŷi:
R² = [Σ(yi − ȳ)(ŷi − ŷ̄)]² / [Σ(yi − ȳ)² · Σ(ŷi − ŷ̄)²]
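A short check on simulated data (illustrative names) that 1 − SSR/SST agrees with the squared correlation between yi and ŷi:

```python
# R² computed from the fit equals the squared correlation of y and ŷ.
import numpy as np

rng = np.random.default_rng(4)
n = 300
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([0.5, 1.0, -2.0]) + rng.normal(size=n)

y_hat = X @ np.linalg.lstsq(X, y, rcond=None)[0]
r2_fit = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
r2_corr = np.corrcoef(y, y_hat)[0, 1] ** 2
print(r2_fit, r2_corr)   # agree up to floating point error
```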
More about R-squared
R² can never decrease when another independent variable is added to a regression, and it usually will increase.
Because R² will usually increase with the number of independent variables, it is not a good way to compare models.
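A simulated illustration of why this matters (names and numbers are made up): adding a regressor that is pure noise still nudges R² upward:

```python
# Adding an irrelevant regressor never lowers R².
import numpy as np

rng = np.random.default_rng(5)
n = 100
x1 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + rng.normal(size=n)

def r_squared(X, y):
    """R² = 1 − SSR/SST for an OLS fit of y on X (X includes the intercept)."""
    y_hat = X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)

X_small = np.column_stack([np.ones(n), x1])
X_big = np.column_stack([np.ones(n), x1, rng.normal(size=n)])  # + pure noise
print(r_squared(X_small, y), r_squared(X_big, y))  # second is never smaller
```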
Assumptions for Unbiasedness
The population model is linear in parameters: y = b0 + b1x1 + b2x2 + … + bkxk + u.
We can use a random sample of size n, {(xi1, xi2, …, xik, yi): i = 1, 2, …, n}, from the population model, so that the sample model is yi = b0 + b1xi1 + b2xi2 + … + bkxik + ui.
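A Monte Carlo sketch of unbiasedness under these assumptions (everything below is illustrative, not from the slides): across repeated random samples, the OLS estimates average out to the true parameters:

```python
# Monte Carlo: with a linear population model, E(u|x) = 0, and random
# sampling, the OLS estimates are centered on the true coefficients.
import numpy as np

rng = np.random.default_rng(6)
n, reps = 100, 2000
beta_true = np.array([1.0, 2.0, -0.5])   # b0, b1, b2 in the population model

estimates = np.empty((reps, 3))
for r in range(reps):
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
    u = rng.normal(size=n)               # error with zero conditional mean
    y = X @ beta_true + u                # y = b0 + b1x1 + b2x2 + u
    estimates[r] = np.linalg.lstsq(X, y, rcond=None)[0]

print(estimates.mean(axis=0))            # ≈ beta_true: OLS is unbiased
```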