Question 1 (a) Consider the model $y_{i}=x_{i 1}^{\prime} \beta_{1}+x_{i 2}^{\p...
Jul 10, 2024
Solution by Steps
step 1
Define the null and alternative hypotheses: $H_0: \beta_2 = 0$ versus $H_1: \beta_2 \neq 0$
step 2
Fit the model $y_i = x_{i1}'\beta_1 + x_{i2}'\beta_2 + e_i$ by OLS on the original sample to obtain the estimates $\hat{\beta}_1$ and $\hat{\beta}_2$
step 3
Compute the residuals $e_i = y_i - x_{i1}'\hat{\beta}_1 - x_{i2}'\hat{\beta}_2$
step 4
Resample the residuals $e_i$ with replacement to create a new set of bootstrap residuals $e_i^*$
step 5
Generate a new bootstrap sample $y_i^* = x_{i1}'\hat{\beta}_1 + x_{i2}'\hat{\beta}_2 + e_i^*$
step 6
Refit the model on the bootstrap sample to obtain new estimates $\hat{\beta}_1^*$ and $\hat{\beta}_2^*$
step 7
Repeat steps 4-6 many times (e.g., $B = 1000$) to build the bootstrap distribution of $\hat{\beta}_2^*$
step 8
Calculate the bootstrap p-value as the proportion of bootstrap samples in which the centered estimate $|\hat{\beta}_2^* - \hat{\beta}_2|$ is at least as large as the observed $|\hat{\beta}_2|$; centering at $\hat{\beta}_2$ is what makes the bootstrap distribution mimic the distribution of $\hat{\beta}_2$ under $H_0: \beta_2 = 0$
step 9
Compare the bootstrap p-value to the significance level $\alpha$ and reject $H_0$ if the p-value is smaller
Answer
The null hypothesis $H_0: \beta_2 = 0$ is tested with the nonparametric bootstrap by resampling residuals, generating bootstrap samples, refitting the model, and calculating the bootstrap p-value.
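The steps above can be sketched in Python. This is a minimal illustration on simulated data: the design, variable names, and sample sizes are my own choices, and `numpy.linalg.lstsq` stands in for any OLS routine.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative simulated data; x1 includes an intercept column.
n = 200
x1 = np.column_stack([np.ones(n), rng.normal(size=n)])
x2 = rng.normal(size=n)
y = x1 @ np.array([1.0, 0.5]) + rng.normal(size=n)  # true beta2 = 0
X = np.column_stack([x1, x2])

# Steps 2-3: fit the full model by OLS and compute residuals.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat
beta2_hat = beta_hat[-1]

# Steps 4-7: residual bootstrap.
B = 1000
beta2_star = np.empty(B)
for b in range(B):
    e_star = rng.choice(resid, size=n, replace=True)  # resample residuals
    y_star = X @ beta_hat + e_star                    # bootstrap sample
    b_star, *_ = np.linalg.lstsq(X, y_star, rcond=None)
    beta2_star[b] = b_star[-1]

# Step 8: centered bootstrap p-value for H0: beta2 = 0.
p_value = np.mean(np.abs(beta2_star - beta2_hat) >= np.abs(beta2_hat))
print(p_value)
```

Because the bootstrap samples are generated from the unrestricted fit, the statistic is centered at $\hat{\beta}_2$ so that its bootstrap distribution approximates the null distribution.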
Question 1(b)
step 1
Define the null and alternative hypotheses: $H_0: \beta_1 = \beta_2$ versus $H_1: \beta_1 \neq \beta_2$
step 2
Fit the model $y_i = x_{i1}'\beta_1 + x_{i2}'\beta_2 + e_i$ by OLS on the original sample to obtain the estimates $\hat{\beta}_1$ and $\hat{\beta}_2$
step 3
Compute the residuals $e_i = y_i - x_{i1}'\hat{\beta}_1 - x_{i2}'\hat{\beta}_2$
step 4
Resample the residuals $e_i$ with replacement to create a new set of bootstrap residuals $e_i^*$
step 5
Generate a new bootstrap sample $y_i^* = x_{i1}'\hat{\beta}_1 + x_{i2}'\hat{\beta}_2 + e_i^*$
step 6
Refit the model on the bootstrap sample to obtain new estimates $\hat{\beta}_1^*$ and $\hat{\beta}_2^*$
step 7
Calculate the test statistic for each bootstrap sample, centered at the original estimates, e.g., $T^* = \|(\hat{\beta}_1^* - \hat{\beta}_2^*) - (\hat{\beta}_1 - \hat{\beta}_2)\|$; centering makes $T^*$ mimic the distribution of the statistic under $H_0$
step 8
Repeat steps 4-7 many times (e.g., $B = 1000$) to build the bootstrap distribution of the test statistic $T^*$
step 9
Calculate the bootstrap p-value as the proportion of bootstrap samples in which $T^*$ is at least as large as the observed test statistic $T = \|\hat{\beta}_1 - \hat{\beta}_2\|$
step 10
Compare the bootstrap p-value to the significance level $\alpha$ and reject $H_0$ if the p-value is smaller
Answer
The null hypothesis $H_0: \beta_1 = \beta_2$ is tested with the nonparametric bootstrap by resampling residuals, generating bootstrap samples, calculating the test statistic for each sample, and determining the bootstrap p-value.
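The same residual bootstrap, adapted to the statistic $T$, might look like the following sketch. For simplicity the two regressors are scalar here, and the simulated design (with equal true coefficients, so $H_0$ holds) is my own illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative data: two scalar regressors with equal true coefficients.
n = 200
X = rng.normal(size=(n, 2))
y = X @ np.array([0.8, 0.8]) + rng.normal(size=n)  # beta1 = beta2 under H0

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat
T_obs = abs(beta_hat[0] - beta_hat[1])             # observed test statistic

B = 1000
T_star = np.empty(B)
for b in range(B):
    e_star = rng.choice(resid, size=n, replace=True)
    y_star = X @ beta_hat + e_star
    b_star, *_ = np.linalg.lstsq(X, y_star, rcond=None)
    # Center at the original difference so T* mimics the null distribution.
    T_star[b] = abs((b_star[0] - b_star[1]) - (beta_hat[0] - beta_hat[1]))

p_value = np.mean(T_star >= T_obs)
print(p_value)
```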
Key Concept
Nonparametric bootstrap is a resampling method used to estimate the distribution of a statistic by resampling with replacement from the observed data.
Explanation
The nonparametric bootstrap involves resampling residuals, generating new samples, refitting the model, and calculating the distribution of the test statistic to determine the p-value for hypothesis testing.
Solution by Steps
step 1
Consider the model $y_i = x_i'\beta + e_i$, where $x_i$ is correlated with $e_i$. This correlation makes the OLS estimator biased and inconsistent
step 2
The OLS estimator of $\beta$ is
$$\hat{\beta}_{OLS} = \left( \sum_{i=1}^{n} x_i x_i' \right)^{-1} \sum_{i=1}^{n} x_i y_i$$
Substituting $y_i = x_i'\beta + e_i$ gives
$$\hat{\beta}_{OLS} = \left( \sum_{i=1}^{n} x_i x_i' \right)^{-1} \sum_{i=1}^{n} x_i (x_i'\beta + e_i) = \beta + \left( \sum_{i=1}^{n} x_i x_i' \right)^{-1} \sum_{i=1}^{n} x_i e_i$$
step 3
Since $x_i$ is correlated with $e_i$, the law of large numbers gives $\frac{1}{n}\sum_{i=1}^{n} x_i e_i \xrightarrow{p} \mathbb{E}[x_i e_i] \neq 0$. The term $\left(\frac{1}{n}\sum_{i=1}^{n} x_i x_i'\right)^{-1} \frac{1}{n}\sum_{i=1}^{n} x_i e_i$ therefore does not vanish as $n \rightarrow \infty$, which makes $\hat{\beta}_{OLS}$ inconsistent
step 4
The probability limit of $\hat{\beta}_{OLS}$ is
$$\text{plim}\, \hat{\beta}_{OLS} = \beta + \left(\mathbb{E}[x_i x_i']\right)^{-1} \mathbb{E}[x_i e_i]$$
which differs from $\beta$ because $\mathbb{E}[x_i e_i] \neq 0$
Answer
The OLS estimator is inconsistent, with probability limit $\text{plim}\, \hat{\beta}_{OLS} = \beta + \text{plim}\left[\left(\frac{1}{n}\sum_{i=1}^{n} x_i x_i'\right)^{-1} \frac{1}{n}\sum_{i=1}^{n} x_i e_i\right] = \beta + \left(\mathbb{E}[x_i x_i']\right)^{-1}\mathbb{E}[x_i e_i] \neq \beta$.
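A quick simulation illustrates the inconsistency. The design is my own: $x_i$ and $e_i$ share a common shock $v_i$, so $\operatorname{cov}(x_i, e_i) = 1$, $\operatorname{Var}(x_i) = 2$, and the bias term converges to $\operatorname{cov}(x,e)/\operatorname{Var}(x) = 0.5$.

```python
import numpy as np

rng = np.random.default_rng(2)

# x_i is correlated with e_i: both contain the common shock v_i.
n = 100_000
v = rng.normal(size=n)
x = rng.normal(size=n) + v
e = rng.normal(size=n) + v          # cov(x, e) = Var(v) = 1
beta = 2.0
y = beta * x + e

# Scalar OLS: (sum x_i^2)^-1 sum x_i y_i.
beta_ols = (x @ y) / (x @ x)
# The bias term (sum x_i^2)^-1 sum x_i e_i converges to 1/2,
# so beta_ols settles near 2.5 rather than the true 2.0.
print(beta_ols)
```

Increasing $n$ does not remove the gap; the estimate concentrates around $2.5$, which is exactly what inconsistency means.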
Part (b)
step 1
Consider the term $\frac{1}{n} \sum_{i=1}^{n} \gamma' z_i u_i$. We need to show that it converges to zero in probability
step 2
Calculate the mean: $\mathbb{E}\left[\frac{1}{n} \sum_{i=1}^{n} \gamma' z_i u_i\right] = \frac{1}{n} \sum_{i=1}^{n} \gamma' \mathbb{E}[z_i u_i] = 0$, since the instruments $z_i$ satisfy $\mathbb{E}[z_i u_i] = 0$
step 3
Calculate the variance, assuming independent observations and (as the derivation implicitly does) $\text{Var}(z_i u_i) = I$:
$$\text{Var}\left(\frac{1}{n} \sum_{i=1}^{n} \gamma' z_i u_i\right) = \frac{1}{n^2} \sum_{i=1}^{n} \gamma' \text{Var}(z_i u_i) \gamma = \frac{1}{n^2} \sum_{i=1}^{n} \gamma'\gamma = \frac{1}{n} \gamma'\gamma \rightarrow 0 \quad \text{as } n \rightarrow \infty$$
Any finite variance matrix in place of $I$ gives the same limit
step 4
By Chebyshev's inequality, a sequence with mean zero and vanishing variance converges in probability to zero, so $\frac{1}{n} \sum_{i=1}^{n} \gamma' z_i u_i \xrightarrow{p} 0$
Answer
$\frac{1}{n} \sum_{i=1}^{n} \gamma' z_i u_i$ converges to zero in probability.
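A small numerical check of this convergence: with simulated instruments satisfying $\mathbb{E}[z_i u_i] = 0$ and an arbitrary fixed $\gamma$ (both my own illustrative choices), the sample average shrinks toward zero as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(3)
gamma = np.array([0.5, -1.0, 2.0])   # arbitrary fixed coefficient vector

for n in (100, 10_000, 1_000_000):
    z = rng.normal(size=(n, 3))      # instruments, independent of u
    u = rng.normal(size=n)           # so E[z_i u_i] = 0
    term = np.mean((z @ gamma) * u)  # (1/n) sum gamma' z_i u_i
    print(n, term)
```

The printed magnitudes fall roughly like $1/\sqrt{n}$, matching the variance bound $\frac{1}{n}\gamma'\gamma$ from the derivation above.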
Part (c)
step 1
Consider the 2SLS estimator
$$\hat{\beta}_{2SLS} = \left( \sum_{i=1}^{n} \hat{x}_i \hat{x}_i' \right)^{-1} \sum_{i=1}^{n} \hat{x}_i y_i$$
where $\hat{x}_i$ are the fitted values from the first-stage regression $x_i = \gamma_1 z_{i1} + \gamma_2 z_{i2} + \cdots + \gamma_k z_{ik} + u_i$
step 2
Substituting $y_i = x_i'\beta + e_i$:
$$\hat{\beta}_{2SLS} = \left( \sum_{i=1}^{n} \hat{x}_i \hat{x}_i' \right)^{-1} \sum_{i=1}^{n} \hat{x}_i (x_i'\beta + e_i) = \beta + \left( \sum_{i=1}^{n} \hat{x}_i \hat{x}_i' \right)^{-1} \sum_{i=1}^{n} \hat{x}_i e_i$$
step 3
Using the hint,
$$\left( \frac{1}{n} \sum_{i=1}^{n} u_i z_i' \right) \left( \frac{1}{n} \sum_{i=1}^{n} z_i z_i' \right)^{-1} \left( \frac{1}{n} \sum_{i=1}^{n} z_i e_i \right) \xrightarrow{p} d \operatorname{cov}(u_i, e_i)$$
Since $\operatorname{cov}(u_i, e_i) \neq 0$, the term $\left( \sum_{i=1}^{n} \hat{x}_i \hat{x}_i' \right)^{-1} \sum_{i=1}^{n} \hat{x}_i e_i$ does not vanish, so the 2SLS estimator is inconsistent
step 4
The probability limit of $\hat{\beta}_{2SLS}$ is
$$\text{plim}\, \hat{\beta}_{2SLS} = \beta + \text{plim}\left[ \left( \frac{1}{n}\sum_{i=1}^{n} \hat{x}_i \hat{x}_i' \right)^{-1} \frac{1}{n}\sum_{i=1}^{n} \hat{x}_i e_i \right]$$
and by the previous step the second term has a nonzero probability limit, so the limit differs from $\beta$
Answer
The 2SLS estimator is inconsistent; its probability limit is $\beta + \text{plim}\left[\left( \frac{1}{n}\sum_{i=1}^{n} \hat{x}_i \hat{x}_i' \right)^{-1} \frac{1}{n}\sum_{i=1}^{n} \hat{x}_i e_i \right] \neq \beta$.
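Assuming the many-instruments setting the hint suggests (the number of instruments $k$ grows with $n$ so that $k/n \rightarrow d > 0$), a simulation can illustrate the resulting 2SLS bias. The design below, with $k/n = 0.5$, one relevant instrument, and $\operatorname{cov}(u_i, e_i) = 1$, is my own illustration rather than the question's exact setup.

```python
import numpy as np

rng = np.random.default_rng(4)

# Many-instruments design: k/n stays bounded away from zero.
n, k = 1000, 500
Z = rng.normal(size=(n, k))
gamma = np.zeros(k)
gamma[0] = 1.0                      # only the first instrument matters
v = rng.normal(size=n)
u = rng.normal(size=n) + v
e = rng.normal(size=n) + v          # cov(u, e) = Var(v) = 1
x = Z @ gamma + u
beta = 1.0
y = beta * x + e

# 2SLS: project x on all k instruments, then regress y on the fitted values.
Pi, *_ = np.linalg.lstsq(Z, x, rcond=None)
x_hat = Z @ Pi
beta_2sls = (x_hat @ y) / (x_hat @ x_hat)
print(beta_2sls)                    # noticeably above the true value 1.0
```

The first-stage projection with so many instruments absorbs part of $u_i$ into $\hat{x}_i$, so $\hat{x}_i$ inherits the correlation with $e_i$ and the 2SLS estimate stays bounded away from $\beta$ no matter how large $n$ gets.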
Key Concept
Inconsistency of OLS and 2SLS estimators in the presence of endogeneity.
Explanation
Both OLS and 2SLS estimators are inconsistent when the regressors are correlated with the error term: their estimates fail to converge to the true $\beta$ even as the sample size grows.