Nov 29, 2024
For simple linear regression, let the residuals be $e_1, e_2, \cdots, e_n$ (where $e_i = y_i - \hat{y}_i$, $i = 1, 2, \cdots, n$).
(1) Prove that $\sum_{i=1}^{n} e_i = 0$ and $\sum_{i=1}^{n} x_i e_i = 0$.
(2) Prove that $\hat{y}_i - \overline{y} = \hat{\beta}_1 (x_i - \overline{x})$, and based on this, write the formula for computing the regression sum of squares $SSR$ from $\hat{\beta}_1$.
Solution by Steps
step 1
Prove $\sum_{i=1}^{n} e_i = 0$: the least-squares estimates $\hat{\beta}_0, \hat{\beta}_1$ minimize
$$Q(\beta_0, \beta_1) = \sum_{i=1}^{n} (y_i - \beta_0 - \beta_1 x_i)^2.$$
Setting $\partial Q / \partial \beta_0 = 0$ at the minimizer yields the first normal equation
$$-2 \sum_{i=1}^{n} (y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i) = 0,$$
and since $\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_i$, this is exactly
$$\sum_{i=1}^{n} e_i = \sum_{i=1}^{n} (y_i - \hat{y}_i) = 0.$$
step 2
Prove $\sum_{i=1}^{n} x_i e_i = 0$: the least-squares criterion $\sum_{i=1}^{n} (y_i - \beta_0 - \beta_1 x_i)^2$ is minimized at $(\hat{\beta}_0, \hat{\beta}_1)$, so its partial derivative with respect to $\beta_1$ vanishes there. This yields the second normal equation
$$-2 \sum_{i=1}^{n} x_i (y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i) = 0.$$
Since $\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_i$, this is exactly
$$\sum_{i=1}^{n} x_i e_i = \sum_{i=1}^{n} x_i (y_i - \hat{y}_i) = 0.$$
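The two identities from steps 1 and 2 can be checked numerically on an ordinary least-squares fit. This is an illustrative sketch, not part of the original solution; the data below are made up:

```python
# Synthetic data (illustrative only): y roughly linear in x
x = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
y = [5.1, 7.9, 11.2, 13.8, 17.1, 19.9, 23.2, 25.8, 29.1, 31.9]
n = len(x)

# Least-squares estimates from the closed-form solution
x_bar = sum(x) / n
y_bar = sum(y) / n
beta1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / \
        sum((xi - x_bar) ** 2 for xi in x)
beta0 = y_bar - beta1 * x_bar

# Residuals e_i = y_i - y_hat_i
e = [yi - (beta0 + beta1 * xi) for xi, yi in zip(x, y)]

# Both sums vanish up to floating-point error
print(abs(sum(e)) < 1e-9)                                # sum e_i = 0
print(abs(sum(xi * ei for xi, ei in zip(x, e))) < 1e-9)  # sum x_i e_i = 0
```

Both checks print `True`: the residuals sum to zero and are orthogonal to the $x_i$, matching the normal equations.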
step 3
Prove $\hat{y}_i - \overline{y} = \hat{\beta}_1 (x_i - \overline{x})$: by definition of the fitted line, $\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_i$. The least-squares intercept satisfies $\hat{\beta}_0 = \overline{y} - \hat{\beta}_1 \overline{x}$, so $\overline{y} = \hat{\beta}_0 + \hat{\beta}_1 \overline{x}$. Subtracting the two equations gives
$$\hat{y}_i - \overline{y} = \hat{\beta}_1 x_i - \hat{\beta}_1 \overline{x} = \hat{\beta}_1 (x_i - \overline{x}).$$
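The pointwise identity in step 3 can also be verified numerically for every observation. This is an illustrative sketch with made-up data, not part of the original solution:

```python
# Synthetic data (illustrative only)
x = [0, 1, 2, 3, 4]
y = [1.2, 2.9, 5.1, 7.2, 8.8]
n = len(x)

# Closed-form least-squares estimates
x_bar = sum(x) / n
y_bar = sum(y) / n
beta1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / \
        sum((xi - x_bar) ** 2 for xi in x)
beta0 = y_bar - beta1 * x_bar

# Check y_hat_i - y_bar == beta1 * (x_i - x_bar) for every i
ok = all(
    abs((beta0 + beta1 * xi - y_bar) - beta1 * (xi - x_bar)) < 1e-9
    for xi in x
)
print(ok)  # True
```

This holds because the intercept is defined as $\hat{\beta}_0 = \overline{y} - \hat{\beta}_1 \overline{x}$, so the fitted line always passes through $(\overline{x}, \overline{y})$.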
step 4
By step 3, the formula for computing the regression sum of squares $SSR$ from $\hat{\beta}_1$ is:
$$SSR = \sum_{i=1}^{n} (\hat{y}_i - \overline{y})^2 = \hat{\beta}_1^2 \sum_{i=1}^{n} (x_i - \overline{x})^2.$$
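The $SSR$ identity can be confirmed by computing it both ways on a small fit. This is an illustrative sketch with made-up data, not part of the original solution:

```python
# Synthetic data (illustrative only)
x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)

# Closed-form least-squares estimates
x_bar = sum(x) / n
y_bar = sum(y) / n
beta1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / \
        sum((xi - x_bar) ** 2 for xi in x)
beta0 = y_bar - beta1 * x_bar

# Fitted values
y_hat = [beta0 + beta1 * xi for xi in x]

# SSR computed two ways: from fitted values, and from beta1_hat
ssr_direct = sum((yh - y_bar) ** 2 for yh in y_hat)
ssr_beta1 = beta1 ** 2 * sum((xi - x_bar) ** 2 for xi in x)
print(abs(ssr_direct - ssr_beta1) < 1e-9)  # the two agree
```

The second form is often the more convenient one in hand computation, since $\sum (x_i - \overline{x})^2$ is already needed to compute $\hat{\beta}_1$ itself.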
Answer
1. $\sum_{i=1}^{n} e_i = 0$ and $\sum_{i=1}^{n} x_i e_i = 0$, both following from the normal equations; 2. $\hat{y}_i - \overline{y} = \hat{\beta}_1 (x_i - \overline{x})$, which gives $SSR = \hat{\beta}_1^2 \sum_{i=1}^{n} (x_i - \overline{x})^2$.
Key Concept
Residual properties in simple linear regression and computation of the regression sum of squares.
Explanation
Using the properties of least squares (the normal equations), we can prove that the residuals sum to zero and are orthogonal to $x$, and derive the formula for the regression sum of squares in terms of $\hat{\beta}_1$.