$\def\NN{\mathbb{N}}$$\def\RR{\mathbb{R}}$$\def\eps{\epsilon}$$\def\calT{\mathcal{T}}$$\def\calP{\mathcal{P}}$$\def\calL{\mathcal{L}}$$\def\calR{\mathcal{R}}$$\def\calC{\mathcal{C}}$$\def\calF{\mathcal{F}}$$\def\calB{\mathcal{B}}$$\def\calS{\mathcal{S}}$$\def\calA{\mathcal{A}}$$\def\calG{\mathcal{G}}$$\newcommand{\inner}[2]{\langle#1, #2\rangle}$$\newcommand{\abs}[1]{\left\vert#1\right\vert}$$\newcommand{\norm}[1]{\left\Vert#1\right\Vert}$$\newcommand{\paren}[1]{\left(#1\right)}$$\newcommand{\sqbracket}[1]{\left[#1\right]}$$\def\var{\text{Var}}$$\def\cov{\text{Cov}}$$\newcommand{\pd}[2]{\frac{\partial #1}{\partial #2}}$$\newcommand{\doublepd}[3]{\frac{\partial^2 #1}{\partial #2 \partial #3}}$
Definition. A sequence $(X_n)_{n=1}^\infty$ of random variables is said to obey the weak law of large numbers (WLLN) if there are real sequences $(a_n)_{n=1}^\infty$ and $(b_n)_{n=1}^\infty$ with $0 < b_n \uparrow \infty$ such that
$$\frac{S_n - a_n}{b_n} \xrightarrow{p} 0$$
where
$$S_n = \sum_{i=1}^n X_i.$$
Theorem. Let $(X_n)_{n=1}^\infty$ be a sequence of iid random variables such that $EX_1^2 < \infty$. Then
$$\overline{X}_n := \frac{X_1 + \ldots + X_n}{n} \xrightarrow{p} EX_1.$$
Proof. Observe that $E\overline{X}_n = EX_1$ and $\var(\overline{X}_n) = \var(X_1) / n$. For each $\eps > 0$, by Chebyshev's inequality,
$$P(\abs{\overline{X}_n - EX_1} > \eps) \le \frac{\var(\overline{X}_n)}{\eps^2} = \frac{\var(X_1)}{n \eps^2} \to 0 \quad \text{as } n \to \infty.$$
$\square$
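As a quick numerical illustration (a minimal sketch, not from the text; the Exponential(1) distribution, the seed, and all parameters are assumptions for the demo), one can estimate $P(\abs{\overline{X}_n - EX_1} > \eps)$ by Monte Carlo and watch it shrink with $n$:

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility (assumed)
eps = 0.1
num_trials = 1000

# iid Exponential(1) samples, so E X_1 = 1 and Var(X_1) = 1.
for n in [10, 100, 1000, 10000]:
    samples = rng.exponential(scale=1.0, size=(num_trials, n))
    sample_means = samples.mean(axis=1)
    # Monte Carlo estimate of P(|mean_n - E X_1| > eps).
    prob = np.mean(np.abs(sample_means - 1.0) > eps)
    print(f"n = {n:6d}:  P(|mean - 1| > {eps}) ~ {prob:.3f}")
```

Since $\var(X_1) = 1$ here, Chebyshev bounds each printed probability by $1/(n\eps^2)$, in line with the estimate in the proof.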
Corollary. Let $(X_n)_{n=1}^\infty$ be a sequence of iid Bernoulli ($p$) random variables. For each $n \ge 1$, let
$$\hat{p}_n := \frac{\abs{\{1 \le i \le n : X_i = 1\}}}{n}.$$
Then $\hat{p}_n \xrightarrow{p} p$, since $\hat{p}_n = \overline{X}_n$, $EX_1 = p$, and $EX_1^2 = p < \infty$.
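In code (a hypothetical demo; the value $p = 0.3$ and the seed are arbitrary assumptions), the relative frequency $\hat{p}_n$ settles near $p$ as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(1)
p = 0.3  # hypothetical true success probability for the demo

for n in [100, 10_000, 1_000_000]:
    flips = rng.random(n) < p        # iid Bernoulli(p) indicators
    p_hat = flips.mean()             # relative frequency \hat{p}_n
    print(f"n = {n:9d}:  p_hat = {p_hat:.4f}")
```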
Theorem. Let $(X_n)_{n=1}^\infty$ be a sequence of random variables such that
(a) $EX_n^2 < \infty$ for each $n$;
(b) the $X_n$'s are uncorrelated, i.e., $EX_iX_j = (EX_i)(EX_j)$ whenever $i \ne j$;
(c) $\frac{1}{n^2} \sum_{i=1}^n \var(X_i) \to 0$ as $n \to \infty$.
Then $\overline{X}_n - \overline{\mu}_n \xrightarrow{p} 0$ where $\overline{\mu}_n := \frac{1}{n} \sum_{i=1}^n EX_i$.
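The proof is the same Chebyshev argument as before: uncorrelatedness gives $\var(S_n) = \sum_{i=1}^n \var(X_i)$, so for each $\eps > 0$,
$$P(\abs{\overline{X}_n - \overline{\mu}_n} > \eps) \le \frac{\var(\overline{X}_n)}{\eps^2} = \frac{1}{n^2 \eps^2} \sum_{i=1}^n \var(X_i) \to 0. \qquad \square$$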
Proof of the Weierstrass Approximation Theorem using the WLLN.
Theorem. Let $f : [0, 1] \to \RR$ be a continuous function. Define $B_{n, f}$ on $[0, 1]$ as
$$B_{n, f}(x) = \sum_{r=0}^n f (r/n) \binom{n}{r} x^r (1-x)^{n-r} \quad (x \in [0, 1]).$$
Then
$$\lim_{n \to \infty} \norm{f - B_{n, f}}_{\sup} = 0.$$
Proof. Let $\eps > 0$. Since $f$ is uniformly continuous, there exists $\delta_\eps > 0$ such that
$$\abs{x - y} < \delta_\eps \implies \abs{f(x) - f(y)} < \eps \quad (x, y \in [0, 1]).$$
Now fix $x \in [0, 1]$ and let $(X_n)_{n=1}^\infty$ be a sequence of iid Bernoulli ($x$) random variables. Since $n\hat{p}_n \sim \text{Binomial}(n, x)$, we have $B_{n,f}(x) = Ef(\hat{p}_n)$. Now
$$\begin{align*} \abs{f(x) - B_{n,f}(x)} &= \abs{Ef(x) - Ef(\hat{p}_n)}\\ &\le E \abs{f(\hat{p}_n) - f(x)}\\ &= E[\abs{f(\hat{p}_n) - f(x)} \chi_{\abs{\hat{p}_n - x} < \delta_\eps}] + E[\abs{f(\hat{p}_n) - f(x)} \chi_{\abs{\hat{p}_n - x} \ge \delta_\eps}]\\ &\le \eps + 2 \norm{f}_{\sup} P(\abs{\hat{p}_n - x} \ge \delta_\eps)\\ &\le \eps + 2 \norm{f}_{\sup} \cdot \frac{1}{4n \delta_\eps^2} \end{align*}$$
where the last inequality follows from Chebyshev's inequality together with $\var(\hat{p}_n) = \frac{x(1-x)}{n} \le \frac{1}{4n}$. Since this bound is uniform in $x$,
$$\norm{f - B_{n,f}}_{\sup} \le \eps + 2 \norm{f}_{\sup} \cdot \frac{1}{4n \delta_\eps^2}.$$
Letting $n \to \infty$ gives $\limsup_{n \to \infty} \norm{f - B_{n,f}}_{\sup} \le \eps$, and letting $\eps \searrow 0$ completes the proof. $\square$
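To see the convergence numerically, here is a minimal Python sketch (not from the text; the helper `bernstein` and the test function $f(x) = \abs{x - 1/2}$ are illustrative assumptions) that evaluates $B_{n,f}$ on a grid and reports the sup-norm error:

```python
import numpy as np
from scipy.stats import binom

def bernstein(f, n, xs):
    """Evaluate the Bernstein polynomial B_{n,f} at each point of xs."""
    r = np.arange(n + 1)
    # The weights C(n,r) x^r (1-x)^(n-r) are exactly the Binomial(n, x) pmf,
    # mirroring the identity B_{n,f}(x) = E f(p_hat_n) from the proof.
    return np.array([np.dot(f(r / n), binom.pmf(r, n, x)) for x in xs])

def f(x):
    return np.abs(x - 0.5)  # continuous on [0, 1], with a kink at 1/2

xs = np.linspace(0.0, 1.0, 501)

for n in [10, 100, 1000]:
    err = np.max(np.abs(f(xs) - bernstein(f, n, xs)))
    print(f"n = {n:5d}:  sup-norm error ~ {err:.4f}")
```

Writing the weights as the Binomial($n, x$) pmf is just the identity $B_{n,f}(x) = Ef(\hat{p}_n)$ used in the proof, restated in code.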