According to the Law of Large Numbers, the sample mean \(\bar X\) of a large number of i.i.d. (independent, identically distributed) random variables is, with high probability, very close to the mean of the distribution from which they were drawn.

In other words, if we have a sequence of random variables \(X_1, X_2, \ldots, X_n\) with finite variance \(σ^2\), then the sample mean converges in probability to the mean \(μ\). (The weak law also holds when the variance is infinite, as long as the mean exists, but that requires a different proof.)

**Proof:** Let's say we have a sequence of i.i.d. random variables with variance \(Var(X_i) = σ^2\). The sample mean then has variance \(Var(\bar X) = σ^2/n\), so applying Chebyshev's inequality to \(\bar X\) we get:

$$P(|\bar X - μ| \ge ϵ) \le {σ^2 \over {nϵ^2}}$$

$$P(|\bar X - μ| < ϵ) \ge 1 - {σ^2 \over {nϵ^2}}, ~~ϵ > 0$$
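As a quick numerical sanity check (a sketch using NumPy; the choice of standard normal samples and the specific \(n\), \(ϵ\) values are illustrative assumptions, not from the text), we can compare the empirical frequency of \(|\bar X - μ| \ge ϵ\) against the Chebyshev bound \(σ^2/(nϵ^2)\):

```python
import numpy as np

rng = np.random.default_rng(0)

n, eps, trials = 100, 0.5, 20_000
sigma2 = 1.0  # variance of the standard normal, so mu = 0

# Draw many independent samples of size n and take each sample's mean
sample_means = rng.normal(loc=0.0, scale=1.0, size=(trials, n)).mean(axis=1)

# Empirical estimate of P(|X-bar - mu| >= eps)
p_emp = np.mean(np.abs(sample_means) >= eps)

# Chebyshev upper bound: sigma^2 / (n * eps^2)
bound = sigma2 / (n * eps**2)

print(f"empirical: {p_emp:.6f}, Chebyshev bound: {bound:.6f}")
```

The empirical probability comes out far below the bound here, which is expected: Chebyshev's inequality is distribution-free and therefore often very loose for well-behaved distributions like the normal.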

Hence, as \(n\) tends to infinity, the right-hand side of the inequality tends to 1, and:

$$\lim_{n\rightarrow\infty} P(|\bar X - μ| < ϵ) = 1$$

This is the same as saying:

$$P(|\bar X - μ| \ge ϵ) \rightarrow 0, ~~n \rightarrow \infty$$
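This convergence can be watched directly in simulation (a sketch with NumPy; the Uniform(0, 1) distribution, sample sizes, and trial count are illustrative assumptions): as \(n\) grows, the estimated probability that \(\bar X\) misses \(μ\) by at least \(ϵ\) shrinks toward 0.

```python
import numpy as np

rng = np.random.default_rng(1)

eps = 0.1
mu = 0.5  # mean of Uniform(0, 1)
trials = 5_000

# Estimate P(|X-bar - mu| >= eps) for growing n
probs = {}
for n in (10, 100, 1000):
    means = rng.uniform(0.0, 1.0, size=(trials, n)).mean(axis=1)
    probs[n] = np.mean(np.abs(means - mu) >= eps)
    print(f"n = {n:>5}: P(|X-bar - mu| >= {eps}) ~ {probs[n]:.4f}")
```

Each tenfold increase in \(n\) drives the deviation probability sharply down, consistent with the \(σ^2/(nϵ^2)\) bound from the proof.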