A.2 The Central Limit Theorem

A.2.1 Theorem

Let $ X_1,X_2,\ldots,X_N$ be a set of $ N$ independent random variates, where each $ X_i$ follows an arbitrary probability distribution $ P(x_i)$ with mean $ \mu_i$ and finite variance $ \sigma_i^2$.
Two variates $ A$ and $ B$ are statistically independent if the conditional probability $ P(A\vert B)=\frac{P(A \cap B)}{P(B)}$ of $ A$ given $ B$ (the probability of event $ A$ under the assumption that $ B$ has occurred) satisfies

$\displaystyle P(A\vert B)=P(A)$ (A.15)

in which case the probability of A and B is just

$\displaystyle P(A B) = P(A \cap B)=P(A) P(B)$ (A.16)

Similarly, $ n$ events $ A_1,A_2,\ldots,A_n$ are independent if

$\displaystyle P\left(\bigcap_{i=1}^n A_i\right) = \prod^n_{i=1} P(A_i)$ (A.17)
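
The product rule (A.16)-(A.17) can be checked exactly by enumeration. The following minimal sketch uses two fair dice as a hypothetical example (the dice and the events $ A$, $ B$ are illustrative assumptions, not part of the text):

```python
# Exact check of the product rule for independent events, using two fair
# six-sided dice: A = "first die is even", B = "second die shows at least 5".
# Enumerating all 36 equally likely outcomes avoids any sampling error.
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))   # all (d1, d2) pairs

def prob(event):
    # exact probability of an event over the uniform sample space
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

A = lambda o: o[0] % 2 == 0        # first die even  -> P(A) = 1/2
B = lambda o: o[1] >= 5            # second die >= 5 -> P(B) = 1/3
p_joint = prob(lambda o: A(o) and B(o))
print(p_joint, prob(A) * prob(B))  # both 1/6: P(A ∩ B) = P(A) P(B)
```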

Then the normal form variate

$\displaystyle X_{norm}=\frac{\sum^N_{i=1} x_i - \sum^N_{i=1} \mu_i}{\sqrt{\sum^N_{i=1} \sigma_i^2}}$ (A.18)

has a cumulative distribution function that approaches a normal distribution as $ N \rightarrow \infty$.
Under additional conditions on the distributions of the variates, the probability density itself is also normal, with mean $ \mu = 0$ and variance $ \sigma^2 = 1$. If conversion to normal form is not performed, then the variate

$\displaystyle X \equiv \frac{1}{N} \sum^N_{i=1} x_i$ (A.19)

is normally distributed with $ \mu_X = \mu_x$ and $ \sigma_X =
\frac{\sigma_x}{\sqrt{N}}$.
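
A minimal Monte Carlo sketch of this statement (the exponential distribution, seed, and sample sizes are illustrative assumptions): averaging $ N$ decidedly non-Gaussian variates reproduces $ \mu_X = \mu_x$ and $ \sigma_X = \frac{\sigma_x}{\sqrt{N}}$.

```python
# Sample many averages X = (1/N) * sum(x_i) of N exponential variates
# (Exp(1) has mean 1 and standard deviation 1) and compare the empirical
# mean and spread of X with mu_x and sigma_x / sqrt(N).
import math
import random

random.seed(0)
N = 50          # variates per average
trials = 20000  # number of averages sampled
mu_x = sigma_x = 1.0

means = [sum(random.expovariate(1.0) for _ in range(N)) / N
         for _ in range(trials)]
mu_X = sum(means) / trials
sigma_X = math.sqrt(sum((m - mu_X) ** 2 for m in means) / trials)
print(mu_X, sigma_X, sigma_x / math.sqrt(N))  # sigma_X close to 1/sqrt(50)
```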

A.2.2 Proof

Consider the inverse FOURIER transform of $ P_X(f)$.

$\displaystyle {\cal F}^{-1}_f[P_X(f)](x) \equiv \int_{-\infty}^{\infty} e^{2 \pi \imath f X} P(X) dX$    
$\displaystyle = \int_{-\infty}^{\infty}\sum_{n=0}^{\infty}\frac{(2 \pi \imath f X)^n}{n!} P(X) dX$    
$\displaystyle = \sum_{n=0}^{\infty} \frac{(2 \pi \imath f)^n}{n!}\int_{-\infty}^{\infty}X^n P(X) dX$    
$\displaystyle = \sum_{n=0}^{\infty}\frac{(2 \pi \imath f)^n}{n!}\langle X^n \rangle.$ (A.20)

Now write

$\displaystyle \langle X^n \rangle = \langle N^{-n} ( x_1 + x_2 + \ldots + x_N )^n \rangle =$ (A.21)
$\displaystyle \int_{-\infty}^{\infty} N^{-n} (x_1 + \ldots + x_N )^n P(x_1) \cdots P(x_N) dx_1 \cdots dx_N,$ (A.22)

so we have

$\displaystyle {\cal F}^{-1}_f[P_X(f)](x) = \sum_{n=0}^\infty \frac{(2 \pi \imath f)^n}{n!} \langle X^n \rangle$    
$\displaystyle = \sum_{n=0}^\infty \frac{(2 \pi \imath f)^n}{n!} \int_{-\infty}^{\infty} N^{-n} (x_1 + \ldots + x_N)^n P(x_1) \cdots P(x_N) dx_1 \cdots dx_N$    
$\displaystyle = \int_{-\infty}^{\infty} \sum_{n=0}^{\infty}\left[\frac{2 \pi \imath f (x_1 + \ldots + x_N)}{N}\right]^n \frac{1}{n!} P(x_1) \cdots P(x_N) dx_1 \cdots dx_N$    
$\displaystyle = \int_{-\infty}^{\infty} e^{\frac{2 \pi \imath f(x_1 + \cdots + x_N)}{N}}P(x_1) \cdots P(x_N) dx_1 \cdots dx_N$    
$\displaystyle = \left[\int_{-\infty}^{\infty} e^{\frac{2 \pi \imath f x_1}{N}} P(x_1) dx_1\right] \cdots \left[\int_{-\infty}^{\infty} e^{\frac{2 \pi \imath f x_N}{N}} P(x_N) dx_N\right]$    
$\displaystyle = \left[\int_{-\infty}^{\infty} e^{\frac{2 \pi \imath f x}{N}} P(x) dx\right]^N$    
$\displaystyle = \left\{ \int_{-\infty}^{\infty}\left[1+\frac{2 \pi \imath f}{N} x + \frac{1}{2}\left(\frac{2 \pi \imath f}{N}\right)^2 x^2 + \ldots \right] P(x) dx \right\}^N$    
$\displaystyle = \left[1+\frac{2 \pi \imath f}{N} \langle x \rangle - \frac{(2 \pi f)^2}{2 N^2} \langle x^2 \rangle + {\cal O}(N^{-3})\right]^N$    
$\displaystyle = e^{\textstyle N \ln{\left[1+\frac{2 \pi \imath f}{N} \langle x \rangle - \frac{(2 \pi f)^2}{2 N^2} \langle x^2 \rangle + {\cal O}(N^{-3})\right]}}.$ (A.23)
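
The factorization and expansion above can be checked numerically for a concrete case. Assuming variates uniform on $ [0,1]$ (an illustrative choice, with $ \mu_x = \frac{1}{2}$ and $ \sigma_x^2 = \frac{1}{12}$), the exact $ N$-fold product $ \left[\int e^{2 \pi \imath f x/N} P(x) dx\right]^N$ approaches the Gaussian form obtained in (A.25) as $ N$ grows:

```python
# For x uniform on [0, 1], the integral over one variate has the closed form
# (e^z - 1)/z with z = 2*pi*i*f/N; raise it to the N-th power and compare
# with the limiting Gaussian characteristic function.
import cmath
import math

def exact(f, N):
    z = 2j * math.pi * f / N
    return ((cmath.exp(z) - 1) / z) ** N      # [∫₀¹ e^{2πifx/N} dx]^N

def gaussian(f, N, mu=0.5, var=1.0 / 12.0):
    # e^{2πif μ_x − (2πf)² σ_x² / (2N)}
    return cmath.exp(2j * math.pi * f * mu
                     - (2 * math.pi * f) ** 2 * var / (2 * N))

err_small_N = abs(exact(1.0, 20) - gaussian(1.0, 20))
err_large_N = abs(exact(1.0, 200) - gaussian(1.0, 200))
print(err_small_N, err_large_N)   # discrepancy shrinks as N grows
```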

Now expand

$\displaystyle \ln{(1+x)} = x - \frac{1}{2} x^2 + \frac{1}{3} x^3 - \ldots,$ (A.24)

so

$\displaystyle {\cal F}^{-1}_f[P_X(f)](x) \approx e^{\textstyle N \left[\frac{2 \pi \imath f}{N} \langle x \rangle - \frac{(2 \pi f)^2}{2 N^2} \langle x^2 \rangle - \frac{1}{2}\frac{(2 \pi \imath f)^2}{N^2} \langle x \rangle^2+{\cal O}(N^{-3})\right]}$    
$\displaystyle = e^{\textstyle 2 \pi \imath f \langle x \rangle - \frac{(2 \pi f)^2(\langle x^2 \rangle-\langle x \rangle^2)}{2 N} + {\cal{O}}(N^{-2})}$    
$\displaystyle \approx e^{\textstyle 2 \pi \imath f \mu_x - \frac{(2 \pi f)^2 \sigma_x^2}{2 N}},$ (A.25)

since

$\displaystyle \mu_x \equiv \langle x \rangle$ (A.26)
$\displaystyle \sigma_x^2 \equiv \langle x^2 \rangle-\langle x \rangle^2.$ (A.27)

Taking the FOURIER transform yields

$\displaystyle P_X \equiv \int_{-\infty}^{\infty} e^{-2 \pi \imath f x} {\cal F}^{-1}\left[P_X(f)\right] df$    
$\displaystyle = \int_{-\infty}^{\infty} e^{2 \pi \imath f(\mu_x-x)-(2 \pi f )^2 \frac{\sigma_x^2}{2N}} df.$ (A.28)

This is of the form

$\displaystyle \int_{-\infty}^{\infty}e^{\imath a f - b f^2} df,$ (A.29)

where $ a \equiv 2 \pi (\mu_x - x)$ and $ b \equiv \frac{(2 \pi
\sigma_x)^2}{2N}$. This integral yields

$\displaystyle \int_{-\infty}^{\infty} e^{\imath a f - b f^2} df = e^{-\frac{a^2}{4b}} \sqrt{\frac{\pi}{b}}.$ (A.30)
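
Equation (A.30) can be verified by direct quadrature; the sample values of $ a$ and $ b$ below are arbitrary assumptions chosen only for illustration:

```python
# Compare trapezoidal quadrature of ∫ e^{iaf − bf²} df with the closed form
# e^{−a²/(4b)} √(π/b). The integrand decays like a Gaussian, so truncating
# the integration range at ±10 units loses only a negligible tail.
import cmath
import math

def lhs(a, b, half_width=10.0, steps=20_000):
    h = 2 * half_width / steps
    fs = [-half_width + k * h for k in range(steps + 1)]
    vals = [cmath.exp(1j * a * f - b * f * f) for f in fs]
    # composite trapezoidal rule
    return h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

a, b = 1.5, 2.0   # arbitrary sample parameters
rhs = cmath.exp(-a * a / (4 * b)) * math.sqrt(math.pi / b)
print(abs(lhs(a, b) - rhs))   # close to zero
```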

Therefore

$\displaystyle P_X = \sqrt{\frac{\pi}{\frac{(2 \pi \sigma_x)^2}{2N}}} e^{\textstyle \frac{-[2 \pi (\mu_x - x)]^2}{4 \frac{(2 \pi \sigma_x)^2}{2N}}}$    
$\displaystyle = \sqrt{\frac{2 \pi N}{4 \pi^2 \sigma_x^2}} e^{\textstyle -\frac{4 \pi^2 (\mu_x-x)^2 2 N}{4 \cdot 4 \pi^2 \sigma_x^2}}$    
$\displaystyle = \frac{\sqrt{N}}{\sigma_x\sqrt{2 \pi}} e^{\textstyle -\frac{(\mu_x - x)^2 N}{2 \sigma_x^2}}.$ (A.31)

But $ \sigma_X=\frac{\textstyle \sigma_x}{\sqrt{N}}$ and $ \mu_X = \mu_x$, so

$\displaystyle P_X=\frac{1}{\sigma_X \sqrt{2 \pi}} e^{\textstyle -\frac{(\mu_X - x)^2}{2 \sigma_X^2}}.$ (A.32)
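
As a sanity check of (A.32), the resulting density integrates to one and has mean $ \mu_X$; the numeric values of $ \mu_X$ and $ \sigma_X$ below are arbitrary illustrative choices:

```python
# Integrate the density of (A.32) numerically and recover its normalization
# and mean; mu_X = 0.8 and sigma_X = 0.2 are arbitrary sample values.
import math

mu_X, sigma_X = 0.8, 0.2

def P_X(x):
    return (math.exp(-(mu_X - x) ** 2 / (2 * sigma_X ** 2))
            / (sigma_X * math.sqrt(2 * math.pi)))

# trapezoidal quadrature over ±10 standard deviations (tails are negligible)
lo, hi, steps = mu_X - 10 * sigma_X, mu_X + 10 * sigma_X, 100_000
h = (hi - lo) / steps
xs = [lo + k * h for k in range(steps + 1)]
total = h * (sum(P_X(x) for x in xs) - 0.5 * (P_X(lo) + P_X(hi)))
mean = h * sum(x * P_X(x) for x in xs)
print(total, mean)   # ~1.0 and ~0.8
```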

The ``fuzzy'' central limit theorem states that data influenced by many small, unrelated random effects are approximately normally distributed.


R. Minixhofer: Integrating Technology Simulation into the Semiconductor Manufacturing Environment