

B8. F-Distribution

The F-Distribution or Fisher Distribution is named after its inventor R. A. Fisher [43]. If $\vec{x_1} \in \mathbb{R}^{n_1}$ and $\vec{x_2} \in \mathbb{R}^{n_2}$ are samples of normally distributed random variables with arbitrary means $\mu$ and variances $\sigma^2$, and $s^2$ is the unbiased estimator of the variance $\sigma^2$

\begin{displaymath}
s^2 = \frac{1}{n - 1} \sum_{i=1}^{n} \left ( x_i - \hat{x} \right )^2
\end{displaymath} (B19)

where $\hat{x}$ denotes the sample mean, and $\chi^2$ is the sum of the squared standardized deviations

\begin{displaymath}
\chi^2 = \sum_{i=1}^{n} \left ( \frac{x_i - \mu}{\sigma} \right )^2
\end{displaymath} (B20)

then the ratio F

\begin{displaymath}
F = \frac{\frac{s_1^2}{\sigma_1^2}}{\frac{s_2^2}{\sigma_2^2}} = \frac{\frac{\chi_1^2}{n_1}}{\frac{\chi_2^2}{n_2}}
\end{displaymath} (B21)

is described by the F-Distribution. The probability density of the F-Distribution is

\begin{displaymath}
p(F,n_1,n_2) = \frac{\Gamma \left ( \frac{n_1 + n_2}{2} \right )}{\Gamma \left ( \frac{n_1}{2} \right ) \Gamma \left ( \frac{n_2}{2} \right )} \; n_1^{\frac{n_1}{2}} \, n_2^{\frac{n_2}{2}} \, \frac{F^{\frac{n_1 - 2}{2}}}{\left ( n_2 + n_1 F \right )^{\frac{n_1 + n_2}{2}}}
\end{displaymath} (B22)

where $\Gamma(x)$ is Euler's Gamma function


\begin{displaymath}
\Gamma(x) = \int_0^\infty t^{x - 1} e^{-t} \ dt
.
\end{displaymath} (B23)
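
To make (B.22) and (B.23) concrete, the following Python sketch evaluates the density directly through Euler's Gamma function (available as math.gamma) and, where SciPy is installed, cross-checks the result against scipy.stats.f.pdf, which parametrizes the F-Distribution by the same two degrees of freedom. The function name f_pdf and the numerical values of $F$, $n_1$, and $n_2$ are illustrative choices, not part of the original text.

\begin{verbatim}
import math

def f_pdf(F, n1, n2):
    """Density of the F-Distribution, Eq. (B.22), using Euler's
    Gamma function, Eq. (B.23), via math.gamma."""
    if F <= 0.0:
        return 0.0
    norm = math.gamma((n1 + n2) / 2.0) / (
        math.gamma(n1 / 2.0) * math.gamma(n2 / 2.0))
    return (norm
            * n1 ** (n1 / 2.0) * n2 ** (n2 / 2.0)
            * F ** ((n1 - 2.0) / 2.0)
            / (n2 + n1 * F) ** ((n1 + n2) / 2.0))

if __name__ == "__main__":
    n1, n2 = 5, 8                 # degrees of freedom chosen for the example
    for F in (0.5, 1.0, 2.5):
        print(F, f_pdf(F, n1, n2))

    # Optional cross-check against SciPy, if available.
    try:
        from scipy.stats import f as fisher_f
        for F in (0.5, 1.0, 2.5):
            assert abs(f_pdf(F, n1, n2) - fisher_f.pdf(F, n1, n2)) < 1e-9
    except ImportError:
        pass
\end{verbatim}
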

The value $p(F,n_1,n_2)$ calculated from (B.22) is the probability density of observing the ratio $F$ for two samples with $n_1$ and $n_2$ data points, respectively.
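
As a complementary sketch (with the same caveats: variable names and sample sizes are illustrative), the ratio $F$ of (B.21) can be formed from two simulated normally distributed samples using the unbiased variance estimator of (B.19); the true variances are known here only because the data are generated synthetically.

\begin{verbatim}
import random

def unbiased_variance(xs):
    """Unbiased variance estimator s^2 of Eq. (B.19)."""
    n = len(xs)
    mean = sum(xs) / n
    return sum((x - mean) ** 2 for x in xs) / (n - 1)

if __name__ == "__main__":
    random.seed(0)
    n1, n2 = 12, 20              # sample sizes chosen for the example
    sigma1, sigma2 = 2.0, 3.0    # true standard deviations of the simulated data
    x1 = [random.gauss(1.0, sigma1) for _ in range(n1)]
    x2 = [random.gauss(-0.5, sigma2) for _ in range(n2)]

    # Ratio F of Eq. (B.21): variance estimates scaled by the true variances.
    F = (unbiased_variance(x1) / sigma1 ** 2) / (unbiased_variance(x2) / sigma2 ** 2)
    print("F =", F)
\end{verbatim}

Strictly speaking, a variance estimate computed with the sample mean as in (B.19) carries $n - 1$ degrees of freedom, whereas the $\chi^2$ of (B.20), which uses the known mean $\mu$, carries $n$; the distinction becomes negligible for large samples.
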

