\lecture { 9} { } { Percolation, Introduction to characteristic functions}
\subsubsection { Application: Percolation}
We will now discuss another application of \yaref { kolmogorov01} , percolation.
\begin { definition} [\vocab { Percolation} ]
Consider the graph with vertex set $\Z^d$, $d \ge 2$, where each edge of the lattice is added independently with probability $p$. The added edges are called \vocab[Percolation!Edge!open]{open};
all other edges are called
\vocab [Percolation!Edge!closed] { closed} .
More formally, we consider
\begin { itemize}
\item $ \Omega = \{ 0 , 1 \} ^ { \bE _ d } $ , where $ \bE _ d $ are all edges in $ \Z ^ d $ ,
\item $ \cF \coloneqq \text { product $ \sigma $ - algebra } $ ,
\item $\bP \coloneqq \left( p \underbrace{\delta_{\{1\}}}_{\text{edge is open}} + (1-p) \underbrace{\delta_{\{0\}}}_{\text{edge is closed}} \right)^{\otimes \bE_d}$.
\end { itemize}
\end { definition}
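In other words, under $\bP$ the edge states $(\omega_e)_{e \in \bE_d}$ are i.i.d.~with
\[
\bP(\omega_e = 1) = p \quad \text{and} \quad \bP(\omega_e = 0) = 1 - p,
\]
i.e.~each edge is open with probability $p$ and closed with probability $1-p$, independently of all other edges.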
\begin { question}
What is the probability that there exists
an infinite path starting at the origin (one that never revisits a point)?
\end { question}
\begin { definition}
An \vocab { infinite path} consists of an infinite sequence of distinct points
$ x _ 0 , x _ 1 , \ldots $
such that $ x _ n $ is connected to $ x _ { n + 1 } $ , i.e.~the edge $ \{ x _ n, x _ { n + 1 } \} $ is open.
\end { definition}
Let $ C _ \infty \coloneqq \{ \omega | \text { an infinite path exists } \} $ .
\begin { exercise}
Show that changing the presence / absence of finitely many edges
does not change the existence of an infinite path.
Conclude that $C_\infty$ is an element of the tail $\sigma$-algebra,
and hence $\bP(C_\infty) \in \{0,1\}$ by \yaref{kolmogorov01}.
\end { exercise}
Obviously, $\bP(C_\infty)$ is monotonically increasing in $p$.
For $d = 2$ it is known that $p = \frac{1}{2}$ is the critical value.
For $d > 2$ the exact critical value is not known.
% TODO: more in the notes
We'll get back to percolation later.
\section { Characteristic Functions, Weak Convergence and the Central Limit Theorem}
% Characteristic functions are also known as the \vocab{Fourier transform}.
%Weak convergence is also known as \vocab{convergence in distribution} / \vocab{convergence in law}.
So far we have dealt with the average behaviour,
\[
\frac { \overbrace { X_ 1 + \ldots + X_ n} ^ { \text { i.i.d.} } } { n} \to \bE (X_ 1).
\]
We now want to understand fluctuations from the average behaviour,
i.e.\[
X_ 1 + \ldots + X_ n - n \cdot \bE (X_ 1).
\]
% TODO improve
The question is: what happens on scales other than $n$?
An example is
\begin { equation}
\frac { X_ 1 + \ldots + X_ n - n \bE (X_ 1)} { \sqrt { n} }
\xrightarrow{n \to \infty} \cN(0, \Var(X_1))
\label { eqn:lec09ast}
\end { equation}
Why is $ \sqrt { n } $ the right order?
A handwavy argument:
Suppose $ X _ 1 , X _ 2 , \ldots $ are i.i.d.~with $ X _ 1 \sim \cN ( 0 , 1 ) $ .
The mean of the l.h.s.~is $ 0 $ and for the variance we get
\begin { IEEEeqnarray*} { rCl}
\Var\left( \frac{X_1 + \ldots + X_n - n \bE(X_1)}{\sqrt{n}} \right)
& =& \Var \left ( \frac { X_ 1+ \ldots + X_ n} { \sqrt { n} } \right )\\
& =& \frac { 1} { n} \left ( \Var (X_ 1) + \ldots + \Var (X_ n) \right ) = 1
\end { IEEEeqnarray*}
For the r.h.s.~we get a mean of $ 0 $ and a variance of $ 1 $ .
So, whatever precise meaning we give to \eqref{eqn:lec09ast},
the variances on both sides agree,
which indicates that $\sqrt{n}$ is the right scaling.
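For instance, if the $X_i$ are i.i.d.~fair coin flips,
i.e.~$\bP(X_i = 1) = \bP(X_i = 0) = \frac{1}{2}$,
then $\bE(X_1) = \frac{1}{2}$, $\Var(X_1) = \frac{1}{4}$,
and \eqref{eqn:lec09ast} reads
\[
\frac{X_1 + \ldots + X_n - \frac{n}{2}}{\sqrt{n}} \xrightarrow{n \to \infty} \cN\left(0, \tfrac{1}{4}\right).
\]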
To make \eqref { eqn:lec09ast} precise,
we need another notion of convergence.
This will be the weakest notion of convergence, hence it is called
\vocab { weak convergence} .
This notion of convergence will be defined in terms of
characteristic functions, i.e.~Fourier transforms.
\subsection { Convolutions$ { } ^ \dagger $ }
\begin { definition} +[Convolution]
Let $ \mu $ and $ \nu $ be probability measures on $ \R ^ d $ .
Then the \vocab { convolution} of $ \mu $ and $ \nu $ ,
$ \mu \ast \nu $ ,
is the probability measure on $ \R ^ d $
defined by
\[
(\mu \ast \nu )(A) = \int _ { \R ^ d} \int _ { \R ^ d} \One _ A(x + y) \mu (\dif x) \nu (\dif y).
\]
\end { definition}
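For example, $\delta_0$ is a neutral element for convolution:
for every probability measure $\mu$ on $\R^d$,
\[
(\mu \ast \delta_0)(A) = \int_{\R^d} \int_{\R^d} \One_A(x + y) \mu(\dif x) \delta_0(\dif y)
= \int_{\R^d} \One_A(x) \mu(\dif x) = \mu(A).
\]
Similarly, $\delta_a \ast \delta_b = \delta_{a+b}$ for all $a, b \in \R^d$.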
\begin { fact}
If $ \mu $ and $ \nu $ have Lebesgue densities $ f _ \mu $ and $ f _ \nu $ ,
then the convolution has Lebesgue density
\[
f_ { \mu \ast \nu } (x) =
\int _ { \R ^ d} f_ \mu (x - t) f_ \nu (t) \lambda ^ d(\dif t).
\]
\end { fact}
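As an illustration, let $\mu = \nu = \cN(0,1)$ with density $f(u) = \frac{1}{\sqrt{2\pi}} e^{-u^2/2}$.
Completing the square in the exponent gives
\[
f_{\mu \ast \nu}(x)
= \frac{1}{2\pi} \int_{\R} e^{-\frac{(x-t)^2 + t^2}{2}} \dif t
= \frac{1}{2\pi} e^{-\frac{x^2}{4}} \int_{\R} e^{-\left(t - \frac{x}{2}\right)^2} \dif t
= \frac{1}{\sqrt{4\pi}} e^{-\frac{x^2}{4}},
\]
which is the density of $\cN(0,2)$, i.e.~$\cN(0,1) \ast \cN(0,1) = \cN(0,2)$.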
\begin { fact} +[Exercise 6.1]
If $ X _ 1 ,X _ 2 , \ldots $ are independent with
distributions $ X _ 1 \sim \mu _ 1 $ ,
$ X _ 2 \sim \mu _ 2 , \ldots $ ,
then $ X _ 1 + \ldots + X _ n $
has distribution
\[
\mu _ 1 \ast \mu _ 2 \ast \ldots \ast \mu _ n.
\]
\end { fact}
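In particular, combining this with the computation above:
if $X_1, X_2$ are independent with $X_1, X_2 \sim \cN(0,1)$,
then $X_1 + X_2 \sim \cN(0,1) \ast \cN(0,1) = \cN(0,2)$.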
\todo { TODO}
\subsection { Characteristic Functions and Fourier Transform}
\begin { definition}
\label { def:characteristicfunction}
Consider $ ( \R , \cB ( \R ) , \bP ) $ .
The \vocab { characteristic function} of $ \bP $ is defined as
\begin { IEEEeqnarray*} { rCl}
\phi _ { \bP } : \R & \longrightarrow & \C \\
t & \longmapsto & \int _ { \R } e^ { \i t x} \bP (\dif x).
\end { IEEEeqnarray*}
\end { definition}
\begin { abuse}
$ \phi _ \bP ( t ) $ will often be abbreviated as $ \phi ( t ) $ .
\end { abuse}
We have
\[
\phi (t) = \int _ { \R } \cos (tx) \bP (dx) + \i \int _ { \R } \sin (tx) \bP (dx).
\]
\begin { itemize}
\item Since $ |e ^ { \i t x } | \le 1 $ the function $ \phi ( \cdot ) $ is always defined.
\item We have $ \phi ( 0 ) = 1 $ .
\item $ | \phi ( t ) | \le \int _ { \R } |e ^ { \i t x } | \bP ( dx ) = 1 $ .
\end { itemize}
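Two standard examples: for $\bP = \delta_a$ we get $\phi(t) = e^{\i t a}$,
and for $\bP = \cN(0,1)$ one can show
(e.g.~by checking that $\phi$ solves the ODE $\phi'(t) = -t\phi(t)$ with $\phi(0) = 1$)
that
\[
\phi(t) = \int_{\R} e^{\i t x} \frac{1}{\sqrt{2\pi}} e^{-\frac{x^2}{2}} \dif x = e^{-\frac{t^2}{2}}.
\]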
\begin { fact} +
Let $ X $ , $ Y $ be independent random variables
and $ a,b \in \R $ .
Then
\begin { itemize}
\item $\phi_{aX + b}(t) = e^{\i t b} \phi_X(at)$,
\item $ \phi _ { X + Y } ( t ) = \phi _ X ( t ) \cdot \phi _ Y ( t ) $ .
\end { itemize}
\end { fact}
\begin { proof}
We have
\begin { IEEEeqnarray*} { rCl}
\phi _ { a X + b} (t) & =& \bE [e^{\i t (aX + b)}] \\
& =& e^ { \i t b} \bE [e^{\i t a X}] \\
& =& e^{\i t b} \phi_X(at).
\end { IEEEeqnarray*}
Furthermore
\begin { IEEEeqnarray*} { rCl}
\phi _ { X + Y} (t) & =& \bE [e^{\i t (X + Y)}] \\
& =& \bE[e^{\i t X}] \bE[e^{\i t Y}] \qquad \text{(by independence)} \\
& =& \phi _ X(t) \phi _ Y(t).
\end { IEEEeqnarray*}
\end { proof}
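As an application of these rules (together with the uniqueness theorem below):
if $X \sim \cN(0, \sigma_1^2)$ and $Y \sim \cN(0, \sigma_2^2)$ are independent,
then, by the Gaussian example above and the scaling rule,
$\phi_X(t) = e^{-\frac{\sigma_1^2 t^2}{2}}$ and $\phi_Y(t) = e^{-\frac{\sigma_2^2 t^2}{2}}$, so
\[
\phi_{X+Y}(t) = \phi_X(t)\phi_Y(t) = e^{-\frac{(\sigma_1^2 + \sigma_2^2) t^2}{2}},
\]
which is the characteristic function of $\cN(0, \sigma_1^2 + \sigma_2^2)$;
hence $X + Y \sim \cN(0, \sigma_1^2 + \sigma_2^2)$.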
\begin { remark}
Suppose $ ( \Omega , \cF , \bP ) $ is an arbitrary probability space and
$ X: ( \Omega , \cF ) \to ( \R , \cB ( \R ) ) $ is a random variable.
Then we can define
\[
\phi _ X(t) \coloneqq \bE [e^{\i t X}]
= \int e^ { \i t X(\omega )} \bP (\dif \omega )
= \int _ { \R } e^ { \i t x} \mu (dx) = \phi _ \mu (t),
\]
where $ \mu = \bP \circ X ^ { - 1 } $ .
\end { remark}
\begin { theorem} [Inversion formula] % thm1
\yalabel { Inversion Formula} { Inversion Formula} { inversionformula}
Let $(\R, \cB(\R), \bP)$ be a probability space.
Let $ F $ be the distribution function of $ \bP $
(i.e.~$ F ( x ) = \bP ( ( - \infty , x ] ) $ for all $ x \in \R $ ).
Then for every $ a < b $ we have
\begin { eqnarray}
\frac { F(b) + F(b-)} { 2} - \frac { F(a) + F(a-)} { 2} = \lim _ { T \to \infty } \frac { 1} { 2 \pi } \int _ { -T} ^ T \frac { e^ { -\i t b} - e^ { - \i t a} } { - \i t} \phi (t) dt
\label { invf}
\end { eqnarray}
where $ F ( b - ) $ is the left limit.
\end { theorem}
% TODO!
We will prove this later.
\begin { theorem} [Uniqueness theorem] % thm2
\yalabel { Uniqueness Theorem} { Uniqueness} { charfuncuniqueness}
Let $ \bP $ and $ \Q $ be two probability measures on $ ( \R , \cB ( \R ) ) $ .
Then $ \phi _ \bP = \phi _ \Q \implies \bP = \Q $ .
Therefore, probability measures are uniquely determined by their characteristic functions.
Moreover, \eqref{invf} shows how to recover $\bP$ (via its distribution function $F$)
from $\phi$.
\end { theorem}
\begin { refproof} { charfuncuniqueness}
Assume that we have already shown the \yaref { inversionformula} .
Suppose that $ F $ and $ G $ are the distribution functions of $ \bP $ and $ \Q $ .
Let $ a,b \in \R $ with $ a < b $ .
Assume that $ a $ and $ b $ are continuity points of both $ F $ and $ G $ .
By the \yaref{inversionformula}, applied to $\bP$ and $\Q$ and using $\phi_\bP = \phi_\Q$
(note that $F(a-) = F(a)$, $F(b-) = F(b)$ and similarly for $G$ at continuity points), we have
\begin{equation}
F(b) - F(a) = G(b) - G(a). \label{eq:charfuncuniquefg}
\end{equation}
Since $F$ and $G$ are monotonic, they have at most countably many points of discontinuity,
so \yaref{eq:charfuncuniquefg} holds for all $a < b$ outside a countable set.
Take $ a _ n $ outside this countable set, such that $ a _ n \ssearrow - \infty $ .
Then \yaref{eq:charfuncuniquefg} gives
$F(b) - F(a_n) = G(b) - G(a_n)$;
letting $n \to \infty$ and using $F(a_n) \to 0$, $G(a_n) \to 0$, we obtain $F(b) = G(b)$.
Since $ F $ and $ G $ are right-continuous, it follows that $ F = G $ .
\end { refproof}