\lecture{16}{2023-06-13}{}

% \subsection{Conditional expectation}

\begin{refproof}{ceroleofindependence}
Let $\cH$ be independent of $\sigma(\sigma(X), \cG)$.
Then for every $H \in \cH$, the indicator $\One_H$ is independent
of every random variable that is measurable with respect to $\sigma(\sigma(X), \cG)$,
in particular of every random variable measurable with respect to $\sigma(X)$ or $\cG$.

It suffices to consider the case of $X \ge 0$.
Let $G \in \cG$ and $H \in \cH$.
Since $X \One_G$ is $\sigma(\sigma(X), \cG)$-measurable,
$X \One_G$ and $\One_H$ are independent.
Let $Z \coloneqq \bE[X | \cG]$.
Then
\begin{IEEEeqnarray*}{rCl}
\underbrace{\bE[X;G \cap H]}_{\coloneqq \int_{G \cap H} X \dif \bP} &=& \bE[(X \One_G) \One_H]\\
&=& \bE[X \One_G] \bE[\One_H]\\
&=& \bE[Z \One_G] \bP(H)\\
&=& \bE[Z; G \cap H].
\end{IEEEeqnarray*}
Here the second equality uses the independence of $X \One_G$ and $\One_H$,
the third one the defining property of $Z$,
and the last one that $Z \One_G$ is $\cG$-measurable and hence independent of $\One_H$.

The identity above means that the two measures $A \mapsto \bE[X; A]$
and $A \mapsto \bE[Z; A]$
agree on all events of the form $G \cap H$ with $G \in \cG$ and $H \in \cH$.
Since sets of this form constitute a $\pi$-system generating $\sigma(\cG, \cH)$,
these two finite measures must agree on all of $\sigma(\cG, \cH)$.
As $Z$ is in particular $\sigma(\cG, \cH)$-measurable,
the claim of the theorem follows by the uniqueness of conditional expectation.

To deduce the second statement, choose $\cG = \{\emptyset, \Omega\}$,
in which case $\bE[X | \cG] = \bE[X]$.
\end{refproof}

\subsection{The Radon-Nikodym Theorem}

First, let us recall some basic facts:
\begin{fact}
Let $(\Omega, \cF, \mu)$ be a \vocab[Measure space!$\sigma$-finite]{$\sigma$-finite measure space},
i.e.~$\Omega$ can be decomposed into countably many subsets of finite measure.
Let $f: \Omega \to [0, \infty)$ be measurable.
Define $\nu(A) \coloneqq \int_A f \dif \mu$.
Then $\nu$ is also a $\sigma$-finite measure on $(\Omega, \cF)$.\todo{Application of mct}
Moreover, $\nu$ is finite iff $f$ is integrable.
\end{fact}
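
The main point is countable additivity of $\nu$;
here is a brief sketch of the monotone convergence argument alluded to above:
for pairwise disjoint $A_1, A_2, \ldots \in \cF$ we have $\One_{\bigcup_n A_n} = \sum_n \One_{A_n}$, so
\[
\nu\Big(\bigcup_{n \ge 1} A_n\Big)
= \int_\Omega \sum_{n \ge 1} \One_{A_n} f \dif \mu
= \sum_{n \ge 1} \int_\Omega \One_{A_n} f \dif \mu
= \sum_{n \ge 1} \nu(A_n),
\]
where the interchange of sum and integral follows from the monotone convergence theorem
applied to the partial sums.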

Note that in this setting, $\mu(A) = 0$ implies $\nu(A) = 0$.

The Radon-Nikodym theorem provides a converse to this observation:
\begin{theorem}[Radon-Nikodym]
\label{radonnikodym}
Let $\mu$ and $\nu$ be two $\sigma$-finite measures
on $(\Omega, \cF)$.
Suppose
\[
\forall A \in \cF . ~ \mu(A) = 0 \implies \nu(A) = 0.
\]
Then
\begin{enumerate}[(1)]
\item there exists $Z: \Omega \to [0, \infty)$ measurable,
such that
\[\forall A \in \cF . ~ \nu(A) = \int_A Z \dif \mu.\]
\item Such a $Z$ is unique up to equality a.e.~(w.r.t.~$\mu$).
\item $Z$ is integrable w.r.t.~$\mu$ iff $\nu$ is a finite measure.
\end{enumerate}

Such a $Z$ is called the \vocab{Radon-Nikodym derivative}.
\end{theorem}
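
For illustration, consider the following simple instance:
let $\Omega$ be countable, $\cF = 2^{\Omega}$ and let $\mu$ be the counting measure.
Since $\mu(A) = 0$ only for $A = \emptyset$,
every $\sigma$-finite measure $\nu$ on $(\Omega, \cF)$ satisfies the hypothesis, and indeed
\[
\nu(A) = \sum_{\omega \in A} \nu(\{\omega\}) = \int_A Z \dif \mu
\quad \text{for } Z(\omega) \coloneqq \nu(\{\omega\}),
\]
so here the Radon-Nikodym derivative simply records the point masses of $\nu$.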

\begin{definition}
Whenever the property $\forall A \in \cF.~\mu(A) = 0 \implies \nu(A) = 0$
holds for two measures $\mu$ and $\nu$,
we say that $\nu$ is \vocab{absolutely continuous}
w.r.t.~$\mu$.
This is written as $\nu \ll \mu$.
\end{definition}

\begin{definition}+
Two measures $\mu$ and $\nu$ on a measure space $(\Omega, \cF)$
are called \vocab{singular},
denoted $\mu \bot \nu$,
iff there exists $A \in \cF$ such that
\[
\mu(A) = \nu(A^c) = 0.
\]
\end{definition}
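
A standard example for illustration:
on $\R$ with the Borel $\sigma$-algebra,
the Lebesgue measure $\lambda$ and the Dirac measure $\delta_0$ are singular,
since for $A = \{0\}$
\[
\lambda(A) = 0 = \delta_0(A^c).
\]
In particular, $\delta_0$ is not absolutely continuous w.r.t.~$\lambda$,
as $\lambda(\{0\}) = 0$ but $\delta_0(\{0\}) = 1$.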

With \autoref{radonnikodym} we get a very short proof of the existence
of conditional expectation:
\begin{proof}[Second proof of \autoref{conditionalexpectation}]
Let $(\Omega, \cF, \bP)$ be as always, $X \in L^1(\bP)$
and $\cG \subseteq \cF$ a sub-$\sigma$-algebra.
It suffices to consider the case of $X \ge 0$.
For all $G \in \cG$, define $\nu(G) \coloneqq \int_G X \dif \bP$.
Obviously, $\nu \ll \bP$ on $\cG$.
Applying \autoref{radonnikodym} on the measure space $(\Omega, \cG)$
yields a $\cG$-measurable $Z \ge 0$
with $\int_G Z \dif \bP = \int_G X \dif \bP$ for all $G \in \cG$,
which is exactly the defining property of $\bE[X | \cG]$.
\end{proof}

\begin{refproof}{radonnikodym}
We will only sketch the proof.
A full proof can be found in the official notes.

\paragraph{Step 1: Uniqueness} \notes
\paragraph{Step 2: Reduction to the finite measure case}
\notes

\paragraph{Step 3: Getting hold of $Z$}
Assume now that $\mu$ and $\nu$ are two finite measures.
Let
\[
\cC \coloneqq \left\{f: \Omega \to [0,\infty] \middle| %
\forall A \in \cF.~\int_A f \dif \mu \le \nu(A)\right\}.
\]

We have $\cC \neq \emptyset$ since $0 \in \cC$.
The goal is to find a maximal function $Z \in \cC$;
more precisely, one whose integral over $\Omega$ is maximal.
We will use the following observations:

\begin{enumerate}[(a)]
\item If $f,g \in \cC$, then $f \lor g$ (the pointwise maximum)
is also in $\cC$ (a short computation follows after this list).
\item Suppose $\{f_n\}_{n \ge 1}$ is an increasing sequence in $\cC$
and let $f$ be its pointwise limit.
Then $f \in \cC$ (by the monotone convergence theorem).
\item For all $f \in \cC$, we have
\[
\int_\Omega f \dif \mu \le \nu(\Omega) < \infty.
\]
\end{enumerate}
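
For (a), the computation is short (a sketch, using only the definition of $\cC$):
for $A \in \cF$, split $A$ along the event $\{f \ge g\}$ to get
\begin{IEEEeqnarray*}{rCl}
\int_A (f \lor g) \dif \mu
&=& \int_{A \cap \{f \ge g\}} f \dif \mu + \int_{A \cap \{f < g\}} g \dif \mu\\
&\le& \nu(A \cap \{f \ge g\}) + \nu(A \cap \{f < g\})\\
&=& \nu(A).
\end{IEEEeqnarray*}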

Define $\alpha \coloneqq \sup \{ \int f \dif \mu : f \in \cC\} \le \nu(\Omega) < \infty$.
Let $f_n \in \cC$, $n\in \N$, be a sequence
with $\int f_n \dif \mu \to \alpha$.
Define $g_n \coloneqq \max \{f_1,\ldots,f_n\} \in \cC$ (using (a)).
The sequence $(g_n)_n$ is increasing, so by (b) its pointwise limit $Z$
is an element of $\cC$.
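
The construction gives slightly more, namely $\int_\Omega Z \dif \mu = \alpha$.
Indeed, by monotone convergence,
\[
\int_\Omega Z \dif \mu = \lim_{n \to \infty} \int_\Omega g_n \dif \mu
\ge \lim_{n \to \infty} \int_\Omega f_n \dif \mu = \alpha,
\]
while $\int_\Omega Z \dif \mu \le \alpha$ since $Z \in \cC$.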

\paragraph{Step 4: Showing that our choice of $Z$ works}
Define $\lambda(A) \coloneqq \nu(A) - \int_A Z \dif \mu \ge 0$
(nonnegativity holds precisely because $Z \in \cC$).
Then $\lambda$ is a measure.
\begin{claim}
$\lambda = 0$.
\end{claim}
\begin{subproof}
For $k \in \N$, call $G \in \cF$ \emph{good} (for $k$) if the following hold:
\begin{enumerate}[(i)]
\item $\lambda(G) - \frac{1}{k}\mu(G) > 0$.
\item $\forall B \subseteq G, B \in \cF. ~ \lambda(B) - \frac{1}{k}\mu(B) \ge 0$.
\end{enumerate}

Suppose we knew that $\lambda(A) \le \frac{1}{k} \mu(A)$
for all $A \in \cF$ and $k \in \N$.
Then $\lambda(A) = 0$ for every $A$,
since $\mu(A) \le \mu(\Omega) < \infty$ and $k$ is arbitrary;
this is exactly the claim.

So assume the claim does not hold.
Then there must be some $k \in \N$ and $A \in \cF$
such that $\lambda(A) - \frac{1}{k} \mu(A) > 0$.
Fix this $A$ and $k$.
Then $A$ satisfies condition (i) of being good,
but it need not satisfy (ii).

The tricky part is to make $A$ smaller such that it also
satisfies (ii).\notes
\end{subproof}
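
For orientation, here is how a good set $G$ leads to a contradiction in the standard argument
(the details are deferred to the official notes):
one checks that $Z + \frac{1}{k} \One_G \in \cC$, since for every $A \in \cF$,
using (ii) with $B = A \cap G$,
\begin{IEEEeqnarray*}{rCl}
\int_A \left(Z + \frac{1}{k} \One_G\right) \dif \mu
&=& \int_A Z \dif \mu + \frac{1}{k} \mu(A \cap G)\\
&\le& \int_A Z \dif \mu + \lambda(A \cap G)\\
&\le& \int_A Z \dif \mu + \lambda(A) = \nu(A).
\end{IEEEeqnarray*}
Moreover $\mu(G) > 0$, since $\mu(G) = 0$ would force $\nu(G) = 0$ and hence $\lambda(G) = 0$,
contradicting (i).
Therefore $\int_\Omega (Z + \frac{1}{k}\One_G) \dif \mu = \alpha + \frac{1}{k}\mu(G) > \alpha$,
contradicting the definition of $\alpha$.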
\end{refproof}

\section{Martingales}

\subsection{Definition}

We have already worked with martingales, but we will define them
rigorously now.

\begin{definition}[Filtration]
A \vocab{filtration} is a sequence $(\cF_n)$ of $\sigma$-algebras
such that $\cF_n \subseteq \cF_{n+1}$ for all $n \ge 1$.
\end{definition}
Intuitively, we can think of $\cF_n$ as the information
we have gathered up to time $n$.
Typically $\cF_n = \sigma(X_1, \ldots, X_n)$ for a sequence $(X_n)_n$ of random variables.

\begin{definition}
\label{def:martingale}
Let $(\cF_n)_n$ be a filtration and
let $(X_n)_{n \ge 1}$ be random variables such that $X_n \in L^1(\bP)$ for all $n$.
Then we say that $(X_n)_{n \ge 1}$ is an $(\cF_n)_n$-\vocab{martingale}
if the following hold:
\begin{itemize}
\item $X_n$ is $\cF_n$-measurable for all $n$,
i.e.~$(X_n)_n$ is \vocab[Sequence!adapted to a filtration]{adapted to the filtration} $(\cF_n)_n$.
\item $\bE[X_{n+1} | \cF_n] \overset{\text{a.s.}}{=} X_n$
for all $n$.
\end{itemize}

$(X_n)_n$ is called a \vocab{submartingale}
if it is adapted to $(\cF_n)_n$, each $X_n$ is in $L^1(\bP)$, and
\[
\bE[X_{n+1} | \cF_n] \overset{\text{a.s.}}{\ge} X_n.
\]
It is called a \vocab{supermartingale}
if it is adapted, integrable, and $\bE[X_{n+1} | \cF_n] \overset{\text{a.s.}}{\le} X_n$.
\end{definition}

\begin{corollary}
\label{cor:convexmartingale}
Suppose that $(X_n)_n$ is a martingale%
\footnote{Stated in this form, this means that there is some filtration
which we do not specify explicitly.}
and that $f: \R \to \R$ is a convex function with $f(X_n) \in L^1(\bP)$ for all $n$.
Then $(f(X_n))_n$ is a submartingale.
Likewise, if $f$ is concave, then $(f(X_n))_n$ is a supermartingale.
\end{corollary}
\begin{proof}
Apply \autoref{cjensen}.
\end{proof}
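
Spelled out, the submartingale property follows from the conditional Jensen inequality
(\autoref{cjensen}) together with the martingale property:
\[
\bE[f(X_{n+1}) | \cF_n] \overset{\text{a.s.}}{\ge} f(\bE[X_{n+1} | \cF_n])
\overset{\text{a.s.}}{=} f(X_n),
\]
and $f(X_n)$ is $\cF_n$-measurable, since a convex function on $\R$ is in particular Borel-measurable.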

\begin{corollary}
If $(X_n)_n$ is a martingale,
then $\bE[X_n] = \bE[X_1]$ for all $n$,
i.e.~the expectation is constant in $n$.
\end{corollary}
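
This follows by taking expectations in the martingale property
and using the tower property:
\[
\bE[X_{n+1}] = \bE\big[\bE[X_{n+1} | \cF_n]\big] = \bE[X_n]
\quad \text{for all } n.
\]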

\begin{example}
The simple random walk:

Let $\xi_1, \xi_2, \ldots$ be i.i.d.~with
$\bP[\xi_i = 1] = \bP[\xi_i = -1] = \frac{1}{2}$,
let $X_n \coloneqq \xi_1 + \ldots + \xi_n$
and $\cF_n \coloneqq \sigma(\xi_1, \ldots, \xi_n) = \sigma(X_1, \ldots, X_n)$.
Then $X_n$ is $\cF_n$-measurable.
Showing that $(X_n)_n$ is a martingale
is left as an exercise.
\end{example}

\begin{example}
See exercise sheet 9.
\todo{Copy}
\end{example}