\lecture{23}{2023-07-06}{Recap}

\subsection{Recap}

\subsubsection{Construction of iid random variables}

\begin{itemize}
\item Definition of a consistent family (\yaref{def:consistentfamily})
Consider a distribution function $F$ and define
\[
\prod_{i=1}^n (F(b_i) - F(a_i)) \text{\reflectbox{$\coloneqq$}}
\mu_n \left( (a_1,b_1] \times \ldots \times (a_n, b_n] \right).
\]
\item Examples of consistent and inconsistent families

\todo{Exercises}

\item Kolmogorov's consistency theorem
(\yaref{thm:kolmogorovconsistency})
\end{itemize}
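The construction above has a computational counterpart: given a distribution function $F$, applying the generalized inverse $F^{-1}(u) = \inf \{x : F(x) \ge u\}$ to iid uniform variables produces iid samples with distribution function $F$. A minimal sketch (the exponential example and all names are illustrative, not from the lecture):

```python
import math
import random

def inverse_transform_sample(F_inv, n, seed=0):
    """Draw n iid samples with distribution function F via U ~ Unif(0,1), X = F^{-1}(U)."""
    rng = random.Random(seed)
    return [F_inv(rng.random()) for _ in range(n)]

# Example: Exp(1), where F(x) = 1 - e^{-x}, so F^{-1}(u) = -log(1 - u).
samples = inverse_transform_sample(lambda u: -math.log(1.0 - u), 10_000)

# The empirical distribution function should be close to F at a test point.
x0 = 1.0
ecdf = sum(s <= x0 for s in samples) / len(samples)
print(ecdf, 1 - math.exp(-x0))
```

The same recipe, applied coordinatewise to independent uniforms, realizes the product measures $\mu_n$ from the consistency construction.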

\subsubsection{Limit theorems}

\begin{itemize}
\item Work with iid.~random variables.
\item Notions of convergence (\yaref{def:convergence})
\item Implications between different notions of convergence (very important) and counterexamples.
(\yaref{thm:convergenceimplications})

\item Laws of large numbers: (\yaref{lln})

\begin{itemize}
\item WLLN: convergence in probability
\item SLLN: almost sure convergence
\end{itemize}
\item \yaref{thm2} (building block for SLLN):
Let $(X_n)$ be independent with mean $0$ and variances $\sigma_n^2$ satisfying $\sum_n \sigma_n^2 < \infty$;
then $\sum_n X_n$ converges a.s.
\begin{itemize}
\item Counterexamples showing that $\impliedby$ does not hold in general are important.
\item $\impliedby$ holds for independent, uniformly bounded random variables.
\item Application:

$\sum_{n=1}^{\infty} \frac{\pm 1}{n^{\frac{1}{2} + \epsilon}}$ converges a.s.~for all $\epsilon > 0$.

$\sum_{n=1}^{\infty} \frac{\pm 1}{n^{\frac{1}{2} - \epsilon}}$ does not converge a.s.~for any $\epsilon > 0$.
\end{itemize}
\item \yaref{thm:kolmogorovineq}
\item \yaref{kolmogorov01}

In particular, a series of independent random variables converges with probability $0$ or $1$.

\item \yaref{thm:kolmogorovthreeseries}
\begin{itemize}
\item What are those $3$ series?
\item Applications
\end{itemize}
\end{itemize}
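The laws of large numbers above are easy to sanity-check numerically. A small sketch (sample sizes and seed are my own choice): for iid fair $\pm 1$ variables with $\bE[X_1] = 0$, the running mean $S_n / n$ should shrink toward $0$ as $n$ grows.

```python
import random

rng = random.Random(1)
N = 200_000
s = 0
means = {}
for n in range(1, N + 1):
    s += rng.choice((-1, 1))  # iid Rademacher steps, mean 0
    if n in (1_000, 10_000, N):
        means[n] = s / n  # S_n / n, which the SLLN sends to E[X_1] = 0 a.s.
print(means)
```

The typical size of $S_n / n$ is of order $n^{-1/2}$, which is also the scale at which the CLT below takes over.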

\subsubsubsection{Fourier transform / characteristic functions / weak convergence}

\begin{itemize}
\item Definition of Fourier transform
(\yaref{def:characteristicfunction})
\item The Fourier transform uniquely determines the probability distribution.
It is bounded, so many theorems are easily applicable.
\item \yaref{charfuncuniqueness},
\yaref{inversionformula}, ...
\item \yaref{levycontinuity},
\yaref{genlevycontinuity}
\item \yaref{thm:bochner}
\item \yaref{bochnersformula}
\item Related notions
\todo{TODO}
\begin{itemize}
\item Laplace transforms $\bE[e^{-\lambda X}]$ for some $\lambda > 0$
(not done in the lecture, but still useful).
\item Moments $\bE[X^k]$ (not done in the lecture, but still useful)

All moments together uniquely determine the distribution.
\end{itemize}
\end{itemize}
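For a finite discrete distribution the characteristic function $\phi_X(t) = \bE[e^{itX}]$ can be evaluated exactly as a finite sum. A small sketch (the Rademacher example is my own): for $X$ uniform on $\{-1, +1\}$ one gets $\phi_X(t) = \cos t$, and $|\phi_X(t)| \le \phi_X(0) = 1$ illustrates the boundedness mentioned above.

```python
import cmath
import math

def char_function(pmf, t):
    """phi(t) = E[exp(itX)] for a discrete distribution given as {value: probability}."""
    return sum(p * cmath.exp(1j * t * x) for x, p in pmf.items())

rademacher = {-1: 0.5, +1: 0.5}
t = 0.7
phi = char_function(rademacher, t)
print(phi.real, phi.imag)  # real part equals cos(0.7); imaginary part vanishes by symmetry
```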

\paragraph{Weak convergence}

\begin{itemize}
\item Definition of weak convergence % ( test against continuous, bounded functions).
(\yaref{def:weakconvergence})
\item Examples:
\begin{itemize}
\item $(\delta_{\frac{1}{n}})_n$,
\item $(\frac{1}{2} \delta_{-\frac{1}{n}} + \frac{1}{2} \delta_{\frac{1}{n}})_n$,
\item $(\cN(0, \frac{1}{n}))_n$,
\item $(\frac{1}{n} \delta_n + (1- \frac{1}{n}) \delta_{\frac{1}{n}})_n$.
\end{itemize}
\item Non-examples: $(\delta_n)_n$
\item How does one prove weak convergence? How does one write this down in a clear way?
\begin{itemize}
\item \yaref{lec10_thm1},
\item \yaref{levycontinuity},
\item Generalization of Levy's continuity theorem
(\yaref{genlevycontinuity})
\end{itemize}
\end{itemize}
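The first example above can be checked directly against the definition: $\int f \,\mathrm{d}\delta_{1/n} = f(1/n) \to f(0) = \int f \,\mathrm{d}\delta_0$ for every bounded continuous $f$. A minimal sketch (the test function is my own choice):

```python
import math

def integrate_dirac(f, point):
    """Integral of f against the Dirac measure delta_{point}."""
    return f(point)

f = lambda x: math.atan(x)  # bounded and continuous, with f(0) = 0
values = [integrate_dirac(f, 1 / n) for n in (1, 10, 100, 10_000)]
print(values)  # decreasing toward f(0) = 0
```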

\paragraph{Convolution}

\begin{itemize}
\item Definition of convolution.
\todo{Copy from exercise sheet and write a subsection about this}
\item $X_i \sim \mu_i$ independent $\implies X_1 + \ldots + X_n \sim \mu_1 \ast \ldots \ast \mu_n$.
\end{itemize}
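The statement that sums of independent variables have convolved laws is easy to illustrate for discrete pmfs. A small sketch (the dice example is my own): convolving the law of a fair die with itself gives the law of the sum of two independent dice.

```python
def convolve(mu, nu):
    """Convolution of two discrete probability distributions given as {value: prob}."""
    out = {}
    for x, p in mu.items():
        for y, q in nu.items():
            out[x + y] = out.get(x + y, 0.0) + p * q
    return out

die = {k: 1 / 6 for k in range(1, 7)}
two_dice = convolve(die, die)
print(two_dice[7])  # the most likely total, with probability 6/36
```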

\subsubsubsection{CLT}

\begin{itemize}
\item Statement of the \yaref{clt}
\item Several versions:
\begin{itemize}
\item iid,
\item \yaref{lindebergclt},
\item \yaref{lyapunovclt}
\end{itemize}
\item How to apply this? Exercises!
\end{itemize}
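A numerical sanity check of the iid CLT (all parameters are my own choice): for iid Rademacher variables, the standardized sum $S_n / \sqrt{n}$ should be approximately $\cN(0,1)$, so its distribution function at a point should be close to $\Phi$ there.

```python
import math
import random

rng = random.Random(7)
n, trials = 1_000, 4_000

def standardized_sum():
    # Sum of n iid Rademacher variables (mean 0, variance n), divided by sqrt(n).
    return sum(rng.choice((-1, 1)) for _ in range(n)) / math.sqrt(n)

xs = [standardized_sum() for _ in range(trials)]
x0 = 0.5
empirical = sum(x <= x0 for x in xs) / trials
gaussian = 0.5 * (1 + math.erf(x0 / math.sqrt(2)))  # Phi(x0) for the standard normal
print(empirical, gaussian)
```

The residual gap comes from Monte Carlo noise and the lattice nature of $S_n$, both of which shrink as $n$ and the number of trials grow.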

\subsubsection{Conditional expectation}

\begin{itemize}
\item Definition and existence of conditional expectation for $X \in L^1(\Omega, \cF, \bP)$
(\yaref{conditionalexpectation})
\item If $H = L^2(\Omega, \cF, \bP)$, then $\bE[ \cdot | \cG]$
is the (unique) projection on the closed subspace $L^2(\Omega, \cG, \bP)$.

Why is this a closed subspace? Why is the projection orthogonal?
\item \yaref{radonnikodym}
(Proof not relevant for the exam)
\item (Non-)examples of mutually absolutely continuous measures

Singularity in this context? % TODO
\end{itemize}
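On a finite probability space, conditional expectation with respect to the $\sigma$-algebra generated by a partition is just the blockwise average, and the projection properties can be verified directly. A minimal sketch (the toy space is my own):

```python
# Uniform probability space {0,...,5}; G generated by the partition {0,1,2} | {3,4,5}.
omega = range(6)
prob = {w: 1 / 6 for w in omega}
blocks = [{0, 1, 2}, {3, 4, 5}]

X = {w: float(w * w) for w in omega}  # a random variable, X(w) = w^2

def cond_exp(X, blocks, prob):
    """E[X | G]: constant on each block of the partition, equal to the block average."""
    ce = {}
    for B in blocks:
        pB = sum(prob[w] for w in B)
        avg = sum(X[w] * prob[w] for w in B) / pB
        for w in B:
            ce[w] = avg
    return ce

ceX = cond_exp(X, blocks, prob)

# Orthogonality of the L^2 projection: E[(X - E[X|G]) Z] = 0 for G-measurable Z.
Z = {w: (1.0 if w in blocks[0] else -2.0) for w in omega}  # constant on blocks
inner = sum((X[w] - ceX[w]) * Z[w] * prob[w] for w in omega)
print(inner)  # zero up to rounding
```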

\subsubsection{Martingales}

\begin{itemize}
\item Definition of Martingales (\yaref{def:martingale})
\item Doob's convergence theorem (\yaref{doobmartingaleconvergence}),
Upcrossing inequality (\yaref{lec17l1}, \yaref{lec17l2}, \yaref{lec17l3})
(downcrossings for submartingales)
\item Examples of Martingales converging a.s.~but not in $L^1$
(\yaref{ex:martingale-not-converging-in-l1})
\item Bounded in $L^2$ $\implies$ convergence in $L^2$
(\yaref{martingaleconvergencel2}).
\item Martingale increments are orthogonal in $L^2$!
(\yaref{martingaleincrementsorthogonal})
\item Doob's (sub-)martingale inequalities
(\yaref{dooblp}),
\item $\bP[\sup_{k \le n} M_k \ge x]$ $\leadsto$ Look at martingale inequalities!

Estimates might come from Doob's inequalities if $(M_k)_k$ is a (sub-)martingale.
\item Doob's $L^p$ convergence theorem
(\yaref{ceismartingale}, \yaref{martingaleisce}).
\begin{itemize}
\item Why is $p > 1$ important? \textbf{Role of \yaref{banachalaoglu}}
\item This is an important proof.
\end{itemize}
\item Uniform integrability (\yaref{def:ui})
\item What are stopping times? (\yaref{def:stopping-time})
\item (Non-)examples of stopping times
\item \textbf{\yaref{optionalstopping}} -- be really comfortable with this.
\end{itemize}
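Optional stopping can be sanity-checked on the simple symmetric random walk: with $T$ the exit time of $(-a, b)$, the identity $\bE[M_T] = \bE[M_0] = 0$ forces $\bP[\text{hit } b \text{ before } -a] = a/(a+b)$. A Monte Carlo sketch (parameters and seed are my own choice):

```python
import random

rng = random.Random(3)
a, b = 2, 3  # stop when the walk hits -a or +b

def hits_b_first():
    pos = 0
    while -a < pos < b:
        pos += rng.choice((-1, 1))
    return pos == b

trials = 20_000
freq = sum(hits_b_first() for _ in range(trials)) / trials
print(freq, a / (a + b))  # empirical frequency vs. a/(a+b) from optional stopping
```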

\subsubsection{Markov Chains}

\begin{itemize}
\item What are Markov chains?
\item State space, initial distribution
\item Important examples
\item \textbf{What is the relation between Martingales and Markov chains?}
$u$ \vocab{harmonic} $\iff Lu = 0$.

(sub-/super-)harmonic $u$ $\iff$ for a Markov chain $(X_n)$,
$u(X_n)$ is a (sub-/super-)martingale.
\item Dirichlet problem
(Not done in the lecture)
\item ... (more in Probability Theory II)
\end{itemize}
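The criterion $Lu = (P - I)u = 0$ can be checked directly for a finite chain. A small sketch (my own example): for the simple symmetric random walk on $\{0, \dots, N\}$ absorbed at the endpoints, $u(x) = x/N$ is harmonic, so $u(X_n)$ is a martingale.

```python
N = 5

def P(x, y):
    """Transition kernel: simple random walk on {0,...,N}, absorbed at 0 and N."""
    if x in (0, N):
        return 1.0 if y == x else 0.0
    return 0.5 if abs(y - x) == 1 else 0.0

u = lambda x: x / N  # candidate harmonic function

# (Lu)(x) = sum_y P(x,y) u(y) - u(x) should vanish at every state.
Lu = [sum(P(x, y) * u(y) for y in range(N + 1)) - u(x) for x in range(N + 1)]
print(Lu)  # all entries (numerically) zero
```

By optional stopping, this harmonic $u$ also solves the associated Dirichlet problem: $u(x)$ is the probability of absorption at $N$ when started from $x$.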