Analysis of Random Processes in $L^2$. [@Sholtz, 13.1, 10.2.2, 20.3.4]
For a probability space $(\Omega, \Sigma, P)$, the random variables with finite second moment form a function space $L^2(\Omega, \Sigma, P)$. It is a Hilbert space when equipped with the inner product $\langle X,Y \rangle = \mathbb{E}[XY]$.
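As a quick check that this inner product is well defined on the space, the Cauchy–Schwarz inequality bounds it by the second moments, and the induced norm is the root-mean-square value:
$$
|\mathbb{E}[XY]| \le \sqrt{\mathbb{E}[X^2]\,\mathbb{E}[Y^2]} < \infty,
\qquad
\|X\| = \sqrt{\langle X, X \rangle} = \sqrt{\mathbb{E}[X^2]}.
$$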
Two random variables are equivalent in $L^2(\Omega, \Sigma, P)$ if, as measurable mappings, they are equal almost everywhere with respect to the probability measure $P$, denoted $X = Y \text{ a.e. } P$.
Given the space of random variables $L^2(\Omega, \Sigma, P)$ and the metric associated with the $L^2$-norm, convergence in $L^2$ (mean-square convergence) is well defined.
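Spelled out with that metric, the definition reads:
$$
X_n \xrightarrow{\;L^2\;} X
\quad\Longleftrightarrow\quad
\lim_{n \to \infty} \mathbb{E}\!\left[(X_n - X)^2\right] = 0,
$$
often written $\operatorname{l.i.m.}_{n\to\infty} X_n = X$ (limit in the mean).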
Properties:
The proof of property 2 depends on property 1.
Convergence of a linear transformation of a random sequence
Sufficient conditions for convergence of a linear transformation of a random sequence:
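As background (not necessarily the exact list intended above), a standard criterion of this type is Loève's mean-square convergence criterion applied to the partial sums $Y_n = \sum_{k=1}^{n} a_k X_k$:
$$
Y_n \text{ converges in } L^2
\quad\Longleftrightarrow\quad
\mathbb{E}[Y_n Y_m] = \sum_{j=1}^{n} \sum_{k=1}^{m} a_j a_k\, \mathbb{E}[X_j X_k]
\text{ converges to a finite limit as } m, n \to \infty.
$$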
A random process $X(u,t)$ is continuous in $L^2$ at $t = t_0$ if $R_X(t_1,t_2)$ is continuous at $(t_0,t_0)$.
A random process $X(u,t)$ is uniformly continuous in $L^2$ if it is w.s.s. and $R_X(t)$ is continuous at the origin $t=0$; see the expansion below.
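Both continuity claims follow from expanding the mean-square increment in terms of the correlation function:
$$
\mathbb{E}\!\left[\left(X(u,t_0+\epsilon) - X(u,t_0)\right)^2\right]
= R_X(t_0+\epsilon,\, t_0+\epsilon) - 2 R_X(t_0+\epsilon,\, t_0) + R_X(t_0, t_0),
$$
which tends to $0$ as $\epsilon \to 0$ when $R_X$ is continuous at $(t_0,t_0)$. In the w.s.s. case the right-hand side equals $2\left(R_X(0) - R_X(\epsilon)\right)$, a bound independent of $t_0$, which gives uniform continuity.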
A random process $X(u,t)$ is differentiable in $L^2$ at $t_0$ iff $\frac{\partial^2}{\partial t_1 \partial t_2} R_X(t_1,t_2)$ exists at $(t_0,t_0)$.
A random process $X(u,t)$ is differentiable in $L^2$ for all $t \in \mathbb{R}$ if it is w.s.s. and $R_X(t)$ is twice differentiable at the origin $t=0$.
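In the w.s.s. case the second moment of the difference quotient is controlled entirely by $R_X$ near the origin, which is where the twice-differentiability condition enters:
$$
\mathbb{E}\!\left[\left(\frac{X(u,t+\epsilon) - X(u,t)}{\epsilon}\right)^{2}\right]
= \frac{2\left(R_X(0) - R_X(\epsilon)\right)}{\epsilon^{2}}
\;\xrightarrow[\epsilon \to 0]{}\; -R_X''(0),
$$
using $R_X'(0) = 0$ (an even function differentiable at the origin) and the second-order Taylor expansion of $R_X$ at $0$; the full proof applies the Cauchy criterion to differences of such quotients with the same expansion.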
Properties of the differentiator $\mathbb{D}$:
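For background, the standard identities for the mean-square derivative (a sketch, assuming $\mathbb{D}X = X'$ denotes the $L^2$ derivative and $R_{XX'}$ denotes the cross-correlation of $X$ and $X'$) are
$$
\mathbb{E}[X'(u,t)] = \frac{d}{dt}\,\mathbb{E}[X(u,t)], \qquad
R_{XX'}(t_1,t_2) = \frac{\partial}{\partial t_2} R_X(t_1,t_2), \qquad
R_{X'}(t_1,t_2) = \frac{\partial^2}{\partial t_1 \partial t_2} R_X(t_1,t_2),
$$
together with linearity: $\mathbb{D}(aX + bY) = a\,\mathbb{D}X + b\,\mathbb{D}Y$ in $L^2$.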