Univariate Transformation

Theorem: (Monotone Transformation) Let r.v. \( X \sim F_X(x) \), and let r.v. \( Y \) be related to \( X \) by \( Y = g(X) \).

  1. If \( g(\cdot) \) is increasing on \( \text{supp}(X) \), then \( Y \sim F_X(g^{-1}(y)) \) for \( y \in \text{supp}(Y) \).
  2. If \( g(\cdot) \) is decreasing on \( \text{supp}(X) \) and X is a continuous r.v., then \( Y \sim 1 - F_X(g^{-1}(y)) \) for \( y \in \text{supp}(Y) \).
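
For instance, take \( X \sim U(0,1) \) and the decreasing map \( g(x) = -\ln x \), so that \( g^{-1}(y) = e^{-y} \). Case 2 gives

\[ F_Y(y) = 1 - F_X(e^{-y}) = 1 - e^{-y}, \quad y > 0, \]

i.e. \( Y \sim \text{Exponential}(1) \).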

Corollary: (Probability Integral Transformation)

  1. If continuous r.v. \( X \sim F_X(x) \), then \( F_X(X) \sim U(0,1) \).
  2. If \( U \sim U(0,1) \) and \( F(\cdot) \) is a CDF, then \( F^{-1}(U) \sim F \), where \( F^{-1}(u) = \inf\{ x : F(x) \geq u \} \) is the generalized inverse (quantile function).
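
A minimal sampling sketch built on statement 2 (the helper name is illustrative; the exponential CDF \( F(x) = 1 - e^{-\lambda x} \) and its inverse are standard facts):

```python
import numpy as np

def sample_exponential(lam: float, size: int, rng=None) -> np.ndarray:
    """Draw Exponential(lam) samples via the probability integral transformation.

    For F(x) = 1 - exp(-lam * x), the quantile function is
    F^{-1}(u) = -ln(1 - u) / lam, so F^{-1}(U) ~ Exponential(lam).
    """
    rng = rng or np.random.default_rng()
    u = rng.uniform(0.0, 1.0, size)   # U ~ U(0,1)
    return -np.log1p(-u) / lam        # F^{-1}(U)

samples = sample_exponential(lam=2.0, size=100_000)
print(samples.mean())  # should be close to 1 / lam = 0.5
```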

Multivariate Transformation

The following methods are stated for transformations of the form \( Z = g(X,Y) \), but they can be applied to general multivariate transformations.

Direct Method

\[ F_Z (z) = P( g(X,Y) \leq z ) = P( (X,Y) \in G_z ) = \iint_{G_z} f_{X,Y}(x,y) \text{d}x \text{d}y \]

Here \( G_z = \{ (x,y) \mid g(x,y) \leq z \} \).
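
As a small worked instance: for \( X, Y \overset{\text{i.i.d.}}{\sim} U(0,1) \) and \( Z = \max(X,Y) \), the region is \( G_z = [0,z]^2 \), so

\[ F_Z(z) = \iint_{[0,z]^2} 1 \,\text{d}x \,\text{d}y = z^2, \quad z \in [0,1]. \]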

Theorem: (Summation of Independent R.V.'s) If r.v.'s \( X ∐ Y \) and \( Z = X+Y \), then

  1. \( f_Z = f_X * f_Y \)
  2. \( \Phi_Z = \Phi_X \cdot \Phi_Y \)
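
A quick numerical check of statement 1, as a sketch assuming \( X, Y \overset{\text{i.i.d.}}{\sim} U(0,1) \), whose sum has the triangular density \( f_Z(z) = 1 - \lvert z - 1 \rvert \) on \( [0,2] \):

```python
import numpy as np

# Discretize the U(0,1) density on a grid and convolve numerically.
dx = 1e-3
x = np.arange(0.0, 1.0, dx)
f = np.ones_like(x)                      # f_X = f_Y = 1 on [0, 1)

f_z = np.convolve(f, f) * dx             # f_Z = f_X * f_Y (Riemann-sum convolution)
z = np.arange(len(f_z)) * dx             # grid for Z = X + Y on [0, 2)

triangular = 1.0 - np.abs(z - 1.0)       # known closed form for the sum
print(np.max(np.abs(f_z - triangular)))  # small discretization error, ~dx
```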

Conditional Method

  1. Fix \( X = x \) and find the conditional PDF \( f_{Z|X}(z|x) \) from the univariate transformation \( Z = g(x, Y) \).
  2. The PDF of Z can then be recovered by \( f_Z(z) = \mathbb{E} f_{Z|X}(z|X) = \int f_{Z|X}(z|x) f_X(x) \,\text{d}x \).
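
For example, consistent with the convolution theorem above: for \( X ∐ Y \) and \( Z = X + Y \), fixing \( X = x \) gives \( f_{Z|X}(z|x) = f_Y(z - x) \), hence

\[ f_Z(z) = \mathbb{E} f_Y(z - X) = \int f_Y(z-x) f_X(x) \,\text{d}x = (f_X * f_Y)(z). \]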

Coordinate Transformation

For equi-dimensional random vectors \( \mathbf{X} \) and \( \mathbf{Y} \) related by a nondegenerate transformation \( \mathbf{Y} = \mathbf{g}(\mathbf{X}) \) (smooth, with nonvanishing Jacobian, possibly many-to-one),

\[ f_{\mathbf{Y}}(\mathbf{y}) = \sum_{ \{ \mathbf{x} | \mathbf{g}(\mathbf{x}) = \mathbf{y} \} } \frac{ f_{\mathbf{X}}(\mathbf{x}) }{ \lvert J_{\mathbf{g}} (\mathbf{x}) \rvert } \]

Here \( J_{\mathbf{g}} (\mathbf{x}) \) is the Jacobian determinant of \( \mathbf{g}(\cdot) \) at \( \mathbf{x} \).
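
As a one-dimensional illustration of the sum over preimages: for \( Y = X^2 \), each \( y > 0 \) has the two preimages \( \pm\sqrt{y} \), with \( \lvert J_g(x) \rvert = \lvert 2x \rvert = 2\sqrt{y} \), so

\[ f_Y(y) = \frac{ f_X(\sqrt{y}) + f_X(-\sqrt{y}) }{ 2\sqrt{y} }, \quad y > 0. \]

For \( X \sim N(0,1) \) this recovers the \( \chi^2(1) \) density noted under the Gamma distribution below.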

Theorem: (Linear/Affine Transformation)

If \( \mathbf{Y} = A \mathbf{X} + \mathbf{b} \) with \( A \) invertible, then

  1. \( f_{\mathbf{Y}} (\mathbf{y}) = \frac{ f_{\mathbf{X}}( A^{-1} (\mathbf{y} - \mathbf{b}) ) }{ \lvert \det(A) \rvert } \)
  2. \( \Phi_{\mathbf{Y}} (\mathbf{v}) = e^{i \mathbf{v}^T \mathbf{b}} \Phi_{\mathbf{X}}(A^T \mathbf{v}) \)
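
A Monte Carlo sanity check of statement 2, as a minimal sketch assuming \( \mathbf{X} \sim N(\mathbf{0}, I_2) \) with the standard characteristic function \( \Phi_{\mathbf{X}}(\mathbf{t}) = e^{-\lVert \mathbf{t} \rVert^2/2} \), and an arbitrary choice of \( A \), \( \mathbf{b} \), \( \mathbf{v} \):

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[2.0, 1.0], [0.0, 3.0]])
b = np.array([1.0, -1.0])
v = np.array([0.3, -0.2])

X = rng.standard_normal((1_000_000, 2))        # X ~ N(0, I_2)
Y = X @ A.T + b                                # Y = A X + b, row by row

lhs = np.mean(np.exp(1j * (Y @ v)))            # Monte Carlo estimate of Phi_Y(v)
t = A.T @ v
rhs = np.exp(1j * (v @ b)) * np.exp(-t @ t / 2)  # e^{i v.b} * Phi_X(A^T v)
print(lhs, rhs)                                # should agree to ~3 decimals
```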

Theorem: (Polar Coordinate Transformation)

For the polar coordinate transformation of r.v.'s from \( (X,Y) \) to \( (R,\Theta) \), where \( X = R\cos\Theta \) and \( Y = R\sin\Theta \) (\( r > 0 \), \( \theta \in [0, 2\pi) \)),

\[ f_{R,\Theta} (r,\theta) = r f_{X,Y} (r \cos\theta, r \sin\theta) \]
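
For independent standard Gaussian X and Y, the theorem gives \( f_{R,\Theta}(r,\theta) = \frac{r}{2\pi} e^{-r^2/2} \), so \( \Theta \sim U(0, 2\pi) \) and \( R^2 \sim \text{Exponential}(\frac{1}{2}) \), independently. Inverting this is the classical Box-Muller sampler; a minimal sketch:

```python
import numpy as np

def box_muller(size: int, rng=None) -> np.ndarray:
    """Generate N(0,1) samples from uniforms via the polar transformation."""
    rng = rng or np.random.default_rng()
    u1 = 1.0 - rng.uniform(size=size)     # in (0, 1], avoids log(0)
    u2 = rng.uniform(size=size)
    r = np.sqrt(-2.0 * np.log(u1))        # R^2 = -2 ln U1 ~ Exponential(1/2)
    theta = 2.0 * np.pi * u2              # Theta ~ U(0, 2*pi)
    return r * np.cos(theta)              # X = R cos(Theta) ~ N(0, 1)

x = box_muller(100_000)
print(x.mean(), x.std())                  # should be close to 0 and 1
```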

Decomposition

Infinite divisibility

In probability theory, a probability distribution is infinitely divisible if, for every positive integer n, it can be expressed as the probability distribution of the sum of n independent and identically distributed random variables.

Note:

  • Gaussian r.v. is infinitely divisible.
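
Concretely, for every positive integer \( n \),

\[ N(\mu, \sigma^2) \overset{d}{=} \sum_{k=1}^{n} X_k, \quad X_k \overset{\text{i.i.d.}}{\sim} N\!\left( \frac{\mu}{n}, \frac{\sigma^2}{n} \right). \]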

Cramér's decomposition theorem

If X and Y are independent real-valued random variables whose sum X + Y is a normal random variable, then both X and Y must be normal as well.

Note:

  • Proof seems nontrivial.
  • The result can be generalized to any finite sum.

Special Examples (Properties of Common Probabilistic Models)

Poisson Distribution

If X and Y are independent Poisson r.v.'s with parameters \( \theta_x \) and \( \theta_y \) respectively, then

\[ X + Y \sim \text{Poisson}(\theta_x + \theta_y) \]
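
This follows from the product rule for characteristic functions above, using the standard fact \( \Phi_X(t) = e^{\theta_x (e^{it} - 1)} \) for \( X \sim \text{Poisson}(\theta_x) \):

\[ \Phi_{X+Y}(t) = e^{\theta_x (e^{it} - 1)} e^{\theta_y (e^{it} - 1)} = e^{(\theta_x + \theta_y)(e^{it} - 1)}, \]

which is the characteristic function of \( \text{Poisson}(\theta_x + \theta_y) \).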

Exponential Distribution

If X and Y are independent exponential r.v.'s with rate parameters \( \lambda_x \) and \( \lambda_y \) respectively, where \( \lambda_x > \lambda_y \), then \( Z = X + Y \) has density

\[ f_Z(z) = \frac{\lambda_x \lambda_y}{\lambda_x - \lambda_y} \left( e^{-\lambda_y z} - e^{-\lambda_x z} \right) \mathbf{1}(z>0) \]
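
A quick simulation check of this density (a sketch; note that numpy's exponential sampler is parameterized by the scale \( 1/\lambda \)):

```python
import numpy as np

rng = np.random.default_rng(0)
lam_x, lam_y = 3.0, 1.0
z = rng.exponential(1 / lam_x, 500_000) + rng.exponential(1 / lam_y, 500_000)

# Compare an empirical probability with the integral of f_Z over [0, c].
c = 1.0
coef = lam_x * lam_y / (lam_x - lam_y)
cdf = coef * ((1 - np.exp(-lam_y * c)) / lam_y - (1 - np.exp(-lam_x * c)) / lam_x)
print((z <= c).mean(), cdf)  # should agree to ~2-3 decimals
```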

Gaussian Distribution and Related

For independent standard Gaussian r.v.'s X and Y, \( \frac{X}{Y} \sim \text{Cauchy}(0,1) \), the standard Cauchy distribution.
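
Because the Cauchy distribution has no mean, a simulation check should compare quantiles rather than moments; the quartiles of the standard Cauchy are \( \pm 1 \). A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
ratio = rng.standard_normal(1_000_000) / rng.standard_normal(1_000_000)
print(np.percentile(ratio, [25, 50, 75]))  # should be close to [-1, 0, 1]
```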

Gamma Distribution

If \( X_1 \sim \Gamma(\alpha_1,\beta) \), \( X_2 \sim \Gamma(\alpha_2,\beta) \), and \( X_1 ∐ X_2 \), then \( X_1 + X_2 \sim \Gamma(\alpha_1+\alpha_2,\beta) \)

If \( Z \sim N(0,1) \), then \( Z^2 \sim \Gamma(\frac{1}{2},2) \), which is also \( \chi^2(1) \).

If \( Z_1, \cdots, Z_n \overset{\text{i.i.d.}}{\sim} N(0,1) \), then \( Z_1^2 + \cdots + Z_n^2 \sim \Gamma(\frac{n}{2},2) \), which is also \( \chi^2(n) \).
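
A simulation check of the last statement via its first two moments (a sketch; in the shape-scale parameterization, \( \Gamma(\frac{n}{2}, 2) \) has mean \( n \) and variance \( 2n \)):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
s = (rng.standard_normal((200_000, n)) ** 2).sum(axis=1)  # Z_1^2 + ... + Z_n^2
print(s.mean(), s.var())  # should be close to n = 5 and 2n = 10
```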

Beta Distribution

If \( X \sim B(\alpha,\beta), Y \sim B(\alpha+\beta, \gamma) \), and \( X ∐ Y \), then \( XY \sim B(\alpha, \beta + \gamma) \)
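
A quick first-moment consistency check (independence gives \( \mathbb{E}[XY] = \mathbb{E}X \, \mathbb{E}Y \)):

\[ \mathbb{E}[XY] = \frac{\alpha}{\alpha+\beta} \cdot \frac{\alpha+\beta}{\alpha+\beta+\gamma} = \frac{\alpha}{\alpha+\beta+\gamma}, \]

which matches the mean of \( B(\alpha, \beta+\gamma) \).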

Uniform Order Statistics

If \( X_1, \cdots, X_{n+1} \overset{\text{i.i.d.}}{\sim} \text{Exponential}(1) \) and \( S_k = \sum_{i=1}^k X_i \), then \( \left( \frac{S_1}{S_{n+1}}, \cdots, \frac{S_n}{S_{n+1}} \right) \overset{d}{=} ( U_{(1)}, \cdots, U_{(n)} ) \), the order statistics of \( U_1, \cdots, U_n \overset{\text{i.i.d.}}{\sim} U(0,1) \).
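
This gives a one-pass way to generate all \( n \) uniform order statistics; a minimal sketch (the helper name is illustrative):

```python
import numpy as np

def uniform_order_statistics(n: int, rng=None) -> np.ndarray:
    """Generate (U_(1), ..., U_(n)) via normalized exponential partial sums."""
    rng = rng or np.random.default_rng()
    x = rng.exponential(1.0, n + 1)   # X_1, ..., X_{n+1} i.i.d. Exponential(1)
    s = np.cumsum(x)                  # partial sums S_1, ..., S_{n+1}
    return s[:-1] / s[-1]             # (S_1/S_{n+1}, ..., S_n/S_{n+1}), already sorted

u = uniform_order_statistics(5)
print(u)  # increasing sequence in (0, 1)
```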

For the r-th uniform order statistic in a random sample of size n, \( U_{(r)} \sim B(r, n+1-r) \); moreover, for \( b > a \), \( U_{(b)} - U_{(a)} \overset{d}{=} U_{(b-a)} \sim B(b-a,\, n+1-b+a) \).
