Theorem: (Monotone Transformation) Let r.v. \( X \sim F_X(x) \), and let r.v. Y be related to X by \( Y=g(X) \) for a monotone \( g \). If \( g \) is increasing, then \( F_Y(y) = F_X(g^{-1}(y)) \); if \( g \) is decreasing and X is continuous, then \( F_Y(y) = 1 - F_X(g^{-1}(y)) \). If, moreover, X has density \( f_X \) and \( g^{-1} \) is differentiable, then
\[ f_Y(y) = f_X(g^{-1}(y)) \left| \frac{\text{d}}{\text{d}y} g^{-1}(y) \right| \]
Corollary: (Probability Integral Transformation) If X has a continuous CDF \( F_X \), then \( F_X(X) \sim U(0,1) \); conversely, if \( U \sim U(0,1) \), then \( F_X^{-1}(U) \sim F_X \).
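This corollary underlies inverse-transform sampling: apply \( F_X^{-1} \) to uniform draws to sample from \( F_X \). A minimal numpy sketch, with an illustrative \( \text{Exponential}(\lambda = 2) \) target:

```python
import numpy as np

# Inverse-transform sampling via the probability integral transform:
# if U ~ U(0,1) and F(x) = 1 - exp(-lam*x) (Exponential CDF), then
# F^{-1}(U) = -log(1 - U)/lam has that exponential distribution.
rng = np.random.default_rng(0)
lam = 2.0                      # illustrative rate
u = rng.uniform(size=100_000)
x = -np.log(1.0 - u) / lam

print(x.mean())   # ~ 1/lam = 0.5
print(x.var())    # ~ 1/lam^2 = 0.25
```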
The following method assumes transformations of the form \( Z = g(X,Y) \), and it can be applied to general multivariate transformations as well.
\[ F_Z (z) = P( g(X,Y) \leq z ) = P( (X,Y) \in G_z ) = \iint_{G_z} f_{X,Y}(x,y) \text{d}x \text{d}y \]
Here \( G_z = \{ (x,y) \mid g(x,y) \leq z \} \).
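As a standard worked example, let \( X, Y \) be i.i.d. \( U(0,1) \) and \( Z = X+Y \). For \( 0 \leq z \leq 1 \), \( G_z \) intersects the unit square in the triangle with vertices \( (0,0), (z,0), (0,z) \), so
\[ F_Z(z) = \iint_{G_z} 1 \,\text{d}x \,\text{d}y = \frac{z^2}{2}, \]
and symmetrically \( F_Z(z) = 1 - \frac{(2-z)^2}{2} \) for \( 1 \leq z \leq 2 \).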
Theorem: (Summation of Independent R.V.'s) If r.v.'s \( X ∐ Y \) and \( Z = X+Y \), then
\[ f_Z(z) = \int_{-\infty}^{\infty} f_X(x) f_Y(z-x) \,\text{d}x \]
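The integral is a convolution, which is easy to verify numerically. A minimal numpy sketch for \( X, Y \) i.i.d. \( \text{Exponential}(1) \), whose sum should have the \( \Gamma(2,1) \) density \( z e^{-z} \) (cf. the Gamma additivity note below):

```python
import numpy as np

# Discretize the Exponential(1) density and convolve it with itself;
# the Riemann sum approximates f_Z(z) = z * exp(-z).
dx = 0.001
grid = np.arange(0.0, 10.0, dx)
f = np.exp(-grid)                       # Exponential(1) density
f_z = np.convolve(f, f)[:len(grid)] * dx

z0 = 2.0
print(f_z[int(z0 / dx)])        # numeric convolution at z = 2
print(z0 * np.exp(-z0))         # closed form, ~ 0.2707
```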
For equi-dimensional random vectors \( \mathbf{X} \) and \( \mathbf{Y} \) related by a nondegenerate transformation \( \mathbf{Y} = \mathbf{g}(\mathbf{X}) \),
\[ f_{\mathbf{Y}}(\mathbf{y}) = \sum_{ \{ \mathbf{x} | \mathbf{g}(\mathbf{x}) = \mathbf{y} \} } \frac{ f_{\mathbf{X}}(\mathbf{x}) }{ \lvert J_{\mathbf{g}} (\mathbf{x}) \rvert } \]
Here \( J_{\mathbf{g}} (\mathbf{x}) \) is the Jacobian determinant of \( \mathbf{g}(\cdot) \) at \( \mathbf{x} \).
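As a one-dimensional illustration of the sum over preimages, take \( Z \sim N(0,1) \) and \( g(z) = z^2 \). Each \( y > 0 \) has preimages \( \pm\sqrt{y} \) with \( \lvert J_g(z) \rvert = \lvert 2z \rvert = 2\sqrt{y} \), so
\[ f_{Z^2}(y) = \frac{ f_Z(\sqrt{y}) + f_Z(-\sqrt{y}) }{ 2\sqrt{y} } = \frac{1}{\sqrt{2\pi y}} e^{-y/2}, \quad y > 0, \]
which is the \( \chi^2(1) \) density quoted in the notes below.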
Theorem: (Linear/Affine Transformation)
If \( \mathbf{Y} = A \mathbf{X} + \mathbf{b} \) with \( A \) invertible, then
\[ f_{\mathbf{Y}}(\mathbf{y}) = \frac{ f_{\mathbf{X}}\left( A^{-1}(\mathbf{y} - \mathbf{b}) \right) }{ \lvert \det A \rvert } \]
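A minimal numpy sketch (the matrix \( A \) and shift \( \mathbf{b} \) are illustrative): for \( \mathbf{X} \sim N(\mathbf{0}, I_2) \), the affine rule implies \( \mathbf{Y} = A\mathbf{X} + \mathbf{b} \) is Gaussian with mean \( \mathbf{b} \) and covariance \( AA^\top \), which the sample moments should reproduce.

```python
import numpy as np

# Y = A X + b with X ~ N(0, I_2): check mean ~ b, covariance ~ A A^T.
rng = np.random.default_rng(1)
A = np.array([[2.0, 0.5],
              [0.0, 1.0]])     # illustrative invertible matrix
b = np.array([1.0, -1.0])      # illustrative shift
X = rng.standard_normal((100_000, 2))
Y = X @ A.T + b

print(Y.mean(axis=0))          # ~ b
print(np.cov(Y.T))             # ~ A @ A.T
```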
Theorem: (Polar Coordinate Transformation)
For the polar coordinate transformation of r.v.'s from \( (X,Y) \) to \( (R,\Theta) \), i.e. \( X = R\cos\Theta \) and \( Y = R\sin\Theta \),
\[ f_{R,\Theta} (r,\theta) = r f_{X,Y} (r \cos\theta, r \sin\theta), \quad r > 0,\ \theta \in [0, 2\pi) \]
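For i.i.d. standard Gaussians the right-hand side factorizes: \( \Theta \sim U(0, 2\pi) \) and \( R^2 \sim \text{Exponential}(\text{rate } 1/2) \), independent, which is exactly the Box–Muller sampler. A minimal numpy sketch:

```python
import numpy as np

# Box-Muller: invert the polar factorization to generate normals.
rng = np.random.default_rng(2)
u1 = rng.uniform(size=100_000)
u2 = rng.uniform(size=100_000)
r = np.sqrt(-2.0 * np.log(1.0 - u1))   # R^2 ~ Exponential(rate 1/2)
theta = 2.0 * np.pi * u2               # Theta ~ U(0, 2*pi)
x, y = r * np.cos(theta), r * np.sin(theta)

print(x.mean(), x.std())           # ~ 0, ~ 1
print(np.corrcoef(x, y)[0, 1])     # ~ 0
```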
In probability theory, a probability distribution is infinitely divisible if, for every positive integer \( n \), it can be expressed as the probability distribution of the sum of \( n \) independent and identically distributed random variables.
Note: (Cramér's Decomposition Theorem)
If X and Y are independent real-valued random variables whose sum X + Y is a normal random variable, then both X and Y must be normal as well.
Note:
If X and Y are independent Poisson r.v.'s with parameters \( \theta_x \) and \( \theta_y \) respectively, then
\[ X + Y \sim Poisson(\theta_x + \theta_y) \]
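A quick Monte Carlo sanity check (the rates 2 and 3 are illustrative): the sum should match Poisson(5) in both mean and variance.

```python
import numpy as np

# X ~ Poisson(2), Y ~ Poisson(3) independent => X + Y ~ Poisson(5).
rng = np.random.default_rng(3)
z = rng.poisson(2.0, size=100_000) + rng.poisson(3.0, size=100_000)
print(z.mean(), z.var())   # both ~ 5
```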
If X and Y are independent exponential r.v.'s with rates \( \lambda_x \) and \( \lambda_y \) respectively, where \( \lambda_x \neq \lambda_y \), then the density of \( Z = X+Y \) is
\[ f_Z(z) = \frac{\lambda_x \lambda_y}{\lambda_x - \lambda_y} \left( e^{-\lambda_y z} - e^{-\lambda_x z} \right) \mathbf{1}(z>0) \]
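A minimal numpy check of this density against a histogram (the rates \( \lambda_x = 3, \lambda_y = 1 \) are illustrative; note numpy parameterizes exponentials by scale \( 1/\lambda \)):

```python
import numpy as np

# Compare the empirical density of X + Y with the closed form above.
rng = np.random.default_rng(4)
lx, ly = 3.0, 1.0
z = rng.exponential(1/lx, 200_000) + rng.exponential(1/ly, 200_000)

hist, edges = np.histogram(z, bins=200, range=(0.0, 10.0), density=True)
mid = 0.5 * (edges[:-1] + edges[1:])
f = lx * ly / (lx - ly) * (np.exp(-ly * mid) - np.exp(-lx * mid))
print(np.max(np.abs(hist - f)))    # small (~1e-2)
```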
For independent standard Gaussian r.v.'s X and Y, \( \frac{X}{Y} \sim \text{Cauchy}(0,1) \), the standard Cauchy distribution.
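Since the Cauchy distribution has no mean, a moment check is useless; its quartiles \( (-1, 0, 1) \), however, are easy to verify by simulation:

```python
import numpy as np

# X/Y for independent standard normals should be standard Cauchy,
# whose quartiles are -1, 0, 1.
rng = np.random.default_rng(5)
z = rng.standard_normal(200_000) / rng.standard_normal(200_000)
print(np.percentile(z, [25, 50, 75]))   # ~ [-1, 0, 1]
```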
If \( X_1 \sim \Gamma(\alpha_1,\beta) \), \( X_2 \sim \Gamma(\alpha_2,\beta) \), and \( X_1 ∐ X_2 \), then \( X_1 + X_2 \sim \Gamma(\alpha_1+\alpha_2,\beta) \)
If \( Z \sim N(0,1) \), then \( Z^2 \sim \Gamma(\frac{1}{2},2) \), which is also \( \chi^2(1) \).
If \( Z_1, \cdots, Z_n \overset{\text{i.i.d.}}{\sim} N(0,1) \), then \( Z_1^2 + \cdots + Z_n^2 \sim \Gamma(\frac{n}{2},2) \), which is also \( \chi^2(n) \).
If \( X \sim B(\alpha,\beta), Y \sim B(\alpha+\beta, \gamma) \), and \( X ∐ Y \), then \( XY \sim B(\alpha, \beta + \gamma) \)
If \( X_1, \cdots, X_{n+1} \overset{\text{i.i.d.}}{\sim} \text{Exponential}(1) \) and \( S_k = \sum_{i=1}^k X_i \), then \( \left( \frac{S_1}{S_{n+1}}, \cdots, \frac{S_n}{S_{n+1}} \right) \sim ( U_{(1)}, \cdots, U_{(n)} ) \), where \( U_{(1)}, \cdots, U_{(n)} \) are the order statistics of \( n \) i.i.d. \( U(0,1) \) r.v.'s.
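A minimal numpy check using \( \mathrm{E}[U_{(k)}] = \frac{k}{n+1} \) (which follows from \( U_{(k)} \sim B(k, n+1-k) \) in the next note); \( n = 4 \) is an illustrative choice:

```python
import numpy as np

# Normalized partial sums of n+1 i.i.d. Exponential(1) draws should
# behave like the order statistics of n i.i.d. U(0,1) draws.
rng = np.random.default_rng(6)
n, reps = 4, 100_000
x = rng.exponential(size=(reps, n + 1))
s = np.cumsum(x, axis=1)
ratios = s[:, :n] / s[:, [n]]

print(ratios.mean(axis=0))   # ~ [1/5, 2/5, 3/5, 4/5]
```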
For the r-th uniform order statistic \( U_{(r)} \sim B(r,n+1-r) \) in a random sample of size n from \( U(0,1) \): for \( b>a \), the spacing satisfies \( U_{(b)} - U_{(a)} \overset{d}{=} U_{(b-a)} \sim B(b-a,\, n+1-(b-a)) \).