Bayes' rule is about combining new information with previous beliefs:
Posterior ∝ Prior * Likelihood
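A minimal sketch of this rule on a discrete state space; the prior and likelihood values are made-up numbers for illustration.

```python
import numpy as np

prior = np.array([0.5, 0.5])        # P(state) for two hypothetical states (assumed)
likelihood = np.array([0.9, 0.2])   # P(observation | state), assumed values

unnormalized = prior * likelihood   # Posterior ∝ Prior * Likelihood
posterior = unnormalized / unnormalized.sum()
print(posterior)                    # -> [0.818..., 0.181...]
```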
The state of the world and our information about it are continually changing, so we need to continuously update our beliefs; the Bayes filter (aka recursive Bayesian estimation) formalizes how such a process may work. The filter uses knowledge about the dynamics of the world to convert its belief about the state of the world at the previous instant into a belief at a future point in time. {Kording2007}
The predict-update procedure:
Belief(t1) -> Uncertainty Propagation -> Belief(t2) (Prior) -> Bayesian Update -> Belief(t2) (Posterior)
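A minimal sketch of this predict-update cycle as a discrete (histogram) Bayes filter; the transition matrix and measurement likelihoods are illustrative assumptions.

```python
import numpy as np

def predict(belief, transition):
    # Uncertainty propagation: Belief(t1) -> Belief(t2) (Prior)
    return transition.T @ belief

def update(prior, likelihood):
    # Bayesian update: Belief(t2) (Prior) -> Belief(t2) (Posterior)
    posterior = prior * likelihood
    return posterior / posterior.sum()

belief = np.array([0.5, 0.5])                     # Belief(t1)
transition = np.array([[0.8, 0.2], [0.3, 0.7]])   # P(next | current), rows sum to 1 (assumed)
likelihood = np.array([0.9, 0.2])                 # P(observation | state) (assumed)

prior_t2 = predict(belief, transition)
posterior_t2 = update(prior_t2, likelihood)
print(prior_t2, posterior_t2)
```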
Particle filter, aka sequential Monte Carlo (SMC), is a family of genetic-type Monte Carlo algorithms based on mutation-selection sampling; it makes no assumption about the form of the probability distribution, but such Monte Carlo techniques are computationally expensive.
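A minimal sketch of a bootstrap particle filter for a 1-D toy model; the dynamics, noise levels, and observation model are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_particles = 1000
particles = rng.normal(0.0, 1.0, n_particles)         # samples from an assumed prior
weights = np.full(n_particles, 1.0 / n_particles)

def step(particles, weights, observation):
    # Mutation: propagate each particle through assumed dynamics x' = x + noise
    particles = particles + rng.normal(0.0, 0.5, particles.shape)
    # Weighting: assumed Gaussian measurement model y = x + noise
    weights = weights * np.exp(-0.5 * (observation - particles) ** 2)
    weights /= weights.sum()
    # Selection: resample particles in proportion to their weights
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    particles = particles[idx]
    weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights

particles, weights = step(particles, weights, observation=1.2)
print(particles.mean())   # approximate posterior mean
```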
Kalman filter (KF), aka linear quadratic estimation (LQE), is a Bayes filter with linear dynamics and Gaussian distributions, which has many analytical results. {Aggarwal2013} The Kalman filter is the optimal estimator for linear system models with additive independent white noise in both the transition and the measurement models.
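A minimal sketch of one Kalman filter predict-update cycle; the matrices F, H, Q, R and the initial state are illustrative assumptions (a constant-velocity model), not from the text.

```python
import numpy as np

F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition (constant-velocity model)
H = np.array([[1.0, 0.0]])               # measurement model: observe position only
Q = 0.01 * np.eye(2)                     # process noise covariance
R = np.array([[0.5]])                    # measurement noise covariance

x = np.array([0.0, 1.0])                 # state mean [position, velocity]
P = np.eye(2)                            # state covariance

def kf_step(x, P, z):
    # Predict (uncertainty propagation)
    x = F @ x
    P = F @ P @ F.T + Q
    # Update (Bayesian update with measurement z)
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

x, P = kf_step(x, P, z=np.array([1.2]))
print(x)
```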
Extended Kalman filter (EKF) extends the Kalman filter to nonlinear differentiable dynamic systems by linearization at each time step. EKF is arguably the de facto standard in navigation systems and GPS. {Julier2004}
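A minimal sketch of one EKF step for a scalar nonlinear system; the functions f, h, their Jacobians, and the noise values are illustrative assumptions.

```python
import numpy as np

def f(x): return np.sin(x)           # assumed nonlinear transition
def h(x): return x ** 2              # assumed nonlinear measurement
def F_jac(x): return np.cos(x)       # df/dx, evaluated at the current estimate
def H_jac(x): return 2.0 * x         # dh/dx, evaluated at the predicted estimate

Q, R = 0.01, 0.1                     # process / measurement noise variances
x, P = 0.5, 1.0                      # current mean and variance

def ekf_step(x, P, z):
    # Predict: propagate the mean through f, linearize for the covariance
    F = F_jac(x)
    x = f(x)
    P = F * P * F + Q
    # Update: linearize h around the predicted mean
    H = H_jac(x)
    S = H * P * H + R
    K = P * H / S
    x = x + K * (z - h(x))
    P = (1.0 - K * H) * P
    return x, P

x, P = ekf_step(x, P, z=0.3)
print(x, P)
```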
Unscented Kalman filter (UKF) is another nonlinear Kalman filter, which propagates a deterministic sample (sigma points) of the Gaussian prior through the nonlinearity and estimates the posterior moments from the transformed points.
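A minimal sketch of the unscented transform underlying the UKF, for a scalar Gaussian pushed through a nonlinear function; the function and the scaling parameter kappa are illustrative assumptions.

```python
import numpy as np

def unscented_transform(mean, var, g, kappa=2.0):
    n = 1                                           # scalar state for simplicity
    spread = np.sqrt((n + kappa) * var)
    sigma_points = np.array([mean, mean + spread, mean - spread])
    weights = np.array([kappa / (n + kappa),        # standard sigma-point weights
                        0.5 / (n + kappa),
                        0.5 / (n + kappa)])
    transformed = g(sigma_points)                   # push sigma points through g
    new_mean = np.dot(weights, transformed)
    new_var = np.dot(weights, (transformed - new_mean) ** 2)
    return new_mean, new_var

mean, var = unscented_transform(0.0, 1.0, np.sin)
print(mean, var)
```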
Ensemble Kalman filter (EnKF) is a Monte Carlo approximation of the Kalman filter which can handle high-dimensional state vectors. EnKF is often applied to geophysical models.
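A minimal sketch of an EnKF analysis step with perturbed observations; the observation operator H, noise levels, and tiny state dimension are illustrative assumptions (real geophysical applications use far larger state vectors).

```python
import numpy as np

rng = np.random.default_rng(0)
n_ens, n_state = 50, 3
ensemble = rng.normal(0.0, 1.0, (n_ens, n_state))   # forecast ensemble (rows = members)
H = np.array([[1.0, 0.0, 0.0]])                      # observe the first state component
R = np.array([[0.2]])                                # observation error covariance
z = np.array([0.8])                                  # the observation (assumed)

def enkf_analysis(ensemble, z):
    # Sample covariance of the forecast ensemble
    X = ensemble - ensemble.mean(axis=0)
    P = X.T @ X / (len(ensemble) - 1)
    # Kalman gain built from the ensemble covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    # Update each member against a perturbed copy of the observation
    perturbed = z + rng.multivariate_normal(np.zeros(1), R, size=len(ensemble))
    return ensemble + (perturbed - ensemble @ H.T) @ K.T

analysis = enkf_analysis(ensemble, z)
print(analysis.mean(axis=0))
```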