Sufficiency Principle

A statistic is sufficient with respect to a statistical model and its associated unknown parameter if no other statistic that can be calculated from the same sample provides any additional information as to the value of that parameter. [@Fisher1922]

Notes on the Sufficiency Principle
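
As a concrete illustration (a standard textbook example, not part of the original note): for $n$ independent Bernoulli trials $X_1, \dots, X_n$ with success probability $\theta$, the number of successes $T = \sum_i X_i$ is sufficient for $\theta$. By the Fisher–Neyman factorization theorem, the joint probability mass function factors as

$$ f(x_1, \dots, x_n; \theta) = \prod_{i=1}^{n} \theta^{x_i} (1 - \theta)^{1 - x_i} = \theta^{t} (1 - \theta)^{n - t} = g(t; \theta)\, h(x), $$

with $t = \sum_i x_i$ and $h(x) \equiv 1$, so the data enter the model only through $t$. Equivalently, the conditional distribution of the sample given $T = t$ is uniform over the $\binom{n}{t}$ possible arrangements and does not involve $\theta$: once $T$ is known, the individual $x_i$ provide no additional information about $\theta$.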

Conditionality Principle

Informally, the conditionality principle can be taken as the claim that experiments that were not actually performed are statistically irrelevant: the evidence supplied by an outcome depends only on the component experiment that was actually carried out.

Conditionality Principle: If $E$ is any experiment having the form of a mixture of component experiments $E_h$, then for each outcome $(E_h, x_h)$ of $E$, [...] the evidential meaning of the outcome $(E_h, x_h)$ of the mixture experiment $E$ is the same as that of the corresponding outcome $x_h$ of the corresponding component experiment $E_h$, ignoring the over-all structure of the mixed experiment. [@Birnbaum1962]
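
A standard illustration of conditioning on the component actually performed (the well-known two-instruments example, usually credited to Cox; added here as a sketch, not part of the original note): a fair coin chooses between a precise instrument giving $X \sim N(\mu, \sigma_1^2)$ and an imprecise one giving $X \sim N(\mu, \sigma_2^2)$, with $\sigma_2 \gg \sigma_1$. If the coin selects the precise instrument $E_1$ and the value $x_1$ is observed, the principle asserts, in Birnbaum's evidential-meaning notation,

$$ \mathrm{Ev}\bigl(E, (E_1, x_1)\bigr) = \mathrm{Ev}(E_1, x_1), $$

so inference about $\mu$ should be reported with precision $\sigma_1$, conditioning on the instrument actually used rather than averaging over the instrument that might have been used but was not.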

Likelihood Principle

Likelihood Principle: All the evidence about an unknown parameter that an experimental outcome provides is contained in the likelihood function of that parameter given the data; two outcomes with proportional likelihood functions carry the same evidential meaning. [@Birnbaum1962]

Birnbaum [@Birnbaum1962] proved that the sufficiency principle and the conditionality principle together imply the likelihood principle. Although the relevance of this proof to data analysis remains controversial among statisticians, many Bayesians and likelihoodists consider the likelihood principle foundational for statistical inference.
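
A standard illustration of what the principle asserts (a textbook example, not part of the original note): suppose 9 successes and 3 failures are observed in Bernoulli($\theta$) trials. Under binomial sampling (stop after $n = 12$ trials) and under negative binomial sampling (stop at the 3rd failure), the likelihoods are

$$ L_{\mathrm{bin}}(\theta) = \binom{12}{9}\, \theta^{9} (1 - \theta)^{3}, \qquad L_{\mathrm{nb}}(\theta) = \binom{11}{2}\, \theta^{9} (1 - \theta)^{3}, $$

which are proportional in $\theta$. The likelihood principle therefore says that the two outcomes carry the same evidence about $\theta$, even though frequentist quantities that depend on the sampling plan, such as the exact p-value for testing $\theta = 1/2$, differ between the two experiments.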


🏷 Category=Statistics