The idea of likelihood is common to all statistical inference, and is well understood by frequentist and Bayesian statisticians alike.
The relationship between the parameters of a model and the
observables is fundamental to the process of updating knowledge of
parameters based upon the data. The likelihood is sometimes termed
the model, and takes the form of a probability statement $p(X \mid \theta)$, where $X$ are the observable data in the system.
Note that the likelihood is a conditional probability statement as to how likely it is for $X$ to be observed if the parameters take the value $\theta$. In a statistical analysis, it is the knowledge of $\theta$ which is of interest, that is to say, the distribution of $\theta$ given that $X$ is observed. This is termed the posterior, and is dealt with below.
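As a simple illustration, suppose $X$ counts the successes in $n$ independent trials, each with success probability $\theta$; the likelihood then takes the familiar binomial form
$$p(X = x \mid \theta) = \binom{n}{x}\,\theta^{x}(1 - \theta)^{n - x}, \qquad x = 0, 1, \dots, n,$$
which, for fixed $\theta$, assigns probabilities to each possible observation $x$.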
Other methods of inference concentrate on the likelihood in their analysis, in which case the focus is on $p(X \mid \theta)$ as a function of $\theta$ for fixed $X$. Of course, while $\int p(X \mid \theta)\,\mathrm{d}X = 1$, the same is not true of the integral with respect to $\theta$. For this reason, and to avoid confusion, the likelihood is sometimes written $L(\theta; X)$.
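Continuing the binomial illustration, a minimal numerical sketch (using the hypothetical values $n = 10$ and $x = 7$) makes this asymmetry concrete: summing $p(x \mid \theta)$ over all possible $x$ for fixed $\theta$ gives 1, whereas integrating $L(\theta; x)$ over $\theta \in [0, 1]$ for fixed $x$ gives $1/(n + 1)$, not 1.

```python
from scipy.stats import binom
from scipy.integrate import quad

n, x, theta = 10, 7, 0.4  # hypothetical example values

# For fixed theta, p(X | theta) is a probability distribution over X:
# summing over all possible observations x = 0, ..., n gives 1.
sum_over_x = sum(binom.pmf(k, n, theta) for k in range(n + 1))
print(sum_over_x)  # 1.0 (up to floating-point error)

# For fixed X = x, the likelihood L(theta; x) = p(x | theta) is not a
# density in theta: its integral over [0, 1] is 1/(n + 1), not 1.
integral_over_theta, _ = quad(lambda t: binom.pmf(x, n, t), 0, 1)
print(integral_over_theta)  # ~0.0909, i.e. 1/11
```

The value $1/(n + 1)$ holds for every $x$ in this binomial case, which underlines the point: the likelihood carries relative, not absolute, probability information about $\theta$.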