Of interest to the modeller, then, is the conditional distribution of the parameters, given the data, that is $p(\theta \mid X)$. Bayes' Theorem for random variables [27] yields
\[
p(\theta \mid X) \propto p(X \mid \theta)\, p(\theta).
\]
The distribution $p(\theta \mid X)$ is termed the posterior distribution and describes the current state of knowledge about $\theta$, given the initial knowledge of $\theta$, $p(\theta)$, together with the model, such knowledge having been updated by the information $X$. The constant of proportionality in the above is just $1/p(X)$, where $p(X)$ can be obtained from
\[
p(X) = \int p(X \mid \theta)\, p(\theta)\, \mathrm{d}\theta .
\]
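As a concrete illustration of how these quantities fit together (a standard conjugate example, not the model considered in this work), suppose the prior is $\theta \sim \mathrm{Beta}(a, b)$ and the data are $X \sim \mathrm{Binomial}(n, \theta)$ with $k$ successes observed. Then
\[
p(\theta \mid X = k) \propto \binom{n}{k}\,\theta^{k}(1-\theta)^{n-k}\,\frac{\theta^{a-1}(1-\theta)^{b-1}}{B(a,b)} \propto \theta^{a+k-1}(1-\theta)^{b+n-k-1},
\]
so the posterior is $\mathrm{Beta}(a+k,\, b+n-k)$ and the normalising constant is
\[
p(X = k) = \binom{n}{k}\,\frac{B(a+k,\, b+n-k)}{B(a, b)} .
\]
In this conjugate case the integral defining $p(X)$ is available in closed form; in general it is not, which is one source of the practical difficulties noted below.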
The Bayesian method is then, in outline, quite straightforward [16]: specify the prior $p(\theta)$, construct the likelihood $p(X \mid \theta)$, and apply Bayes' Theorem to obtain the posterior $p(\theta \mid X)$, normalising by $p(X)$ where required. In practice these tasks can be difficult to implement, and more is said about the details later.
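To make those steps concrete, the following is a minimal numerical sketch (not the implementation used in this work), assuming a single scalar parameter, a Gaussian likelihood with known variance, and an illustrative Gaussian prior; the posterior and $p(X)$ are approximated on a grid.

    import numpy as np
    from scipy.stats import norm

    # Hypothetical data: observations assumed drawn from N(theta, 1),
    # with the mean theta as the unknown parameter (illustrative choice only).
    X = np.array([0.8, 1.3, 0.4, 1.1, 0.9])

    # Grid over the parameter and the spacing used for numerical integration.
    theta = np.linspace(-5.0, 5.0, 2001)
    dtheta = theta[1] - theta[0]

    # 1. Prior p(theta): initial knowledge about theta, here N(0, 2^2).
    prior = norm.pdf(theta, loc=0.0, scale=2.0)

    # 2. Likelihood p(X | theta), evaluated at every grid point.
    likelihood = norm.pdf(X[:, None], loc=theta, scale=1.0).prod(axis=0)

    # 3. Bayes' Theorem: the posterior is proportional to likelihood times prior.
    unnormalised = likelihood * prior

    # 4. Normalising constant p(X) = integral of p(X | theta) p(theta) dtheta,
    #    approximated by a Riemann sum over the grid.
    p_X = (unnormalised * dtheta).sum()
    posterior = unnormalised / p_X

    print("p(X) approx:", p_X)
    print("posterior mean approx:", (theta * posterior * dtheta).sum())

The grid approximation is viable only for low-dimensional parameters; for realistic models the integral defining $p(X)$ must usually be handled by other means, which is where the practical difficulty lies.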