
Model or Likelihood

The idea of likelihood is common to all statistical inference, and is well understood by frequentist and Bayesian statisticians alike.

The relationship between the parameters of a model and the observables is fundamental to the process of updating knowledge of the parameters based upon the data. The likelihood is sometimes termed the model, and takes the form of a probability statement $p(X \mid \theta)$, where $X$ denotes the observable data in the system and $\theta$ the parameters.

Note that the likelihood is a conditional probability statement: it describes how likely it is for $X$ to be observed if the parameters take the value $\theta$. In a statistical analysis it is knowledge of $\theta$ which is of interest, that is to say, the distribution of $\theta$ given that $X$ is observed. This is termed the posterior, and is dealt with below.

Other methods of inference concentrate on the likelihood in their analysis, in which case the focus is $p(X \mid \theta)$ as a function of $\theta$ for fixed $X$. Of course, while $\int p(X \mid \theta)\,dX = 1$, the same is not true of the integral with respect to $\theta$. For this reason, and to avoid confusion, the likelihood is sometimes written $L(\theta \mid X)$.
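This asymmetry is easy to check numerically. The sketch below (illustrative only; the binomial model and the particular values of $n$, $\theta$, and the observed $x$ are assumptions, not taken from the text) sums $p(x \mid \theta)$ over all observable $x$ for a fixed $\theta$, and then integrates the same expression over $\theta$ for a fixed observation: the first is exactly 1, the second is not.

```python
import math

def binomial_pmf(x, n, theta):
    """p(x | theta): probability of x successes in n Bernoulli trials."""
    return math.comb(n, x) * theta**x * (1 - theta)**(n - x)

n = 10
theta = 0.3

# Summing over the observable x for fixed theta gives exactly 1:
# the likelihood is a genuine probability distribution in X.
total_over_x = sum(binomial_pmf(x, n, theta) for x in range(n + 1))

# Integrating over theta for a fixed observation x = 4 (crude Riemann
# sum on a fine grid) does NOT give 1, so L(theta | x) is not a
# probability distribution in theta.
x_obs = 4
steps = 100_000
integral_over_theta = sum(
    binomial_pmf(x_obs, n, i / steps) for i in range(steps)
) / steps

print(total_over_x)        # 1.0
print(integral_over_theta) # about 0.0909, i.e. 1/(n + 1), not 1
```

For this binomial example the integral over $\theta$ happens to equal $1/(n+1)$ for any observed $x$, which makes the point concrete: normalisation holds in the data, not in the parameter.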



Cathal Walsh
Sat Jan 22 17:09:53 GMT 2000