Logistic Models

Log-likelihood is the log of the likelihood that an event would have yielded a specific outcome. It ranges from 0 to minus infinity (negative because the log of any number less than 1 is negative). Probability refers to future outcomes, while likelihood refers to past ones. Log-likelihood (LL) is thus a measure of variation, sometimes also referred to as uncertainty.

The negative log-likelihood is the negative of the log of the probability of an observed response. Minimizing the negative log-likelihood function thus produces maximum likelihood estimates for a particular effect.
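As a minimal sketch of this idea (in Python with numpy, using made-up Bernoulli data; the function and variable names are hypothetical), the negative log-likelihood can be evaluated over a grid of candidate probabilities and minimized directly:

    import numpy as np

    # Hypothetical data: 10 Bernoulli trials, 7 successes.
    outcomes = np.array([1, 1, 0, 1, 1, 1, 0, 1, 0, 1])

    def neg_log_likelihood(p, y):
        # Negative log of the probability of the observed responses under Bernoulli(p).
        return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

    # Evaluate over a grid of candidate probabilities; the minimizer is the MLE.
    grid = np.linspace(0.01, 0.99, 99)
    nll = [neg_log_likelihood(p, outcomes) for p in grid]
    p_hat = grid[np.argmin(nll)]
    print(p_hat)  # 0.7

Here the minimum of the negative log-likelihood falls at the sample proportion, which is the maximum likelihood estimate of a Bernoulli probability.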

The likelihood ratio compares the maximum value of the likelihood function under the constraint of the null hypothesis with the maximum without that constraint.

First, the negative log-likelihood (i.e., the uncertainty) is calculated for the case where no model is assumed (e.g., the probabilities are fixed at equal background rates). Then the negative log-likelihood (uncertainty) is calculated after fitting the model.

The difference between these two negative log-likelihoods is the reduction in uncertainty due to fitting the model. Two times this value is the likelihood-ratio chi-square test statistic.
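A sketch of this procedure in Python (assuming statsmodels and simulated data; the coefficients 0.5 and 1.2 are arbitrary) fits an intercept-only null model and a full model, then doubles the gain in log-likelihood:

    import numpy as np
    import statsmodels.api as sm

    # Simulated data with a genuine logistic relationship between x and y.
    rng = np.random.default_rng(0)
    x = rng.normal(size=200)
    p = 1 / (1 + np.exp(-(0.5 + 1.2 * x)))
    y = rng.binomial(1, p)

    # Null model: intercept only, i.e., a single fixed background rate.
    null = sm.Logit(y, np.ones_like(y)).fit(disp=0)
    # Full model: intercept plus the predictor x.
    full = sm.Logit(y, sm.add_constant(x)).fit(disp=0)

    # Twice the reduction in negative log-likelihood is the
    # likelihood-ratio chi-square statistic (here with 1 degree of freedom).
    lr_stat = 2 * (full.llf - null.llf)
    print(lr_stat)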

An advantage of log-likelihood ratios is that log-likelihood terms are additive (see replicated goodness-of-fit tests). The log-likelihood ratio can be used to assess goodness of fit (the G-test), much as the chi-square test is. The latter is, in essence, nothing more than an approximation of the log-likelihood ratio for instances when the calculation of LL was considered too laborious.

χ² = Σ [(observed − expected)² / expected]

G = 2Σ [observed × ln(observed / expected)]
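Computed side by side for a small, hypothetical set of counts (a numpy sketch; the 60/40 split is made up), the two statistics agree closely:

    import numpy as np

    # Hypothetical counts: 60 observed in one category, 40 in the other,
    # against an expected 50/50 split.
    observed = np.array([60, 40])
    expected = np.array([50, 50])

    chi2 = np.sum((observed - expected) ** 2 / expected)
    g = 2 * np.sum(observed * np.log(observed / expected))

    print(chi2)  # 4.0
    print(g)     # about 4.03

Both values would be referred to a chi-square distribution with one degree of freedom.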


last modified: 3/18/08