
Prism can report several values that indicate how well the specified model compares to other models fit to the same data. Specifically, Prism will report Akaike’s corrected Information Criterion (AICc), the negative log likelihood, and the model deviance for both the specified model and an intercept-only model. Each of these values is described briefly below.

Akaike’s corrected Information Criterion (AICc)

This value is derived from an information theory approach that attempts to determine how well the data fit the model. It depends both on the model deviance (described below) as well as the number of parameters in the model. Note that many other software packages will report simply AIC. This is an uncorrected form of Akaike’s Information Criterion, and has been shown to select models with too many parameters (i.e. it will overfit) when sample sizes are small. Below are the equations needed to calculate AIC and AICc, and how to convert between the two.

AIC = 2*k + Deviance

where k is the number of parameters in the model (reported by Prism in the Data summary section of the results)

AICc = AIC + [(2k² + 2k)/(n – k – 1)]

where n is the sample size/number of observations (reported by Prism in the Data summary section of the results)

Note that the equations for AIC and AICc are a bit different for nonlinear regression. Nonlinear regression (and multiple linear regression) essentially fits the value of the sum of squares as well, so k in the equations above is replaced by k+1.
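
As a concrete illustration, here is a short Python sketch of the two formulas above. This is not Prism’s own code, and the deviance, k, and n values are made-up numbers used only for the example:

def aic(deviance, k):
    # AIC = 2*k + deviance
    return 2 * k + deviance

def aicc(deviance, k, n):
    # AICc = AIC + (2k² + 2k)/(n – k – 1)
    return aic(deviance, k) + (2 * k**2 + 2 * k) / (n - k - 1)

# Example: a 3-parameter logistic model fit to 25 observations with a deviance of 20.0
print(aic(20.0, 3))        # 26.0
print(aicc(20.0, 3, 25))   # 26.0 + 24/21 ≈ 27.14

# For nonlinear (or multiple linear) regression, replace k with k + 1:
# aicc(deviance, k + 1, n)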

Negative log likelihood

Although the concept of likelihood is somewhat complex, this is simply another metric by which you can assess how well the specified model fits to the entered data. Mathematically, the log likelihood is:

Log likelihood = Σ(ln(Predicted probabilities for values entered as 1)) + Σ(ln(1 - Predicted probabilities for values entered as 0))

Because the predicted probabilities are all less than 1 (and 1 – predicted probabilities are also less than 1), the natural logs of these values are all negative. Thus, the sum of all of these negative values is also negative. Therefore, we take the negative of this value to get a positive value for the “Negative log likelihood” (confusing, I know).

In general, when comparing two models fit to the same data, the model with the larger log likelihood (smaller negative log likelihood) is considered to be a better "fit".
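
Below is a minimal Python sketch of this calculation. It is not Prism’s code; the observed 0/1 values and predicted probabilities are made-up numbers used only to illustrate the formula:

import math

def negative_log_likelihood(y, p):
    # y: observed 0/1 values; p: predicted probabilities that y = 1
    log_lik = sum(math.log(pi) if yi == 1 else math.log(1 - pi)
                  for yi, pi in zip(y, p))
    return -log_lik  # flip the sign so the reported value is positive

y = [1, 1, 0, 1, 0]
p = [0.8, 0.6, 0.3, 0.9, 0.2]
print(negative_log_likelihood(y, p))  # ≈ 1.42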

Model Deviance

This final metric that Prism reports can be used to assess how well a model fits to the data, and also uses the likelihood of the model. As seen above, deviance is also used directly in calculating AIC (and AICc) for logistic regression. Fortunately, once you’ve calculated the negative log likelihood, calculating the model deviance is simple:

Model deviance = 2*(negative log likelihood)

Sometimes you’ll see this as

Deviance = -2*ln(Likelihood), or even as

Deviance = -ln(Likelihood²)

This is why deviance is sometimes referred to as “G squared”.
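
The following Python sketch simply confirms that these three expressions give the same number; the starting negative log likelihood of 1.42 is an arbitrary example value:

import math

neg_log_lik = 1.42
likelihood = math.exp(-neg_log_lik)

deviance_a = 2 * neg_log_lik               # 2*(negative log likelihood)
deviance_b = -2 * math.log(likelihood)     # -2*ln(Likelihood)
deviance_c = -math.log(likelihood ** 2)    # -ln(Likelihood²)

print(deviance_a, deviance_b, deviance_c)  # all ≈ 2.84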
