KNOWLEDGEBASE - ARTICLE #1399

Post tests following one-way ANOVA are incorrect when one column is entirely excluded.

 A bug in Prism 4 and 5 (fixed in 5.02 and 5.0b) results in incorrect post tests following one-way ANOVA when a column is entirely excluded. 

You can choose which columns (data sets) are included in the analysis from the Analyze dialog, or later using Change...Data analyzed. This works perfectly: data sets that are deselected in this way do not affect the results.

Prism also lets you exclude individual values using a button and a command on the Edit menu. Excluded values are shown in blue italics on the data table and are ignored by analyses and graphs. ANOVA works fine when some values are excluded. The bug emerges when all the values in a data set column are excluded.

The ANOVA calculations themselves are correct. Prism sees that such a column is empty and does not count it when computing degrees of freedom, so the P value and ANOVA table are correct. But the post test calculations are confused by columns that were part of the analysis (selected in the Analyze dialog) yet contain only excluded values. Prism tries to make phantom comparisons involving those columns and reports them with obviously nonsensical values; their confidence intervals are reported as ranging from "0.0 to 0.0". Worse, when Prism corrects for multiple comparisons, it counts these bogus comparisons in the total number of comparisons, so the results for the genuine comparisons are also incorrect.
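To see why counting phantom comparisons distorts the real ones, here is a minimal Python sketch. It uses a Bonferroni-style correction as a simple stand-in for Prism's post tests, with made-up raw P values and a hypothetical "empty" column; it is not Prism's actual algorithm, only an illustration of how an inflated comparison count changes the adjusted results.

from itertools import combinations

def bonferroni_adjust(p_values, n_comparisons):
    # Bonferroni-adjust raw P values given the total number of comparisons.
    return [min(1.0, p * n_comparisons) for p in p_values]

# Hypothetical example: three real groups plus one column whose values
# are all excluded, and made-up raw P values for the three real pairs.
groups = ["A", "B", "C"]
raw_p = [0.020, 0.004, 0.150]

# Correct: count only comparisons among columns that contain data.
m_correct = len(list(combinations(groups, 2)))              # 3 comparisons

# Buggy: the entirely excluded column is still counted as a group,
# so the all-pairs comparison count is inflated from 3 to 6.
m_buggy = len(list(combinations(groups + ["empty"], 2)))    # 6 comparisons

print("correct:", bonferroni_adjust(raw_p, m_correct))      # [0.06, 0.012, 0.45]
print("buggy:  ", bonferroni_adjust(raw_p, m_buggy))        # [0.12, 0.024, 0.90]

With the inflated count, every adjusted P value is doubled, so the correction is more conservative than it should be, and conclusions about the genuine comparisons can change.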

The workaround is never to include an entirely excluded column in a one-way ANOVA analysis. Instead, use Change...Data analyzed to remove that data set from the analysis.

This bug is fixed in Prism 5.02 and 5.0b. 
