Advantages of using Prism's nonlinear regression analysis to fit straight lines
Linear regression is just a simpler, special case of nonlinear regression. The calculations are a bit easier (but that only matters to programmers). You can use Prism's nonlinear regression analysis to fit a straight-line model, and the results will be identical to those of linear regression.
Here are some options that Prism offers with nonlinear regression, but not with linear regression:
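Prism does this through its analysis dialogs, but the equivalence itself is easy to sketch outside Prism. The following is a minimal illustration (made-up data, using Python's SciPy rather than Prism's own code): fitting the straight-line model Y = slope·X + intercept with a general nonlinear least-squares routine gives the same best-fit slope and intercept as ordinary linear regression.

```python
# Sketch only (not Prism): a straight line fit by nonlinear least squares
# matches ordinary linear regression. Data values are made up for illustration.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import linregress

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# "Nonlinear" regression applied to a linear model
def line(x, slope, intercept):
    return slope * x + intercept

popt, pcov = curve_fit(line, x, y)

# Ordinary linear regression on the same data
lr = linregress(x, y)

print(popt)                    # best-fit slope and intercept from curve_fit
print(lr.slope, lr.intercept)  # identical values from linear regression
```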
- Fit to both a linear and nonlinear model, and compare the two models.
- Apply differential weighting.
- Identify, and possibly exclude, outliers.
- Use a robust fitting method.
- Perform a normality test on the residuals.
- Inspect the correlation matrix and the dependency of each parameter.
- Compare the scatter of points from the line with the scatter among replicates with the replicates test.
- Enter data as mean, SD (or SEM) and n, and have Prism take the SD and n into account when fitting the line. With linear regression, Prism only fits the means and ignores the values you entered for SD (or SEM) and n.
- Perform segmental linear regression.
- Test whether the best-fit value of the slope differs significantly from 1.0 (or any other value); see the sketch after this list.
- Report the best-fit values with 90% confidence limits (or any other level). Linear regression reports only 95% confidence intervals; nonlinear regression lets you choose the confidence level you want.
- Report the results of interpolation from the line/curve along with 95% confidence intervals of the predicted values.
- With linear regression, the SE of the slope is always reported with the slope as a plus/minus value. With nonlinear regression, the SE values are a separate block of results that can be copied and pasted elsewhere.
- Use global nonlinear regression to fit one line to several data sets.
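As an illustration of one of these options, the test of whether the slope differs from a hypothesized value (such as 1.0) amounts to a t test using the standard error of the slope. Prism performs this for you; the sketch below (made-up data, Python's SciPy, not Prism's own implementation) shows the underlying calculation.

```python
# Sketch only (not Prism): test whether the best-fit slope differs from a
# hypothesized value (here 1.0) using the SE of the slope from the fit.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import t as t_dist

def line(x, slope, intercept):
    return slope * x + intercept

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([1.3, 2.2, 3.6, 4.1, 5.6, 6.4])

popt, pcov = curve_fit(line, x, y)
slope, intercept = popt
se_slope = np.sqrt(pcov[0, 0])   # standard error of the slope

# t statistic for H0: slope == 1.0, with n - 2 degrees of freedom
hypothesized = 1.0
t_stat = (slope - hypothesized) / se_slope
df = len(x) - 2
p_value = 2 * t_dist.sf(abs(t_stat), df)

print(f"slope = {slope:.3f} +/- {se_slope:.3f}, t = {t_stat:.2f}, p = {p_value:.3f}")
```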