
The objective of multiple regression is to fit the chosen model to the entered data in order to determine values for the model parameters. The values determined for the parameters are those that will make the predictions of the model come as close as possible to the actual (entered) data. Prism offers three forms of multiple regression.

The first of these three forms is multiple linear regression. Like simple linear regression, multiple linear regression finds the values of the parameters (regression coefficients) in the model that minimize the sum of the squares of the discrepancies between the actual and predicted Y values. In other words, multiple linear regression is a least-squares method. The advantage of this method is that the parameter estimates can be computed directly with fairly simple calculations. Unlike Poisson, logistic, and nonlinear regression, multiple linear regression does not require an iterative approach, and so does not require initial estimated values for the parameters.
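To illustrate why no iteration is needed, here is a minimal sketch of multiple linear regression fit by solving the normal equations directly. The data and the helper function name are hypothetical, not part of Prism; this is just the textbook least-squares calculation in pure Python.

```python
def fit_linear(X, y):
    """Least-squares fit of y = b0 + b1*x1 + ... via the normal equations.

    X is a list of predictor tuples, y a list of responses.
    Returns [intercept, slope1, slope2, ...] -- computed in one pass,
    with no initial values and no iteration.
    """
    # Design matrix with an intercept column of 1s.
    rows = [[1.0] + list(x) for x in X]
    p = len(rows[0])
    # Normal equations: (X^T X) b = X^T y
    A = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(p)]
    # Solve the p x p system by Gaussian elimination with partial pivoting.
    for col in range(p):
        pivot = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        b[col], b[pivot] = b[pivot], b[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for c in range(col, p):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * p
    for i in reversed(range(p)):
        coef[i] = (b[i] - sum(A[i][j] * coef[j]
                              for j in range(i + 1, p))) / A[i][i]
    return coef

# Hypothetical noise-free data generated from y = 2 + 3*x1 - x2,
# so the exact coefficients should be recovered.
X = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 3)]
y = [2 + 3 * x1 - x2 for x1, x2 in X]
print([round(c, 6) for c in fit_linear(X, y)])  # [2.0, 3.0, -1.0]
```

Prism (and any statistics package) uses more numerically careful linear algebra than this sketch, but the key point is the same: the least-squares estimates come from solving one linear system, not from repeated refinement.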

The other two forms of multiple regression that Prism offers are Poisson regression and logistic regression. In both of these cases, parameter estimates are determined by maximizing the likelihood function. Unlike with multiple linear regression, this cannot be achieved simply by using the least-squares method. Instead, an iterative approach is used to determine the parameter estimates that make the observed data most likely given the model. But unlike nonlinear regression, you don't need to specify initial values (or review the suggested values), and you don't need to worry about the possibility that the reported fit is actually a false minimum. That is a potential problem with nonlinear regression, but not with multiple Poisson or logistic regression.
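A minimal sketch of what "iterative maximum likelihood" means, using logistic regression with a single predictor (Poisson regression works analogously). Gradient ascent on the log-likelihood is used here for simplicity; Prism's actual algorithm and these variable names are not taken from the source. Note the starting values of zero, which the user never has to supply.

```python
import math

def fit_logistic(x, y, lr=0.1, iters=5000):
    """Maximize the logistic log-likelihood by gradient ascent.

    x: list of predictor values; y: list of 0/1 outcomes.
    Returns (b0, b1) for the model p = 1 / (1 + exp(-(b0 + b1*x))).
    """
    b0 = b1 = 0.0  # starting values; the user never supplies these
    n = len(x)
    for _ in range(iters):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
            g0 += yi - p         # d(logL)/d(b0)
            g1 += (yi - p) * xi  # d(logL)/d(b1)
        # Step uphill on the log-likelihood surface.
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Hypothetical data: the outcome becomes more likely as x increases,
# so the fitted slope should be positive.
x = [0, 1, 2, 3, 4, 5]
y = [0, 0, 1, 0, 1, 1]
b0, b1 = fit_logistic(x, y)
print(b1 > 0)  # True
```

Because the logistic (and Poisson) log-likelihood is concave in the coefficients, any such iterative climb ends at the single global maximum, which is why false minima are not a concern here the way they are in nonlinear regression.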

© 1995-2019 GraphPad Software, LLC. All rights reserved.