The normalized covariance is reported for each pair of parameters, and quantifies the degree to which those two parameters are intertwined. Check the corresponding check box on the Diagnostics tab of the nonlinear regression dialog to view this covariance matrix.
Each value in the normalized covariance matrix ranges from -1.0 to 1.0. A value of -1.0 or 1.0 means the two parameters are redundant: any change in one can be fully compensated by a change in the other. A value of 0.0 means the parameters are completely independent or orthogonal -- if you change the value of one parameter, the fit gets worse, and no change in the other parameter can make it better. You can interpret a normalized covariance much as you interpret a correlation coefficient.
Note the difference between covariance and dependency. Each value in the covariance matrix tells you how much two parameters are intertwined. In contrast, each dependency value tells you how much that parameter is intertwined with all other parameters.
Some other programs report the actual (not normalized) variance-covariance matrix. You can compute the actual covariance, Cov(i, j), of any two distinct parameters (i does not equal j) from the normalized matrix Prism reports, NormCov(i, j), and the standard errors of the parameters, using this equation:
Cov(i, j) = NormCov(i, j) * SE(i) * SE(j)
Prism does not report the normalized covariance matrix for a parameter with itself, because the normalized covariance of any parameter with itself equals, by definition, 1.0. The covariance of any parameter with itself is better called its variance. You can calculate the variance of any parameter (a diagonal value in the variance-covariance matrix) using this equation:
Cov(i, i) = SE(i)^2
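The two equations above can be combined to rebuild the full variance-covariance matrix in one step. The sketch below uses NumPy with made-up illustrative numbers (the normalized covariance value and standard errors are hypothetical, not Prism output); the off-diagonal entries come from Cov(i, j) = NormCov(i, j) * SE(i) * SE(j), and the diagonal entries reduce to SE(i)^2 because each parameter's normalized covariance with itself is 1.0.

```python
import numpy as np

# Hypothetical normalized covariance matrix for a two-parameter fit.
# The diagonal is 1.0 by definition (each parameter with itself).
norm_cov = np.array([[1.00, -0.62],
                     [-0.62, 1.00]])

# Hypothetical standard errors of the two parameters.
se = np.array([0.15, 2.3])

# Cov(i, j) = NormCov(i, j) * SE(i) * SE(j).
# np.outer(se, se) builds the matrix of SE(i) * SE(j) products,
# so an elementwise multiply applies the equation to every pair at once.
cov = norm_cov * np.outer(se, se)

# The diagonal entries are the variances: Cov(i, i) = SE(i)^2.
print(np.diag(cov))
print(cov)
```

Since `norm_cov` is symmetric with unit diagonal, `cov` is symmetric too, and its diagonal equals `se**2`, matching the variance equation above.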