

The goal of nonlinear regression is to adjust the values of the model's parameters to find the curve that best predicts Y from X. More precisely, the goal of regression is to minimize the sum of the squares of the vertical distances of the points from the curve.
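This idea can be sketched in a few lines of Python. The data, the exponential-decay model, and the starting values below are all hypothetical, chosen only to illustrate how an optimizer searches parameter space for the curve that minimizes the sum of squared vertical distances:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical data, roughly following y = 10 * exp(-0.5 * x).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([10.0, 6.1, 3.6, 2.3, 1.4])

def model(x, a, b):
    """Hypothetical model: exponential decay with parameters a and b."""
    return a * np.exp(-b * x)

def sum_of_squares(params):
    a, b = params
    residuals = y - model(x, a, b)  # vertical distances of points from curve
    return np.sum(residuals ** 2)

# Nonlinear regression adjusts a and b to minimize the sum of squares.
result = minimize(sum_of_squares, x0=[1.0, 1.0])
a_fit, b_fit = result.x
```

Any general-purpose minimizer will do for a sketch like this; dedicated routines such as `scipy.optimize.curve_fit` wrap the same idea with conveniences for regression problems.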
Why minimize the sum of the squares of the distances? Why not simply minimize the sum of the actual distances?
If the random scatter follows a Gaussian distribution, it is far more likely to have two medium-sized deviations (say 5 units each) than to have one small deviation (1 unit) and one large one (9 units). A procedure that minimized the sum of the absolute values of the distances would have no preference between a curve that was 5 units away from two points and one that was 1 unit away from one point and 9 units from another: the sum of the distances (more precisely, the sum of the absolute values of the distances) is 10 units in each case. A procedure that minimizes the sum of the squares of the distances, however, prefers to be 5 units away from two points (sum of squares = 50) rather than 1 unit away from one point and 9 units away from another (sum of squares = 82). If the scatter is Gaussian (or nearly so), the curve determined by minimizing the sum of squares is the one most likely to be correct.
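The arithmetic behind the comparison can be checked directly. This short snippet uses the two deviation scenarios from the text (5 and 5 versus 1 and 9):

```python
# Two scatter scenarios with the same total absolute distance from the curve.
deviations_a = [5, 5]  # two medium-sized deviations
deviations_b = [1, 9]  # one small deviation, one large

sum_abs_a = sum(abs(d) for d in deviations_a)  # sum of absolute distances: 10
sum_abs_b = sum(abs(d) for d in deviations_b)  # also 10 -- no preference

sum_sq_a = sum(d ** 2 for d in deviations_a)   # sum of squares: 50
sum_sq_b = sum(d ** 2 for d in deviations_b)   # sum of squares: 82
```

Minimizing absolute distances cannot distinguish the two scenarios (10 vs. 10), while minimizing the sum of squares correctly favors the first (50 vs. 82).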