Definition of least squares
The comparison method called least squares rests on the fact that although a difference can be positive or negative, the square of any real number is never negative. So when there are several differences, the sum of the squares of those differences measures how close two sets of values are, and the best "fit" is the one with the least sum of squared differences; hence the name least squares.
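The idea above can be written as a short Python sketch (the function name is illustrative, not from the original text):

```python
def sum_of_squared_differences(actual, predicted):
    # Squaring each difference discards its sign, so the total only
    # measures how far apart the two sets of values are overall.
    return sum((a - p) ** 2 for a, p in zip(actual, predicted))
```

A smaller result means a closer fit; a result of 0 means the two sets of values agree exactly.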
Least Squares Estimation
For example, suppose an experiment produced the values 2, 5, 4 at times 0, 1, 2. Let x be the time and y the actual result obtained. First we compare y with the values, at the times x, of a straight line; call these values y1, given by y1 = 2 + x, which are 2, 3 and 4.
x   y  |  y1   y2   y3
0   2  |  2    2.7  2
1   5  |  3    3.7  5
2   4  |  4    4.7  4
At time x = 0 the difference d = y − y1 = 0, its square d² = 0, and the sum of the d² so far is 0.
At time x = 1 the difference d = y − y1 = 2, its square d² = 4, and the sum of the d² so far is 4.
At time x = 2 the difference d = y − y1 = 0, its square d² = 0, and the sum of all the d² is 4.
Trying another straight line, y2 = 2.7 + x, gives the differences 2 − 2.7 = −0.7, squared 0.49; then 5 − 3.7 = 1.3, squared 1.69; and 4 − 4.7 = −0.7, squared 0.49. The total of the squares is 0.49 + 1.69 + 0.49 = 2.67; being less than 4, this is the least sum of squares so far.
Now let us try a quadratic, y3 = 2 + 5x − 2x², whose values are shown above. Here the sum of the squares of the differences is 0, the least possible, and therefore y3 is the best fit.
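The comparison of the three candidate curves can be reproduced in a short Python sketch (names such as sse are illustrative):

```python
def sse(actual, predicted):
    # Sum of squared errors: the quantity the least-squares method minimizes.
    return sum((a - p) ** 2 for a, p in zip(actual, predicted))

x = [0, 1, 2]
y = [2, 5, 4]  # the experimental results

candidates = {
    "y1 = 2 + x":        [2 + t for t in x],
    "y2 = 2.7 + x":      [2.7 + t for t in x],
    "y3 = 2 + 5x - 2x²": [2 + 5 * t - 2 * t * t for t in x],
}

for name, values in candidates.items():
    print(name, "->", sse(y, values))
```

The curve with the smallest sum of squares, here y3 with a sum of exactly 0, is the least-squares best fit among the three candidates.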