The page below is a sample from the LabCE course Linear Regression Analysis. Access the complete course and earn ASCLS P.A.C.E.-approved continuing education credits by subscribing online.


The Least Squares Line

According to the method of least squares, the line of best fit is the one that minimizes the sum of the squared differences between each data point's observed (experimental) y-value and its expected (theoretical) y-value. This line is known as the least squares regression line.

To calculate the sum of squares for a line, first find each point's expected y-value by substituting its x-value into the line's equation. Next, subtract the expected y-value from the observed y-value for each point. Finally, square each of these differences and sum the results. The lines that were shown on the previous page are calculated below:
| Point | x | Observed y | Expected y (A; B; C) | Difference y − ŷ (A; B; C) | (y − ŷ)² (A; B; C) |
|---|---|---|---|---|---|
| 1 | 10 | 5.0 | 8.0; 10.0; 12.0 | −3.0; −5.0; −7.0 | 9.00; 25.00; 49.00 |
| 2 | 18 | 24.0 | 14.4; 18.0; 22.0 | 9.6; 6.0; 2.0 | 92.16; 36.00; 4.00 |
| 3 | 38 | 27.5 | 30.4; 38.0; 45.0 | −2.9; −10.5; −17.5 | 8.41; 110.25; 306.25 |
| 4 | 50 | 60.0 | 40.0; 50.0; 60.0 | 20.0; 10.0; 0.0 | 400.00; 100.00; 0.00 |
| 5 | 63 | 50.0 | 48.0; 63.0; 74.0 | 2.0; −13.0; −24.0 | 4.00; 169.00; 576.00 |
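The table's calculation can be reproduced in a few lines of code. The sketch below (plain Python, no libraries) takes the observed y-values and the expected y-values for each candidate line directly from the table, computes the squared differences, and sums them; the line names A, B, and C follow the table's labels.

```python
# Five data points from the table above (x-values and observed y-values).
x = [10, 18, 38, 50, 63]
y_obs = [5.0, 24.0, 27.5, 60.0, 50.0]

# Expected y-values for candidate lines A, B, and C, read from the table.
y_exp = {
    "A": [8.0, 14.4, 30.4, 40.0, 48.0],
    "B": [10.0, 18.0, 38.0, 50.0, 63.0],
    "C": [12.0, 22.0, 45.0, 60.0, 74.0],
}

def sum_of_squares(observed, expected):
    """Sum of squared differences between observed and expected y-values."""
    return sum((o - e) ** 2 for o, e in zip(observed, expected))

# Total sum of squares for each candidate line.
ss = {line: sum_of_squares(y_obs, exp) for line, exp in y_exp.items()}

# The line of best fit (among these three) minimizes the sum of squares.
best = min(ss, key=ss.get)
```

Running this yields the totals shown below the table: 513.57 for line A, 440.25 for line B, and 935.25 for line C, so `best` is `"B"`.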
The total sum of squares for line A = 513.57; line B = 440.25; line C = 935.25. As noted above, the line that minimizes this value is the line of best fit according to the least squares method. Therefore, line B is the best-fitting of these three lines.
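Line B is only the best of the three candidates; the true least squares regression line for these five points can be computed directly with the standard formulas, slope b = Σ(x − x̄)(y − ȳ) / Σ(x − x̄)² and intercept a = ȳ − b·x̄. The sketch below applies those formulas to the table's data; the resulting sum of squares comes out below line B's 440.25, as it must, since the least squares line minimizes this quantity over all possible lines.

```python
# Data points from the table (x-values and observed y-values).
x = [10, 18, 38, 50, 63]
y = [5.0, 24.0, 27.5, 60.0, 50.0]
n = len(x)

mean_x = sum(x) / n
mean_y = sum(y) / n

# Standard least-squares formulas:
#   slope     b = sum((x - mean_x)(y - mean_y)) / sum((x - mean_x)^2)
#   intercept a = mean_y - b * mean_x
b = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
     / sum((xi - mean_x) ** 2 for xi in x))
a = mean_y - b * mean_x

# Sum of squared differences for the fitted line.
sse = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
```

For these data the fitted slope is roughly 0.89 with an intercept near 1.44, and the resulting sum of squares (about 385.6) is smaller than line B's 440.25, confirming that none of the three candidate lines is the exact least squares line.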