Skowhegan Area High School

Least Squares Regression

"r" and residuals

Explanation

Least squares fitting finds the line that makes the total of the squared vertical distances (the residuals) between the data points and the line as small as possible. Squaring is what gives the method its personality. If a point is really far from the line, squaring its distance makes that distance count much more heavily. On the other hand, if the distance between the point and the line is small (less than 1), squaring makes it even smaller (0.5^2 = 0.25). So we get this effect of really disliking points far away from the line and barely worrying about points close to it, which pulls the fit toward a line where no point is wildly far off. Another characteristic of the least squares line is that it always passes through the centroid of the data, (x̄, ȳ), the point whose coordinates are the means of the x values and the y values. Thank you, Dr. Math.
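The idea above can be sketched in a few lines of plain Python. This is a minimal illustration, not calculator code; the data points are made up, and the slope formula used is the standard one, slope = Σ(x − x̄)(y − ȳ) / Σ(x − x̄)². The last line checks the centroid fact: plugging x̄ into the fitted line gives back ȳ.

```python
# Made-up data for illustration.
xs = [1, 2, 3, 4, 5]
ys = [2, 4, 5, 4, 6]

n = len(xs)
x_bar = sum(xs) / n          # mean of x
y_bar = sum(ys) / n          # mean of y

# slope = sum((x - x̄)(y - ȳ)) / sum((x - x̄)^2)
slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
         / sum((x - x_bar) ** 2 for x in xs))
intercept = y_bar - slope * x_bar

# The least squares line passes through the centroid (x̄, ȳ):
assert abs((slope * x_bar + intercept) - y_bar) < 1e-9
print(slope, intercept)  # 0.8 1.8
```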

How to use it

To use least squares regression, enter your data into L1 and L2. Once your data has been entered, press the STAT button and scroll over to the CALC menu. Scroll down until you find LinReg and press ENTER. Then press 2nd 1 (for L1), comma, 2nd 2 (for L2), and press ENTER again. The next screen shows the best fit line. The r value tells you how strong the linear relationship is: the closer r is to 1 or -1, the more closely the points follow the line. (On a TI-83/84 you may need to run DiagnosticOn, found in the CATALOG under 2nd 0, for r to show up.)
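For anyone curious what the calculator is computing when it reports r, here is a small sketch in plain Python using the standard formula r = Σ(x − x̄)(y − ȳ) / √(Σ(x − x̄)² · Σ(y − ȳ)²). The function name and the sample data are made up for illustration.

```python
import math

def pearson_r(xs, ys):
    """Pearson's correlation coefficient r for two equal-length lists."""
    n = len(xs)
    x_bar, y_bar = sum(xs) / n, sum(ys) / n
    sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    sxx = sum((x - x_bar) ** 2 for x in xs)
    syy = sum((y - y_bar) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Points that lie exactly on an increasing line give r = 1.
print(pearson_r([1, 2, 3, 4], [3, 5, 7, 9]))  # 1.0
```

Note that r close to 1 or -1 means the points hug a line tightly; it does not by itself prove the line predicts well outside the data.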

Example

L1  L2  L3
 1   1   0
 2   3   1
 3   5   2
 4   7   9
 5   8  11
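As a rough check of what LinReg would report for the L1 and L2 columns of the table above (this sketch assumes you are regressing L2 on L1; the arithmetic follows the same formulas as before):

```python
import math

# The L1 and L2 columns from the table above.
l1 = [1, 2, 3, 4, 5]
l2 = [1, 3, 5, 7, 8]

n = len(l1)
x_bar, y_bar = sum(l1) / n, sum(l2) / n
sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(l1, l2))
sxx = sum((x - x_bar) ** 2 for x in l1)
syy = sum((y - y_bar) ** 2 for y in l2)

slope = sxy / sxx                    # "a" in y = ax + b
intercept = y_bar - slope * x_bar    # "b" in y = ax + b
r = sxy / math.sqrt(sxx * syy)

print(slope, intercept, r)  # 1.8, -0.6, r just under 1
```

Here r is very close to 1, which matches what the eye sees: L2 climbs in nearly equal steps as L1 increases.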

Toyota Spyder