(symbol: SEE) for a relationship between two variables (x and y) given by a regression equation, an index of how closely the predicted value of y for a specific value of x matches its actual value. If y′ is an estimated value from a regression line and y is the actual value, then the standard error of estimate is √[Σ(y − y′)²/n],
where n is the number of points. The smaller the standard error of estimate, the more confident one can be in the accuracy of the estimated (predicted) y value.
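
As a rough illustration of the formula above, the following Python sketch computes the standard error of estimate from paired actual and predicted y values; the function name and the sample data are hypothetical and not part of the entry.

```python
import math

def standard_error_of_estimate(y_actual, y_predicted):
    """SEE = sqrt( sum((y - y')^2) / n ), following the formula in this entry."""
    n = len(y_actual)
    squared_errors = sum((y - yp) ** 2 for y, yp in zip(y_actual, y_predicted))
    return math.sqrt(squared_errors / n)

# Example (illustrative data): predictions from a regression line y' = 2x + 1
x = [1, 2, 3, 4]
y_actual = [3.1, 4.9, 7.2, 8.8]
y_predicted = [2 * xi + 1 for xi in x]   # 3, 5, 7, 9
print(standard_error_of_estimate(y_actual, y_predicted))  # ~0.158
```

The smaller this value, the closer the actual points lie to the regression line, which is why a small SEE supports more confidence in a predicted y.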