Explained sum of squares

In statistics, the explained sum of squares (ESS) is the sum of the squares of the deviations of the predicted values from the mean of the response variable in a standard regression model (for example $y_i = a + b x_i + \varepsilon_i$), where $y_i$ is the response variable, $x_i$ is the explanatory variable, $a$ and $b$ are coefficients, $i$ indexes the observations from 1 to $n$, and $\varepsilon_i$ is the error term. In general, the greater the ESS, the better the estimated model performs.

If $\hat{a}$ and $\hat{b}$ are the estimated coefficients, then

$\hat{y}_i = \hat{a} + \hat{b} x_i$

is the predicted value. The ESS is the sum of the squares of the differences between the predicted values and the mean of the response variable:

$\sum_{i=1}^{n} \left( \hat{y}_i - \bar{y} \right)^2.$

In standard linear regression (ordinary least squares with an intercept term): total sum of squares = explained sum of squares + residual sum of squares.
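
A minimal numerical sketch (not part of the original article) of this decomposition, using NumPy to fit a simple regression by ordinary least squares and then check that TSS = ESS + RSS; the data values are made up for illustration:

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Design matrix with an intercept column, fitted by least squares
X = np.column_stack([np.ones_like(x), x])
coef = np.linalg.lstsq(X, y, rcond=None)[0]   # coef = [a_hat, b_hat]
y_hat = X @ coef                              # predicted values

tss = np.sum((y - y.mean()) ** 2)        # total sum of squares
ess = np.sum((y_hat - y.mean()) ** 2)    # explained sum of squares
rss = np.sum((y - y_hat) ** 2)           # residual sum of squares

print(tss, ess + rss)   # agree up to rounding because the fit includes an intercept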

Type I SS

Type I estimates of the sum of squares explained by each variable in a model are obtained when the sums of squares are calculated sequentially (e.g. with the model $Y = aX_1 + bX_2 + cX_3$): the sum of squares for $a$ is calculated using the model $Y = aX_1$, the sum of squares for $b$ is calculated using the model $Y = aX_1 + bX_2$, and the sum of squares for $c$ is calculated using the full model $Y = aX_1 + bX_2 + cX_3$.
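
A minimal sketch (not part of the original article) of Type I sums of squares in Python/NumPy: each predictor's sum of squares is the increase in explained sum of squares when that predictor is added to the predictors already in the model. The data, the ess() helper, and the inclusion of an intercept in every fit are assumptions made for illustration only.

import numpy as np

rng = np.random.default_rng(0)
n = 50
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
X3 = rng.normal(size=n)
y = 1.0 + 2.0 * X1 - 1.0 * X2 + 0.5 * X3 + rng.normal(scale=0.5, size=n)

def ess(y, *predictors):
    # Explained sum of squares for an OLS fit that includes an intercept
    X = np.column_stack([np.ones_like(y)] + list(predictors))
    y_hat = X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return np.sum((y_hat - y.mean()) ** 2)

ss_a = ess(y, X1)                            # SS for X1: model with X1 only
ss_b = ess(y, X1, X2) - ess(y, X1)           # SS for X2: X2 added to the X1 model
ss_c = ess(y, X1, X2, X3) - ess(y, X1, X2)   # SS for X3: X3 added to the X1 + X2 model
print(ss_a, ss_b, ss_c)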

Type III SS

The Type III sum of squares is calculated by comparing the full model to the full model without the variable of interest, so it is the additional variability explained by adding the variable of interest. It is the same as the Type I SS when the variable is the last variable in the model.
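
Continuing the Type I sketch above (same made-up data and hypothetical ess() helper), a Type III sum of squares can be sketched as the difference in explained sum of squares between the full model and the full model with the variable of interest dropped:

ss_type3_X2 = ess(y, X1, X2, X3) - ess(y, X1, X3)   # extra SS explained by X2
ss_type3_X3 = ess(y, X1, X2, X3) - ess(y, X1, X2)   # extra SS explained by X3
print(ss_type3_X2, ss_type3_X3)   # for X3, the last variable, this equals its Type I SS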

See also

*Sum of squares

