Residual sum of squares (2023)

What is Residual Sum of Squares?

Residual Sum of Squares (RSS) is a statistical measure of the discrepancy in a dataset that is not predicted by a regression model. It quantifies the variation between the observed data and the values predicted by the regression model, and so indicates how well the regression model fits the actual dataset.


Also referred to as the Sum of Squared Errors (SSE), RSS is obtained by adding the squares of the residuals. Residuals are the deviations of the actual data values from the values projected by the model, and they represent the errors in the regression model's estimation. A lower RSS indicates that the regression model fits the data well, with minimal unexplained variation. In finance, investors use RSS to track the changes in the prices of a stock to predict its future price movements.

Table of contents
  • What is Residual Sum of Squares?
    • Residual Sum of Squares Explained
    • Residual Sum of Squares in Finance
    • Formula
    • Calculation Example
    • Frequently Asked Questions (FAQs)
    • Recommended Articles

Key Takeaways

  • Residual Sum of Squares (RSS) is a statistical method used to measure the deviation in a dataset unexplained by the regression model.
  • Residual or error is the difference between the observation’s actual and predicted value.
  • If the RSS value is low, the data fits the estimation model well, indicating minimal unexplained variance. If it is zero, the model fits the data perfectly, with no unexplained variance at all.
  • It helps stock market players assess future stock price movements by monitoring the fluctuation in stock prices.

Residual Sum of Squares Explained

RSS is one of three types of the Sum of Squares (SS), the other two being the Total Sum of Squares (TSS) and the Sum of Squares due to Regression (SSR), also called the Explained Sum of Squares (ESS). The sum of squares is a statistical measure of data dispersion, used in regression analysis to determine how well the data fit the model.

While the TSS measures the variation in values of an observed variable with respect to its sample mean, the SSR or ESS calculates the deviation between the estimated value and the mean value of the observed variable. If the TSS equals SSR, it means the regression model is a perfect fit for the data as it reflects all the variability in the actual data.
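The relationship between these three quantities can be checked numerically. Below is a minimal sketch using NumPy (assumed available) with made-up example data; for an ordinary least-squares fit with an intercept, TSS equals ESS plus RSS:

```python
import numpy as np

# Hypothetical example data; any simple least-squares line illustrates the identity.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 6.0, 8.0])

# Fit a least-squares line y = a + b*x (np.polyfit returns highest degree first).
b, a = np.polyfit(x, y, 1)
y_hat = a + b * x

tss = np.sum((y - y.mean()) ** 2)      # total variation around the mean
ess = np.sum((y_hat - y.mean()) ** 2)  # variation explained by the model
rss = np.sum((y - y_hat) ** 2)         # unexplained (residual) variation

print(tss)        # 32.75
print(ess + rss)  # 32.75 -> TSS = ESS + RSS for a least-squares fit
```

If TSS equals SSR (i.e., RSS is zero), every observation lies exactly on the fitted line, which is the perfect-fit case described above.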


On the other hand, RSS measures the extent of variability in the observed data not captured by a regression model. To calculate RSS, first find the model's errors, or residuals, by subtracting the estimated values from the actual observed values. Then square each residual and add them all up to arrive at the RSS.

The lower the error in the model, the better the regression prediction. In other words, a lower RSS signifies that the regression model explains the data better, indicating the least variance. It means the model fits the data well. Likewise, if the value comes to zero, it’s considered the best fit with no variance.

Note that the RSS is not the same as R-squared. While RSS gives the absolute amount of unexplained variation, R-squared expresses the explained variation as a proportion of the total variation (R² = 1 − RSS/TSS).
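To make the distinction concrete, here is a short sketch (NumPy assumed available, with illustrative observed values and predictions from a hypothetical fitted model) computing both quantities:

```python
import numpy as np

# Illustrative values only: observed data and predictions from some fitted model.
y = np.array([1.0, 2.0, 6.0, 8.0])
y_hat = np.array([1.0, 3.0, 5.0, 7.0])  # e.g. predictions from y = 1 + 2x

rss = np.sum((y - y_hat) ** 2)   # absolute amount of unexplained variation
tss = np.sum((y - y.mean()) ** 2)  # total variation around the mean
r_squared = 1 - rss / tss        # unitless proportion of variation explained

print(rss)        # 3.0  (in squared units of y)
print(r_squared)  # ~0.908
```

The RSS of 3.0 depends on the scale of y, while R-squared is scale-free, which is why the two are not interchangeable.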


Residual Sum of Squares in Finance

The discrepancy detected in the dataset through RSS indicates whether the data fit the regression model or not. Thus, it helps stock market players understand the fluctuations occurring in asset prices, letting them assess future price movements.

Regression functions are formed to predict the movement of stock prices. But the usefulness of these regression models depends on how well they explain the variance in stock prices. If large errors or residuals remain unexplained by the regression, the model may not be useful in predicting future stock movements.

As a result, investors and money managers can use RSS to make better-informed decisions. In addition, RSS lets policymakers analyze the various variables affecting the economic stability of a nation and frame economic models accordingly.

Formula

Here is the formula to calculate the residual sum of squares:

RSS = Σᵢ (yᵢ − f(xᵢ))²

where yᵢ is the i-th observed value of the dependent variable and f(xᵢ) is the value the regression model predicts for the i-th observation. For a simple linear model f(x) = α + βx, this becomes:

RSS = Σᵢ [yᵢ − (α + βxᵢ)]²

Calculation Example

Let’s consider the following residual sum of squares example based on the set of data below:


X values: 0, 1, 2, 3
Y values: 1, 2, 6, 8
Regression equation: Y = 1 + 2X

The absolute amount of unexplained variation can be found by applying the RSS formula:

RSS = {1 − [1 + (2 × 0)]}² + {2 − [1 + (2 × 1)]}² + {6 − [1 + (2 × 2)]}² + {8 − [1 + (2 × 3)]}²

= 0 + 1 + 1 + 1 = 3
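The same arithmetic can be verified in a few lines of plain Python, using the example's data and regression equation:

```python
# Data and model from the worked example: y = 1 + 2x.
xs = [0, 1, 2, 3]
ys = [1, 2, 6, 8]

def predict(x):
    """Regression model from the example."""
    return 1 + 2 * x

# Residual = observed value minus predicted value.
residuals = [y - predict(x) for x, y in zip(xs, ys)]
rss = sum(r ** 2 for r in residuals)

print(residuals)  # [0, -1, 1, 1]
print(rss)        # 3
```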


Frequently Asked Questions (FAQs)

What is Residual Sum of Squares (RSS)?


RSS is a statistical method used to detect the level of discrepancy in a dataset not revealed by regression. If the residual sum of squares results in a lower figure, it signifies that the regression model explains the data better than when the result is higher. In fact, if its value is zero, it’s regarded as the best fit with no error at all.

What is the difference between ESS and RSS?

ESS stands for Explained Sum of Squares, which measures the variation in the data explained by the regression model. On the other hand, the Residual Sum of Squares (RSS) measures the variation in the dataset that the estimation model fails to explain.

How do TSS and RSS differ?


The Total Sum of Squares (TSS) measures the variation of the observed values around their mean. In contrast, the Residual Sum of Squares (RSS) measures the discrepancies between the observed data and the values predicted by the model.

Recommended Articles

This has been a guide to what the Residual Sum of Squares is. Here we explain how to calculate the residual sum of squares in regression with its formula and an example. You can learn more about it from the following articles –

  • Least Squares Regression
  • Gradient Boosting
  • Regression Line

Article information

Author: Kimberely Baumbach CPA

Last Updated: 04/19/2023
