Mean Square Error (MSE) and Mean Absolute Error (MAE) — Regression
Regression is a technique for establishing a relationship between independent and dependent variables; here we assume that relationship is linear. It is a very useful technique for predicting future values of the dependent variable. A simple example relating a dependent variable (y) to an independent variable (x) is:

y = w0 + w1 * x
The values of the weights (w0, w1) determine the value of y for a given x; for different weights, the value of y may be different for the same x. Let's understand this with an example, representing the predicted value of y as ybar.
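A minimal sketch of this linear model in Python, with hypothetical weight values w0 and w1 chosen purely for illustration:

```python
# Hypothetical weights; in practice these are learned from data.
w0, w1 = 1.0, 2.0

def predict(x):
    # ybar = w0 + w1 * x
    return w0 + w1 * x

print(predict(3.0))  # → 7.0; different weights would give a different ybar for the same x
```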
The next question that comes to mind is how to decide the values of the weights. Multiple methods are available, but the most common is an optimization algorithm called Gradient Descent.
Gradient Descent tracks the error between y and ybar and updates the weights until the error reaches an acceptable level or a sufficient number of iterations is complete. Two widely used techniques for calculating this error are mean square error (MSE) and mean absolute error (MAE). Let's look into both techniques.
1. Mean Square Error (MSE) — It calculates the mean of the squared differences between y and ybar. The equation is defined below:

MSE = (1/n) * Σ (yᵢ − ybarᵢ)²
Let’s calculate MSE for an example
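The calculation can be sketched as follows, on a small hypothetical dataset of actual values y and predictions ybar (values chosen only for illustration):

```python
# Hypothetical actual values and predictions.
y    = [3.0, 5.0, 2.5, 7.0]
ybar = [2.5, 5.0, 4.0, 8.0]

n = len(y)
# Mean of the squared differences
mse = sum((yi - ybi) ** 2 for yi, ybi in zip(y, ybar)) / n
print(mse)  # → 0.875
```

Note how the largest difference (1.5) contributes 2.25 to the sum, dominating the smaller differences through squaring.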
Observations:
a. The larger the difference between y and ybar, the larger its contribution to MSE (it grows quadratically)
b. Higher error contributions mean more penalization during optimization
c. Outliers have more impact
2. Mean Absolute Error (MAE) — It calculates the mean of the absolute differences between y and ybar. The equation is defined below:

MAE = (1/n) * Σ |yᵢ − ybarᵢ|
Let’s calculate MAE for an example
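A sketch of the MAE calculation on a small hypothetical dataset (same illustrative style as above):

```python
# Hypothetical actual values and predictions.
y    = [3.0, 5.0, 2.5, 7.0]
ybar = [2.5, 5.0, 4.0, 8.0]

# Mean of the absolute differences
mae = sum(abs(yi - ybi) for yi, ybi in zip(y, ybar)) / len(y)
print(mae)  # → 0.75
```

Here each point contributes its difference directly, so no single point dominates the total.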
Observations:
a. Each difference contributes to MAE in proportion to its size (linearly)
b. Error contributions are more uniform, and hence penalization during optimization is more uniform
c. Outliers have less effect
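To see the outlier behavior concretely, a sketch comparing both metrics on the same hypothetical data with one outlier prediction added (error of 10):

```python
# Hypothetical data: the last prediction is an outlier.
y    = [3.0, 5.0, 2.5, 7.0, 4.0]
ybar = [2.5, 5.0, 4.0, 8.0, 14.0]

n = len(y)
mse = sum((a - b) ** 2 for a, b in zip(y, ybar)) / n
mae = sum(abs(a - b) for a, b in zip(y, ybar)) / n
print(mse)  # → 20.7 (0.875 on the same data without the outlier)
print(mae)  # → 2.6  (0.75 on the same data without the outlier)
```

The squared term makes MSE explode on the outlier, while MAE grows only linearly, which is why MAE is the more robust choice when outliers are expected.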
Next: Gradient Descent — Regression