Regression Metrics

Model evaluation is an important part of any data science project: it tells you how well your model performs and what kind of results you can expect from it. There are many evaluation metrics out there, but only a handful of them apply to regression models.

What is Regression?

In regression, the model learns to predict numeric values. Example: predicting the price of a stock on future days given its past price history and other information about the company and the market.

What are the Metrics?

Metrics are what we choose to evaluate our machine learning algorithms.

They tell us how well the model's predictions match the actual results.

Metrics are used for both classification and regression tasks.

The most common metrics for evaluating regression predictions are:

1. Mean Absolute Error (MAE)

2. Mean Squared Error (MSE)

3. R² Metric

4. Root Mean Squared Error (RMSE)

5. Adjusted R²

1. Mean Absolute Error

It is the average of the absolute differences between the predicted and actual values.

It gives you an idea of how large the errors are, but not whether the model over- or under-predicts.

Mean Absolute Error is also more robust to outliers than squared-error metrics.

Formula for Mean Absolute Error:

MAE = (1/n) × Σ |yᵢ − ŷᵢ|

where yᵢ is the actual value, ŷᵢ is the predicted value, and n is the number of samples.

With Example:
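Below is a minimal sketch of this workflow in scikit-learn. The diabetes dataset bundled with scikit-learn is assumed as a stand-in, since the article does not show its own data source, so your numbers will differ.

    from sklearn.datasets import load_diabetes
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import KFold, cross_val_score

    # Load a sample dataset and split it into features (X) and target (y).
    X, y = load_diabetes(return_X_y=True)

    # 10-fold cross-validation with shuffling.
    kfold = KFold(n_splits=10, shuffle=True, random_state=7)
    model = LinearRegression()

    # scikit-learn reports the *negated* MAE, because its scorers are maximized,
    # so values closer to 0 mean better predictions.
    scores = cross_val_score(model, X, y, cv=kfold, scoring="neg_mean_absolute_error")
    print("MAE: %.3f (%.3f)" % (scores.mean(), scores.std()))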

In the code above we load the data, split it into X and y, apply k-fold cross-validation to shuffle the data, feed it to the linear regression model, compute the cross-validation score, and then take the mean and standard deviation of those scores.

 A value of 0 indicates no error or perfect predictions.

2. Mean Squared Error

Mean Squared Error is similar to Mean Absolute Error, except that the differences are squared before they are averaged. Taking the square root of the mean squared error (the RMSE) converts the unit back to the original unit of the output variable.

It is used more than other metrics because it is differentiable and hence can be optimized better.

Because the errors are squared, MSE penalizes large errors heavily, which makes it (and RMSE) useful when large errors are particularly undesirable.

Formula for Mean Squared Error:

MSE = (1/n) × Σ (yᵢ − ŷᵢ)²

With Example:
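A minimal sketch of the same cross-validation workflow with the "neg_mean_squared_error" scorer, again assuming the bundled diabetes dataset; the figure quoted below comes from the article's original run on its own data.

    from sklearn.datasets import load_diabetes
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import KFold, cross_val_score

    X, y = load_diabetes(return_X_y=True)

    kfold = KFold(n_splits=10, shuffle=True, random_state=7)
    model = LinearRegression()

    # "neg_mean_squared_error" returns the negated MSE (scorers are maximized).
    scores = cross_val_score(model, X, y, cv=kfold, scoring="neg_mean_squared_error")
    print("MSE: %.3f (%.3f)" % (scores.mean(), scores.std()))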

MSE: -34.705 (45.574)

As before, we load the data, split it into X and y, apply k-fold cross-validation to shuffle the data, feed it to the linear regression model, compute the cross-validation score, and take its mean and standard deviation. The magnitudes are larger here than for MAE because the errors are squared.

3. R² Metric

It is a metric that indicates how well the predictions fit the actual values. It is also known as the coefficient of determination, and it is a standard way to evaluate the performance of a regression model.

Its value typically ranges between 0 (no fit) and 1 (perfect fit).

Formula for R²:

R² = 1 − (SS_res / SS_tot)

where SS_res = Σ (yᵢ − ŷᵢ)² is the residual sum of squares and SS_tot = Σ (yᵢ − ȳ)² is the total sum of squares.

With Example:
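A minimal sketch of the same workflow with the "r2" scorer, again assuming the bundled diabetes dataset as a stand-in for the article's data.

    from sklearn.datasets import load_diabetes
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import KFold, cross_val_score

    X, y = load_diabetes(return_X_y=True)

    kfold = KFold(n_splits=10, shuffle=True, random_state=7)
    model = LinearRegression()

    # "r2" gives the coefficient of determination for each fold.
    scores = cross_val_score(model, X, y, cv=kfold, scoring="r2")
    print("R^2: %.3f (%.3f)" % (scores.mean(), scores.std()))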

Again we load the data, split it into X and y, apply k-fold cross-validation to shuffle the data, feed it to the linear regression model, compute the cross-validation score, and take its mean and standard deviation.

You can see that the predictions fit the actual values poorly, since the score is close to zero and well below 0.5.

4. Root Mean Squared Error

RMSE is used for regression tasks and is the square root of the averaged squared difference between the target values and the values predicted by the model.

It is a good measure of how accurately the model predicts the target, but because it is scale-dependent it should only be used to compare predictions on the same dataset and in the same units.

Formula for Root Mean Squared Error:

RMSE = √MSE = √((1/n) × Σ (yᵢ − ŷᵢ)²)
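A small hold-out sketch of computing RMSE as the square root of MSE, again assuming the bundled diabetes dataset (the split and parameters here are illustrative, not from the article).

    from math import sqrt

    from sklearn.datasets import load_diabetes
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import mean_squared_error
    from sklearn.model_selection import train_test_split

    X, y = load_diabetes(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=7)

    model = LinearRegression().fit(X_train, y_train)
    y_pred = model.predict(X_test)

    # RMSE is the square root of MSE, so it is back in the units of the target.
    rmse = sqrt(mean_squared_error(y_test, y_pred))
    print("RMSE: %.3f" % rmse)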

5. Adjusted R²

It takes additional independent variables into account: the adjusted value penalizes R² when the added independent variables do not improve the model.

Adjusted R² is always smaller than (or equal to) R².

Formula for Adjusted R²:

Adjusted R² = 1 − (1 − R²) × (n − 1) / (n − k − 1)

where n is the number of samples and k is the number of independent variables.
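A small sketch applying this formula to a hold-out split, again assuming the bundled diabetes dataset; the variables n and k below are illustrative names, not from the article.

    from sklearn.datasets import load_diabetes
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import r2_score
    from sklearn.model_selection import train_test_split

    X, y = load_diabetes(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=7)

    model = LinearRegression().fit(X_train, y_train)
    r2 = r2_score(y_test, model.predict(X_test))

    # Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - k - 1),
    # where n = number of samples and k = number of independent variables.
    n, k = X_test.shape
    adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)
    print("R^2: %.3f  Adjusted R^2: %.3f" % (r2, adj_r2))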

Conclusion | Regression Metrics:

With regression metrics we understand:

  • how these methods measure an algorithm’s performance and behaviour;
  • how they help us tune the model so that it offers the best possible performance.

Article by: Chandra Prakash Mewari

