Feature Selection

Feature selection is a core technique in machine learning that directly impacts model performance. Irrelevant features can hurt your model, so with feature selection techniques we can pick the best and most relevant features for our models.

Benefits of performing feature selection:-

1. Reduces overfitting.

2. Improves accuracy.

3. Reduces training time.

Feature selection techniques:-

  1. Filter methods 
  2. Wrapper methods 
  3. Embedded methods 
  4. Hybrid methods 

1. Filter methods

These methods select features based on intrinsic properties of the data, measured with univariate statistical tests. They are computationally cheaper than wrapper methods.

Information Gain:- In this method, we calculate the reduction in entropy that each feature provides about the target. Information gain is also the splitting criterion used by decision trees and random forests.

Chi-square Test:- The chi-square technique is used on datasets with categorical features. We compute the chi-square statistic between each feature and the target and select the features with the best chi-square scores. To apply the chi-square test, the data must be categorical and the observations must be sampled independently.
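As a rough illustration, here is a minimal scikit-learn sketch of both tests. The Iris dataset, the choice of k=2, and the use of mutual information as a stand-in for information gain are all assumptions made for the example:

```python
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, chi2, mutual_info_classif

X, y = load_iris(return_X_y=True)  # illustrative dataset

# Information gain: mutual information estimates the reduction in
# entropy about the target y that each feature provides.
mi = SelectKBest(score_func=mutual_info_classif, k=2).fit(X, y)
print("mutual information scores:", mi.scores_)

# Chi-square: requires non-negative feature values (counts/categories);
# the Iris measurements are non-negative, so the call is valid here.
ch = SelectKBest(score_func=chi2, k=2).fit(X, y)
print("chi-square scores:", ch.scores_)
print("selected columns:", ch.get_support(indices=True))
```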


Correlation Coefficient

Correlation measures the linear relationship between two variables. Ideally, every feature should be highly correlated with the target, but no feature should be strongly correlated with another feature. If two features are correlated with each other, they carry largely the same information, so we can eliminate one of them, as in the sketch below.
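A minimal pandas sketch of this idea, assuming a DataFrame `df` of numeric features; the `drop_correlated` helper and the 0.9 cutoff are illustrative choices, not a standard API:

```python
import numpy as np
import pandas as pd

def drop_correlated(df: pd.DataFrame, threshold: float = 0.9) -> pd.DataFrame:
    """Drop one feature from every pair whose |correlation| exceeds threshold."""
    corr = df.corr().abs()
    # Keep only the upper triangle so each feature pair is checked once.
    upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
    to_drop = [col for col in upper.columns if (upper[col] > threshold).any()]
    return df.drop(columns=to_drop)
```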

Variance Threshold:- In this technique, we calculate the variance of each feature; features whose variance does not meet a given threshold are removed.
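A minimal sketch using scikit-learn's VarianceThreshold; the toy data and the 0.1 threshold are assumptions for illustration:

```python
from sklearn.feature_selection import VarianceThreshold

# Toy data: the first column is constant, so its variance is 0.
X = [[0, 1.0, 0],
     [0, 2.0, 1],
     [0, 1.5, 0],
     [0, 0.5, 1]]

selector = VarianceThreshold(threshold=0.1)
X_reduced = selector.fit_transform(X)
print(selector.get_support())  # [False  True  True] -> constant column dropped
```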

2. Wrapper Methods:

This is a greedy approach to feature selection: the technique searches through combinations of features by training and evaluating a model on each candidate subset. Wrapper methods usually give better accuracy than filter methods, at a higher computational cost.

The wrapper method includes the following techniques for feature selection.

  1. Forward feature selection:- This is an iterative approach to selecting features. We begin by selecting the single feature that best predicts the target, then add the feature that works best together with the first, and keep adding features until a preset stopping condition is reached.
  2. Backward feature elimination:- This process is exactly the opposite of forward feature selection. We start with all features and iteratively remove the feature whose removal most improves (or least harms) model performance, again until a stopping condition is reached. A sketch of both directions follows this list.
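Here is a minimal sketch of both directions using scikit-learn's SequentialFeatureSelector; the logistic-regression estimator, the Iris data, and n_features_to_select=2 are assumptions made for the example:

```python
from sklearn.datasets import load_iris
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
estimator = LogisticRegression(max_iter=1000)

# Forward: start empty, greedily add the feature that improves the
# cross-validated score the most.
forward = SequentialFeatureSelector(
    estimator, n_features_to_select=2, direction="forward").fit(X, y)
print("forward keeps:", forward.get_support())

# Backward: start with all features, greedily drop the least useful one.
backward = SequentialFeatureSelector(
    estimator, n_features_to_select=2, direction="backward").fit(X, y)
print("backward keeps:", backward.get_support())
```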

3. Embedded Methods:

Embedded methods combine the benefits of the filter and wrapper methods. The embedded method is an iterative approach: feature selection happens during model training, and the method keeps the features that contribute most to that training process.

LASSO Regularization (L1)

L1 regularization consists of adding a penalty on the model's parameters to reduce the freedom of the model, i.e. to avoid overfitting. In linear model regularization, the penalty is applied to the coefficients that multiply each of the predictors. Among the different types of regularization, Lasso (L1) has the property of being able to shrink some of the coefficients exactly to zero. Those features are therefore removed from the model.
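A minimal sketch of L1-based selection with scikit-learn, assuming the diabetes dataset and alpha=0.5 purely for illustration:

```python
from sklearn.datasets import load_diabetes
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import Lasso

X, y = load_diabetes(return_X_y=True)

# The L1 penalty shrinks some coefficients to exactly zero.
lasso = Lasso(alpha=0.5).fit(X, y)
print("coefficients:", lasso.coef_)  # several entries are exactly 0

# SelectFromModel drops the features whose coefficients were zeroed out.
selector = SelectFromModel(lasso, prefit=True)
X_selected = selector.transform(X)
print("kept", X_selected.shape[1], "of", X.shape[1], "features")
```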

Written By: Amit Gupta

Reviewed By: Savya Sachi

