Logistic Regression in Machine Learning

Logistic Regression

Logistic Regression is a classification algorithm used wherever the response variable is categorical. The idea of logistic regression is to find a relationship between the features and the probability of a particular outcome.

E.g. suppose we have to predict whether a student passes or fails an exam when the number of hours spent studying is given as a feature. Here, the response variable has two values: pass and fail.

If the probability is more than 50%, the model assigns the data point to that particular class; if the probability is less than 50%, the data point is assigned to the other class. Therefore, we can say that logistic regression acts as a binary classifier.
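The 50% decision rule above can be sketched in a few lines of Python (the `classify` helper and its labels are illustrative, not part of any library):

```python
# A minimal sketch of the 0.5 decision threshold: a predicted probability
# above 0.5 maps to the positive class ("pass"), the rest to "fail".
def classify(probability, threshold=0.5):
    """Assign a class label from a predicted probability."""
    return "pass" if probability > threshold else "fail"

print(classify(0.73))  # pass
print(classify(0.41))  # fail
```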

Working of a Logistic Model

For linear regression, the model is defined by:  y = ax + b – (i)

and for logistic regression, we calculate a probability, i.e. y is the probability of a given variable x belonging to a certain class. Thus, it is obvious that the value of y should lie between 0 and 1.

But if we use equation (i) to calculate the probability, we may get values less than 0 as well as greater than 1, which makes no sense for a probability. So we need an equation that always gives values between 0 and 1, as required when calculating a probability.

So here we use the sigmoid function.

Sigmoid Function:

The sigmoid function is a function with a characteristic "S"-shaped curve that transforms values into the range between 0 and 1. The sigmoid function is also referred to as the sigmoid curve or the logistic function. It is one of the most widely used non-linear activation functions.

We use the sigmoid function as the underlying function in logistic regression. Mathematically, it is defined as:

sigmoid(z) = 1 / (1 + e^(-z))
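The sigmoid function is easy to implement directly; a quick sketch with only the standard library:

```python
import math

def sigmoid(z):
    """Logistic (sigmoid) function: squashes any real z into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

print(sigmoid(0))    # 0.5 exactly
print(sigmoid(4))    # close to 1
print(sigmoid(-4))   # close to 0
```

Note how large positive inputs approach 1 and large negative inputs approach 0, which is exactly the squashing behaviour logistic regression needs.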

Why do we use the Sigmoid Function?

  1. The sigmoid function is bounded between 0 and 1, which makes it useful for calculating probabilities in logistic regression.
  2. Its derivative is easier to compute than those of many other functions, which is helpful during gradient descent.
  3. It is a simple way of introducing non-linearity into the model.
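On point 2: the sigmoid's derivative has the closed form sigmoid'(z) = sigmoid(z) * (1 - sigmoid(z)), which can be verified numerically with a finite difference (the check below is an illustration, with an arbitrarily chosen point z = 1.2):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_derivative(z):
    """Closed-form derivative: sigmoid'(z) = sigmoid(z) * (1 - sigmoid(z))."""
    s = sigmoid(z)
    return s * (1.0 - s)

# Compare against a central finite difference at an arbitrary point.
z, h = 1.2, 1e-6
numeric = (sigmoid(z + h) - sigmoid(z - h)) / (2 * h)
print(abs(sigmoid_derivative(z) - numeric) < 1e-8)  # True
```

This cheap derivative is what makes gradient descent updates for logistic regression inexpensive to compute.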

Logit Function

Logistic regression can be expressed as:

log( p(x) / (1 - p(x)) ) = ax + b

where the left-hand side is called the logit or log-odds function, and p(x) / (1 - p(x)) is called the odds.

The odds signify the ratio of the probability of success to the probability of failure. Therefore, in logistic regression, a linear combination of the inputs is mapped to the log(odds) of the output being equal to 1.

The cost function for the entire training set is given as:

J = -(1/m) * Σ [ y * log(p) + (1 - y) * log(1 - p) ]

where m is the number of training examples, y is the true label (0 or 1), and p is the predicted probability.
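This average log-loss (cross-entropy) cost can be sketched with NumPy; the small clipping constant is an implementation detail added here to avoid log(0):

```python
import numpy as np

def cross_entropy_cost(y_true, y_prob):
    """Average log-loss: -(1/m) * sum(y*log(p) + (1-y)*log(1-p))."""
    y_true = np.asarray(y_true, dtype=float)
    # Clip probabilities away from 0 and 1 so log() stays finite.
    y_prob = np.clip(np.asarray(y_prob, dtype=float), 1e-12, 1 - 1e-12)
    return -np.mean(y_true * np.log(y_prob) + (1 - y_true) * np.log(1 - y_prob))

# Confident, correct predictions give a small cost...
print(cross_entropy_cost([1, 0, 1], [0.9, 0.1, 0.8]))
# ...while confident wrong predictions give a much larger cost.
print(cross_entropy_cost([1, 0, 1], [0.1, 0.9, 0.2]))
```

The cost is small when predicted probabilities agree with the labels and grows rapidly when the model is confidently wrong, which is what gradient descent minimises.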

Important Assumptions 

There are four major assumptions to consider before using logistic regression for modelling. These are:

Binary response: The dependent/response/target variable must be binary or dichotomous: a data point must fit into exactly one of the two classes. For example, predicting whether a person has a tumour: Yes (1), No (0).

Lack of multicollinearity: Independent/predictor variables should have little to no collinearity among them, meaning they should be independent of each other. The Variance Inflation Factor (VIF) is one of the simplest tests that can be used to check for multicollinearity. If the VIF score for a feature is above 5, it is better to remove one of the correlated independent variables to reduce redundancy.
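The VIF check can be sketched from first principles with NumPy (VIF for column j is 1 / (1 - R²), where R² comes from regressing column j on the remaining columns; the toy data below are made up to show the effect):

```python
import numpy as np

def vif(X):
    """Variance Inflation Factor for each column of X."""
    X = np.asarray(X, dtype=float)
    scores = []
    for j in range(X.shape[1]):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        # Auxiliary regression of column j on the other columns (with intercept).
        A = np.column_stack([np.ones(len(y)), others])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ coef
        r2 = 1 - resid.var() / y.var()
        scores.append(1.0 / (1.0 - r2))
    return scores

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = rng.normal(size=200)              # independent of x1 -> VIF near 1
x3 = x1 + 0.1 * rng.normal(size=200)   # nearly collinear with x1 -> large VIF
print(vif(np.column_stack([x1, x2, x3])))
```

The independent column comes back with a VIF near 1, while the two nearly collinear columns score far above the rule-of-thumb threshold of 5, flagging one of them for removal.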

Large sample size: As with any statistical model, historical data is the key to a robust model. The larger the sample size, the better and more reliable the results of the logistic regression analysis.

Log-odds relationship: Independent variables must be linearly related to the log-odds.

Linear vs Logistic Regression:

In linear regression, the outcome is always a continuous numerical prediction. For example, given what you eat daily, your exercise habits and your past weight changes, we can use linear regression to predict your potential weight changes in the future. If the resulting number is negative, it is negatively correlated, which implies that you are likely to lose weight in the future. If the number is positive, then it is positively correlated and the opposite holds true.

The second type of regression is logistic regression. In machine learning, we use logistic regression to help classify objects/items. If we want to predict whether an orange belongs with fruits or vegetables (I know, silly example), then we would use logistic regression. It compares how similar an object is to the data at hand and decides which class to place it into.
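Tying the pieces together, the earlier pass/fail example can be fitted end to end with plain NumPy gradient descent on the log-loss. The data, learning rate and iteration count below are illustrative choices, not from any real dataset:

```python
import numpy as np

# Toy pass/fail data: hours studied vs outcome (1 = pass, 0 = fail).
hours = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0])
passed = np.array([0,   0,   0,   0,   1,   0,   1,   1,   1,   1])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Fit a and b in p = sigmoid(a * hours + b) by gradient descent on log-loss.
a, b = 0.0, 0.0
learning_rate = 0.1
for _ in range(5000):
    p = sigmoid(a * hours + b)
    # Gradients of the average cross-entropy cost w.r.t. a and b.
    grad_a = np.mean((p - passed) * hours)
    grad_b = np.mean(p - passed)
    a -= learning_rate * grad_a
    b -= learning_rate * grad_b

print(sigmoid(a * 1.0 + b) < 0.5)  # 1 hour of study -> predicted fail
print(sigmoid(a * 5.0 + b) > 0.5)  # 5 hours of study -> predicted pass
```

After training, applying the 50% threshold to the fitted probabilities classifies short study times as "fail" and long ones as "pass", which is the binary-classifier behaviour described above.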

written by: Mohd Zaid

reviewed by: Krishna Heroor

If you are Interested In Machine Learning You Can Check Machine Learning Internship Program
Also Check Other Technical And Non Technical Internship Programs

