What are the different types of regression, and what core concepts should you understand about each?

Have you ever wondered how analysts at investment banks estimate which shares are going to rise, by how much, and by when?

Or how companies determine the prices that make their product sales and profit optimal?

Then you are in the right place. It is not magic; it is just regression analysis.

Regression analysis is a set of statistical tools and processes that allow an analyst to estimate a mathematical relationship between causal factors and an end result.

- Generally, this technique is used for forecasting and for studying cause-and-effect relationships.
- The process produces a mathematical formula that combines the causal variables, each with its own proportionality constant (coefficient), into an expression for the end result. That formula is then used to forecast different scenarios.
- It shows whether there is a significant relationship between the dependent (result) variable and the independent (causal) variables.

Now that we have understood what regression analysis is, let us spend some time going through the different types of regression that are available.

There are seven major kinds of regression, though many hybrids have been developed to match users' requirements. The seven kinds are as follows:

1. Linear Regression
2. Logistic Regression
3. Polynomial Regression
4. Stepwise Regression
5. Ridge Regression
6. Lasso Regression
7. ElasticNet Regression

**1. Linear Regression**

This is the most common and the easiest of the regression techniques.

Linear regression develops a relationship between an independent (causal) variable and the dependent (result) variable by fitting a straight line to the data. This best-fit straight line is also known as the regression line.

- It is often represented as Y = a + b*X + e,
- where Y is the dependent variable,
- a is the intercept (the value of Y at X = 0),
- b is the proportionality constant (slope) of X,
- X is the independent variable, and
- e is the error term capturing lurking/unknown factors.
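As a minimal sketch, the line above can be fitted by ordinary least squares. The data here are hypothetical, generated from assumed values a = 2 and b = 3 plus a small error term:

```python
import numpy as np

# Hypothetical data following Y = a + b*X + e with a = 2, b = 3
rng = np.random.default_rng(0)
X = np.linspace(0, 10, 50)
e = rng.normal(0, 0.1, size=X.size)          # small unknown-factor noise
Y = 2 + 3 * X + e

# Fit a degree-1 polynomial: polyfit returns [slope b, intercept a]
b, a = np.polyfit(X, Y, 1)
```

The recovered `a` and `b` land very close to the values used to generate the data, which is exactly what the least-squares fit is meant to do.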

**2. Logistic Regression**

It is used to predict the probability of an event whose outcome is binary, that is, either yes or no.

Logistic regression is a more complex process: it requires a larger sample size, and it works best when the independent variables are not strongly correlated with each other.
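A minimal from-scratch sketch of logistic regression fitted by gradient descent, on hypothetical binary-outcome data (in practice a library implementation would normally be used; the learning rate and step count are illustrative assumptions):

```python
import numpy as np

def sigmoid(z):
    # Maps any real number to a probability in (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, steps=5000):
    # Gradient descent on the logistic (cross-entropy) loss
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = sigmoid(X @ w + b)              # predicted probabilities
        w -= lr * (X.T @ (p - y)) / len(y)  # gradient w.r.t. weights
        b -= lr * np.mean(p - y)            # gradient w.r.t. intercept
    return w, b

# Hypothetical data: outcome is "yes" (1) whenever x exceeds 5
X = np.linspace(0, 10, 100).reshape(-1, 1)
y = (X.ravel() > 5).astype(float)

w, b = fit_logistic(X, y)
preds = (sigmoid(X @ w + b) > 0.5).astype(float)
accuracy = np.mean(preds == y)
```

On this cleanly separable toy data, the fitted boundary sits near x = 5 and the classifier recovers almost all labels.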

**3. Polynomial Regression**

A regression in which the dependent variable is modeled as a polynomial function of the independent variables is called a polynomial regression.
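A small sketch of a polynomial fit, using hypothetical quadratic data and NumPy's `polyfit`:

```python
import numpy as np

# Hypothetical quadratic relationship: Y = 1 + 2X + 0.5X^2
X = np.linspace(-3, 3, 40)
Y = 1 + 2 * X + 0.5 * X**2

# Fit a degree-2 polynomial; coefficients come back highest degree first
c2, c1, c0 = np.polyfit(X, Y, 2)
```

Because the data are noise-free here, the fit recovers the three coefficients essentially exactly.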

**4. Stepwise Regression**

This regression is used when the dependent variable is affected by many candidate factors: independent variables are added to or removed from the model one step at a time, based on how much each one improves the fit, until only the useful predictors remain.
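One common variant, forward selection, can be sketched as follows; the data, stopping threshold, and helper names here are illustrative assumptions, not a standard API:

```python
import numpy as np

def r_squared(X, y):
    # R^2 of an ordinary least-squares fit of y on the columns of X
    A = np.column_stack([np.ones(len(y)), X])      # add intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

def forward_select(X, y, threshold=0.01):
    # Greedily add the predictor that most improves R^2; stop when the
    # best remaining predictor improves it by less than `threshold`.
    selected, remaining, best = [], list(range(X.shape[1])), 0.0
    while remaining:
        scores = {j: r_squared(X[:, selected + [j]], y) for j in remaining}
        j = max(scores, key=scores.get)
        if scores[j] - best < threshold:
            break
        selected.append(j)
        remaining.remove(j)
        best = scores[j]
    return selected

# Hypothetical data: only columns 0 and 2 actually drive y
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
y = 3 * X[:, 0] - 2 * X[:, 2] + rng.normal(0, 0.1, 200)

chosen = forward_select(X, y)
```

The procedure keeps only the two informative columns and stops before admitting the pure-noise ones.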

**5. Ridge Regression**

Ridge regression is used when a dependent variable is driven by multiple highly correlated independent factors. It adds a penalty that shrinks the coefficients, damping the unstable influence the correlated factors would otherwise have on each other's estimates.
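A minimal sketch of the ridge solution in closed form, on hypothetical data with two nearly identical predictors (the penalty strength `alpha` is an assumed value):

```python
import numpy as np

# Hypothetical data: two almost perfectly correlated predictors
rng = np.random.default_rng(2)
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(0, 0.01, 100)   # nearly a copy of x1
X = np.column_stack([x1, x2])
y = x1 + x2                          # true coefficients are 1 and 1

# Ridge closed form: w = (X^T X + alpha * I)^-1 X^T y
alpha = 1.0
w_ridge = np.linalg.solve(X.T @ X + alpha * np.eye(2), X.T @ y)
```

Plain least squares is numerically unstable here because the two columns are nearly identical; the `alpha * I` term keeps the system well conditioned and splits the weight roughly evenly across the correlated pair.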

**6. Lasso Regression**

Lasso regression is very similar to ridge regression, but unlike ridge it penalizes the absolute values of the coefficients (an L1 penalty), which can shrink some coefficients exactly to zero and thus also performs variable selection.
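An illustrative coordinate-descent sketch of the lasso, on hypothetical data where only the first of five predictors matters; the penalty `alpha` and iteration count are assumptions. Note how the L1 penalty drives the irrelevant coefficients exactly to zero:

```python
import numpy as np

def soft_threshold(z, t):
    # Shrinks z toward zero by t; returns exactly 0 when |z| <= t
    return np.sign(z) * max(abs(z) - t, 0.0)

def lasso_cd(X, y, alpha, iters=200):
    # Lasso via cyclic coordinate descent (teaching sketch, not optimized)
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(iters):
        for j in range(p):
            r = y - X @ w + X[:, j] * w[j]          # partial residual
            rho = X[:, j] @ r / n
            w[j] = soft_threshold(rho, alpha) / (X[:, j] @ X[:, j] / n)
    return w

# Hypothetical data: only the first predictor drives y
rng = np.random.default_rng(3)
X = rng.normal(size=(200, 5))
y = 4 * X[:, 0] + rng.normal(0, 0.1, 200)

w = lasso_cd(X, y, alpha=1.0)
```

The four noise predictors end up with coefficients of exactly zero, while the informative one survives (slightly shrunk by the penalty).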

**7. ElasticNet Regression**

ElasticNet regression combines the ridge and lasso penalties; it is used when there is more than one dominant independent variable amidst a list of many correlated independent variables.
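A sketch of elastic net by coordinate descent, blending the L1 and L2 penalties, on hypothetical data with two correlated informative predictors; `alpha` and `l1_ratio` are assumed values. Both correlated predictors are retained while the noise predictors are dropped:

```python
import numpy as np

def elastic_net_cd(X, y, alpha, l1_ratio, iters=200):
    # Elastic net via cyclic coordinate descent (teaching sketch)
    n, p = X.shape
    w = np.zeros(p)
    l1, l2 = alpha * l1_ratio, alpha * (1 - l1_ratio)
    for _ in range(iters):
        for j in range(p):
            r = y - X @ w + X[:, j] * w[j]          # partial residual
            rho = X[:, j] @ r / n
            z = X[:, j] @ X[:, j] / n + l2          # L2 part in denominator
            w[j] = np.sign(rho) * max(abs(rho) - l1, 0.0) / z
    return w

# Hypothetical data: two correlated informative predictors + three noise ones
rng = np.random.default_rng(4)
x1 = rng.normal(size=300)
x2 = x1 + rng.normal(0, 0.05, 300)
noise = rng.normal(size=(300, 3))
X = np.column_stack([x1, x2, noise])
y = x1 + x2 + rng.normal(0, 0.1, 300)

w = elastic_net_cd(X, y, alpha=0.5, l1_ratio=0.5)
```

A pure lasso tends to pick just one of a correlated pair; the added ridge term spreads weight across both, which is the behavior elastic net was designed for.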

Along with these, seasonality and time-value factors are often used to determine which type of regression to apply.

**Example:**

A study of the impact of height on body weight found a relationship between height and estimated weight beyond a certain minimum height. The correlation was positive, just as logic suggests. The data were fitted with a linear regression, which works by minimizing the sum of squared errors.
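A sketch of such a study on hypothetical height/weight data; the coefficients and noise level below are assumptions chosen for illustration, not results from a real study:

```python
import numpy as np

# Hypothetical sample: height in cm, weight in kg, positive linear trend
rng = np.random.default_rng(5)
height = rng.uniform(150, 200, 80)
weight = -100 + 0.9 * height + rng.normal(0, 3, 80)   # assumed relationship

# Least-squares line and the correlation coefficient
slope, intercept = np.polyfit(height, weight, 1)
r = np.corrcoef(height, weight)[0, 1]
```

The fitted slope is close to the assumed 0.9 kg per cm, and the correlation comes out strongly positive, matching the study's qualitative finding.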

So in this article, we talked about the different types of regression that are available for us to study and whose concepts are worth knowing. I hope you have enjoyed reading this article; if you think any other topic would be beneficial to add to the above, please let us know your suggestions through the comments section.
