Description
Linear Regression, a.k.a. Ordinary Least Squares (OLS), is one of the simplest and most widely used ML algorithms. But it suffers from a fatal flaw: it is very easy for the algorithm to overfit the training data.
Summary
- Plain Linear Regression overfits so easily that you might as well ditch it; Ridge Regression was designed to address this problem.
- With large coefficients, the model can predict nearly anything: take the relevant combination of the individual slopes (βs) and you get the answer.
- Ideally, the perfect model would have low bias and low variance but that is easier said than done.
- In addition to the OLS loss (the first part of the objective), Ridge Regression adds the sum of the squared slopes of the feature variables, scaled by some number 𝜆.
- This is called the Ridge Regression penalty.
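The penalty described above can be sketched in a few lines of NumPy. This is a minimal illustration (not a production implementation): `ridge_fit` solves the closed-form ridge objective, minimizing the OLS loss plus 𝜆 times the sum of squared slopes, and the toy dataset with two nearly collinear features shows how the penalty keeps the βs small where plain OLS lets them blow up. The function name, 𝜆 value, and dataset are assumptions chosen for the demo.

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: beta = (X'X + lam*I)^-1 X'y.
    lam * I is the Ridge penalty term; lam=0 recovers plain OLS."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

# Toy data: two nearly identical (collinear) features, true slopes (1, 1).
rng = np.random.default_rng(0)
x1 = rng.normal(size=50)
X = np.column_stack([x1, x1 + 1e-6 * rng.normal(size=50)])
y = X @ np.array([1.0, 1.0]) + 0.1 * rng.normal(size=50)

beta_ols = ridge_fit(X, y, lam=0.0)    # unpenalized: slopes blow up
beta_ridge = ridge_fit(X, y, lam=1.0)  # penalized: slopes stay near (1, 1)
print("OLS slopes:  ", beta_ols)
print("Ridge slopes:", beta_ridge)
```

Because the two columns are almost identical, OLS can trade a huge positive slope on one for a huge negative slope on the other; the 𝜆-scaled squared-slope penalty makes that trade expensive, which is exactly the overfitting protection the summary describes.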