Prevent statistical overfitting with regularization techniques
Regularization techniques are used to prevent statistical overfitting in a predictive model. By introducing additional information into the model, regularization algorithms can handle multicollinearity and redundant predictors, making the model more parsimonious and accurate. These algorithms typically work by penalizing model complexity, for example by adding a penalty on the magnitudes of the model coefficients to the minimization objective, or by including a roughness penalty.
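For concreteness, the three most common coefficient penalties can be written as penalized least-squares objectives. This is one standard parameterization; exact scaling conventions vary between implementations:

```latex
% lambda >= 0 sets the penalty strength; for the elastic net,
% alpha in (0,1) blends the L1 and L2 penalties
\min_{\beta} \; \|y - X\beta\|_2^2 + \lambda \|\beta\|_2^2   % ridge
\min_{\beta} \; \|y - X\beta\|_2^2 + \lambda \|\beta\|_1     % lasso
\min_{\beta} \; \|y - X\beta\|_2^2
   + \lambda \bigl(\alpha\|\beta\|_1 + \tfrac{1-\alpha}{2}\|\beta\|_2^2\bigr)  % elastic net
```

The L2 (ridge) penalty shrinks correlated coefficients toward each other, while the L1 (lasso) penalty can drive coefficients exactly to zero, removing redundant predictors from the model.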
Techniques and algorithms important for regularization include ridge regression (also known as Tikhonov regularization) and the lasso and elastic net algorithms. Diagnostics such as trace plots and cross-validated mean squared error help in choosing the penalty strength, and the Akaike Information Criterion (AIC) can also be applied as a goodness-of-fit metric.
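As a minimal sketch of how these pieces fit together, the example below fits an elastic net with 10-fold cross-validation using the toolbox's `lasso` and `lassoPlot` functions. The synthetic data, variable names, and parameter choices are illustrative assumptions, not from the source:

```matlab
% Synthetic regression data with deliberately redundant predictors
rng default                                   % for reproducibility
X = randn(100,20);                            % 20 predictors
X(:,11:20) = X(:,1:10) + 0.1*randn(100,10);   % induce multicollinearity
y = X(:,1) - 2*X(:,3) + randn(100,1);

% Alpha = 0.5 blends the lasso (L1) and ridge (L2) penalties;
% 'CV',10 computes 10-fold cross-validated MSE across the lambda path
[B,FitInfo] = lasso(X,y,'Alpha',0.5,'CV',10);

% Coefficients at the lambda minimizing cross-validated MSE
bestB = B(:,FitInfo.IndexMinMSE);

% Trace plot: coefficient paths as a function of lambda
lassoPlot(B,FitInfo,'PlotType','Lambda','XScale','log');
```

With cross-validation enabled, `FitInfo` also records `Lambda1SE`, the largest penalty whose error is within one standard error of the minimum, which is a common, more conservative choice than the MSE-minimizing lambda.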
For more information on regularization techniques, see Statistics and Machine Learning Toolbox.
Examples and How To
Software Reference
- Regularized Least Squares Regression Using Lasso or Elastic Net Algorithms (Documentation)
- Trace Plot of a Lasso Fit (Documentation)
- Ridge Regression (Documentation)
- Akaike and Bayesian Information Criteria (Documentation)
See also: Statistics and Machine Learning Toolbox, Machine Learning