Regularization resources in MATLAB
Prevent statistical overfitting with regularization techniques
Regularization techniques are used to prevent statistical overfitting in a predictive model. By introducing additional information into the model, regularization algorithms can handle multicollinearity and redundant predictors, making the model more parsimonious and accurate. These algorithms typically work by applying a penalty for complexity, such as adding the magnitudes of the model coefficients to the minimization objective or including a roughness penalty.
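As a concrete illustration of a complexity penalty, the sketch below fits ridge regression to synthetic data with a nearly redundant predictor and traces how the coefficient estimates shrink as the penalty grows. It assumes the Statistics and Machine Learning Toolbox `ridge` function is available; the data and penalty values are made up for illustration.

```matlab
% Sketch: ridge regression on synthetic collinear data
% (assumes Statistics and Machine Learning Toolbox is installed).
rng(0);                                 % for reproducibility
X = randn(100, 5);
X(:,5) = X(:,1) + 0.01*randn(100,1);    % nearly redundant predictor
y = X*[2; -1; 0; 0; 1] + 0.5*randn(100,1);

k = 0:0.1:5;                            % ridge penalty values
B = ridge(y, X, k, 0);                  % 0 -> coefficients on original scale,
                                        %      intercept in first row
plot(k, B(2:end,:)')                    % coefficient trace vs. penalty
xlabel('Ridge parameter k')
ylabel('Coefficient estimate')
```

As the penalty `k` increases, the estimates for the two collinear predictors are pulled toward each other and toward zero, which is the stabilizing effect the paragraph above describes.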
Important regularization techniques include ridge regression (also known as Tikhonov regularization) and the lasso and elastic net algorithms. Diagnostics such as trace plots and cross-validated mean squared error help you choose the penalty strength, and you can apply the Akaike information criterion (AIC) as a goodness-of-fit metric.
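The workflow above can be sketched with the toolbox's `lasso` and `lassoPlot` functions: fit a lasso model over a range of penalties with 10-fold cross-validation, then inspect the cross-validated MSE and the coefficient trace. The synthetic data here are illustrative; setting the `'Alpha'` parameter between 0 and 1 would give an elastic net fit instead.

```matlab
% Sketch: lasso with 10-fold cross-validation
% (assumes Statistics and Machine Learning Toolbox is installed).
rng(1);
X = randn(200, 10);
y = X(:,[1 3]) * [3; -2] + randn(200,1);   % only two true predictors

[B, FitInfo] = lasso(X, y, 'CV', 10);      % B: one coefficient column per lambda
lassoPlot(B, FitInfo, 'PlotType', 'CV');   % cross-validated MSE curve
lassoPlot(B, FitInfo, 'PlotType', 'Lambda', 'XScale', 'log');  % trace plot

idx  = FitInfo.Index1SE;   % sparsest model within one SE of the minimum MSE
coef = B(:, idx);          % many entries are exactly zero
```

Because the lasso penalty drives some coefficients exactly to zero, the selected model drops the redundant predictors automatically, unlike ridge regression, which only shrinks them.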
For more information on regularization techniques, please see Statistics and Machine Learning Toolbox.
Examples and How To
Software Reference
- Regularized Least Squares Regression Using Lasso or Elastic Net Algorithms (Documentation)
- Trace Plot of a Lasso Fit (Documentation)
- Ridge Regression (Documentation)
- Akaike and Bayesian Information Criteria (Documentation)
See also: Statistics and Machine Learning Toolbox, Machine Learning