A variety of linear models.
User guide: see the Linear Models section for further details.
The following subsections are only rough guidelines: the same estimator can fall into multiple categories, depending on its parameters.
Linear classifiers#

Classical linear regressors#

Regressors with variable selection#

The following estimators have built-in variable selection fitting procedures, but any estimator using an L1 or elastic-net penalty also performs variable selection: typically SGDRegressor or SGDClassifier with an appropriate penalty.
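A minimal sketch of this idea, using synthetic data and an illustrative alpha value (neither comes from this page): an L1-penalized SGDRegressor drives the coefficients of uninformative features to zero, which amounts to variable selection.

import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.RandomState(0)
X = rng.randn(200, 10)
# Only the first two features carry signal; the rest are noise.
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.randn(200)

reg = SGDRegressor(penalty="l1", alpha=0.1, max_iter=2000, random_state=0)
reg.fit(X, y)

# Features whose coefficients survive the L1 penalty.
selected = np.flatnonzero(np.abs(reg.coef_) > 1e-3)
print("non-zero coefficients:", selected)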
Multi-task linear regressors with variable selection#

These estimators fit multiple regression problems (or tasks) jointly, while inducing sparse coefficients. While the inferred coefficients may differ between the tasks, they are constrained to agree on the features that are selected (non-zero coefficients).
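A minimal sketch with assumed synthetic data (not from this page): MultiTaskLasso fits two regression tasks jointly and selects the same features for both, even though the per-task weights differ.

import numpy as np
from sklearn.linear_model import MultiTaskLasso

rng = np.random.RandomState(0)
X = rng.randn(100, 8)
# Both tasks depend only on features 0 and 1, with different weights.
W = np.zeros((8, 2))
W[0] = [1.5, -1.0]
W[1] = [2.0, 0.5]
Y = X @ W + 0.1 * rng.randn(100, 2)

mtl = MultiTaskLasso(alpha=0.1).fit(X, Y)
# coef_ has shape (n_tasks, n_features); the non-zero columns coincide across tasks.
print(np.abs(mtl.coef_) > 1e-6)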
Outlier-robust regressors#

Any estimator using the Huber loss would also be robust to outliers, e.g., SGDRegressor with loss='huber'.
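A minimal sketch on assumed synthetic data with a few injected outliers (not from this page): both HuberRegressor and SGDRegressor with loss='huber' down-weight large residuals, so the fitted slope should stay close to the true value of 2 despite the contaminated samples.

import numpy as np
from sklearn.linear_model import HuberRegressor, SGDRegressor

rng = np.random.RandomState(0)
X = rng.randn(100, 1)
y = 2.0 * X[:, 0] + 0.1 * rng.randn(100)
y[:5] += 20.0  # a few gross outliers

huber = HuberRegressor().fit(X, y)
sgd_huber = SGDRegressor(loss="huber", epsilon=1.0, max_iter=2000,
                         random_state=0).fit(X, y)
# Both estimates should remain near the true slope of 2.
print(huber.coef_, sgd_huber.coef_)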
Generalized linear models (GLM) for regression#

These models allow for response variables to have error distributions other than a normal distribution.
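A minimal sketch with assumed synthetic count data (not from this page): PoissonRegressor models a target whose conditional distribution is Poisson rather than Gaussian, using a log link so predictions are non-negative expected counts.

import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.RandomState(0)
X = rng.randn(500, 3)
# Counts drawn from a Poisson distribution with a log-linear mean.
y = rng.poisson(np.exp(0.3 * X[:, 0] - 0.2 * X[:, 1]))

glm = PoissonRegressor(alpha=1e-3).fit(X, y)
print(glm.coef_)           # coefficients on the log scale of the mean
print(glm.predict(X[:5]))  # predicted expected counts, always non-negative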
Miscellaneous#