Plot individual and voting regression predictions — scikit-learn 1.8.dev0 documentation


Plot individual and voting regression predictions

A voting regressor is an ensemble meta-estimator that fits several base regressors, each on the whole dataset, and then averages their individual predictions to form a final prediction. We will use three different regressors to predict the data: GradientBoostingRegressor, RandomForestRegressor, and LinearRegression. These three regressors will then be combined into a VotingRegressor.
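Conceptually, the ensemble output is just the (optionally weighted) mean of the base predictions. As a minimal sketch of that averaging, assuming three small hypothetical prediction arrays p1, p2 and p3 that are not taken from this example:

import numpy as np

p1 = np.array([100.0, 150.0])
p2 = np.array([110.0, 140.0])
p3 = np.array([90.0, 160.0])

# Equal weights reproduce the plain mean, which is what VotingRegressor uses by default
weights = [1.0, 1.0, 1.0]
ensemble = np.average(np.column_stack([p1, p2, p3]), axis=1, weights=weights)
print(ensemble)  # [100. 150.]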

Finally, we will plot the predictions made by all models for comparison.

We will work with the diabetes dataset, which consists of 10 features collected from a cohort of diabetes patients. The target is a quantitative measure of disease progression one year after baseline.
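If you want to see what the data looks like, its shape and feature names can be inspected directly; a quick, optional check (not part of the original example), assuming the standard load_diabetes loader:

from sklearn.datasets import load_diabetes

X, y = load_diabetes(return_X_y=True)
print(X.shape)                        # (442, 10): 442 patients, 10 baseline features
print(load_diabetes().feature_names)  # ['age', 'sex', 'bmi', 'bp', 's1', ..., 's6']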

Training regressors

First, we will load the diabetes dataset and instantiate a gradient boosting regressor, a random forest regressor, and a linear regression. Next, we will use the three regressors to build the voting regressor:

import matplotlib.pyplot as plt

from sklearn.datasets import load_diabetes
from sklearn.ensemble import (
    GradientBoostingRegressor,
    RandomForestRegressor,
    VotingRegressor,
)
from sklearn.linear_model import LinearRegression

X, y = load_diabetes(return_X_y=True)

# Train regressors
reg1 = GradientBoostingRegressor(random_state=1)
reg2 = RandomForestRegressor(random_state=1)
reg3 = LinearRegression()

reg1.fit(X, y)
reg2.fit(X, y)
reg3.fit(X, y)

ereg = VotingRegressor([("gb", reg1), ("rf", reg2), ("lr", reg3)])
ereg.fit(X, y)
VotingRegressor(estimators=[('gb', GradientBoostingRegressor(random_state=1)),
                            ('rf', RandomForestRegressor(random_state=1)),
                            ('lr', LinearRegression())])
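As an alternative to the rendered diagram, the fitted base models can be inspected programmatically through the ensemble's named_estimators_ attribute; a short optional check (not part of the original example):

# Each fitted base regressor is reachable under the name given in the estimators list
print(ereg.named_estimators_["lr"].coef_.shape)   # (10,): one coefficient per feature
print(ereg.named_estimators_["rf"].n_estimators)  # 100 trees in the random forest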
Making predictions

Now we will use each of the regressors to make predictions on the first 20 samples.

xt = X[:20]

pred1 = reg1.predict(xt)
pred2 = reg2.predict(xt)
pred3 = reg3.predict(xt)
pred4 = ereg.predict(xt)
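Because VotingRegressor uses equal weights by default, its output should simply be the mean of the three individual predictions; a short optional sanity check (not part of the original example):

import numpy as np

# With default (equal) weights, the ensemble prediction is the plain mean of the base predictions
assert np.allclose(pred4, (pred1 + pred2 + pred3) / 3)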
Plot the results

Finally, we will visualize the 20 predictions. The red stars show the average prediction made by VotingRegressor.

plt.figure()
plt.plot(pred1, "gd", label="GradientBoostingRegressor")
plt.plot(pred2, "b^", label="RandomForestRegressor")
plt.plot(pred3, "ys", label="LinearRegression")
plt.plot(pred4, "r*", ms=10, label="VotingRegressor")

plt.tick_params(axis="x", which="both", bottom=False, top=False, labelbottom=False)
plt.ylabel("predicted")
plt.xlabel("training samples")
plt.legend(loc="best")
plt.title("Regressor predictions and their average")

plt.show()

Total running time of the script: (0 minutes 1.364 seconds)


