C#
public sealed class FastForestBinaryTrainer : Microsoft.ML.Trainers.FastTree.RandomForestTrainerBase<Microsoft.ML.Trainers.FastTree.FastForestBinaryTrainer.Options,Microsoft.ML.Data.BinaryPredictionTransformer<Microsoft.ML.Trainers.FastTree.FastForestBinaryModelParameters>,Microsoft.ML.Trainers.FastTree.FastForestBinaryModelParameters>

F#
type FastForestBinaryTrainer = class
    inherit RandomForestTrainerBase<FastForestBinaryTrainer.Options, BinaryPredictionTransformer<FastForestBinaryModelParameters>, FastForestBinaryModelParameters>

VB
Public NotInheritable Class FastForestBinaryTrainer
Inherits RandomForestTrainerBase(Of FastForestBinaryTrainer.Options, BinaryPredictionTransformer(Of FastForestBinaryModelParameters), FastForestBinaryModelParameters)
To create this trainer, use FastForest or FastForest(Options).
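As a sketch of how this trainer is typically wired up (the column names and option values below are illustrative assumptions, not requirements):

```csharp
using Microsoft.ML;
using Microsoft.ML.Trainers.FastTree;

var mlContext = new MLContext(seed: 0);

// Simple form: assumes columns named "Label" and "Features" exist in the data.
var trainer = mlContext.BinaryClassification.Trainers.FastForest(
    labelColumnName: "Label",
    featureColumnName: "Features");

// Options form: these option values are illustrative, not tuned recommendations.
var options = new FastForestBinaryTrainer.Options
{
    LabelColumnName = "Label",
    FeatureColumnName = "Features",
    NumberOfTrees = 100,
    NumberOfLeaves = 20,
    MinimumExampleCountPerLeaf = 10
};
var trainerWithOptions = mlContext.BinaryClassification.Trainers.FastForest(options);
```

Either trainer can then be appended to a data-preparation pipeline and fitted with `Fit(trainingData)`.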
Input and Output Columns

The input label column data must be Boolean. The input features column data must be a known-sized vector of Single.
This trainer outputs the following columns:
Output Column Name | Column Type | Description
Score              | Single      | The unbounded score that was calculated by the model.
PredictedLabel     | Boolean     | The predicted label, based on the sign of the score. A negative score maps to false and a positive score maps to true.
Probability        | Single      | The probability calculated by calibrating the score of having true as the label. The probability value is in the range [0, 1].

Trainer Characteristics

Machine learning task                      | Binary classification
Is normalization required?                 | No
Is caching required?                       | No
Required NuGet in addition to Microsoft.ML | Microsoft.ML.FastTree
Exportable to ONNX                         | Yes

Training Algorithm Details
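The three output columns above map naturally onto a prediction class for use with a `PredictionEngine`; the class name here is an illustrative assumption, while the property names match the documented output columns:

```csharp
using Microsoft.ML.Data;

// Illustrative prediction class: properties correspond to the
// output columns produced by this trainer.
public class FastForestPrediction
{
    // Unbounded raw score calculated by the model.
    public float Score { get; set; }

    // Sign of the score: negative maps to false, positive to true.
    [ColumnName("PredictedLabel")]
    public bool PredictedLabel { get; set; }

    // Calibrated probability of the positive (true) label, in [0, 1].
    public float Probability { get; set; }
}
```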
Decision trees are non-parametric models that perform a sequence of simple tests on inputs. This decision procedure maps them to outputs found in the training dataset whose inputs were similar to the instance being processed. A decision is made at each node of the binary tree data structure based on a measure of similarity that maps each instance recursively through the branches of the tree until the appropriate leaf node is reached and the output decision returned.
Decision trees have several advantages:

- They are efficient in both computation and memory usage during training and prediction.
- They can represent non-linear decision boundaries.
- They perform integrated feature selection and classification.
- They are resilient in the presence of noisy features.
Fast forest is a random forest implementation. The model consists of an ensemble of decision trees, each of which outputs a Gaussian distribution as its prediction. An aggregation is performed over the ensemble to find the Gaussian distribution closest to the combined distribution of all trees in the model. Generally, ensemble models provide better coverage and accuracy than single decision trees.
For more information, see:

- Wikipedia: Random forest
- Quantile regression forest
- From Stumps to Trees to Forests
Check the See also section for links to usage examples.