Showing content from http://reference.wolfram.com/language/guide/MachineLearningMethods.html below:

Machine Learning Methods—Wolfram Documentation


The Wolfram Language offers a rich selection of machine learning methods to perform regression, classification, clustering, dimensionality reduction and more.

Classification (Classify)

"ClassDistributions" use learned distributions

"DecisionTree" use a decision tree

"GradientBoostedTrees" use an ensemble of trees trained with gradient boosting

"LogisticRegression" use probabilities from linear combinations of features

"Markov" use a Markov model on the feature sequence (only for text, bag of tokens, ...)

"NaiveBayes" classify by assuming probabilistic independence of features

"NearestNeighbors" use nearest-neighbor examples

"NeuralNetwork" use artificial neural networks

"RandomForest" use BreimanCutler ensembles of decision trees

"SupportVectorMachine" use a support vector machine

Regression (Predict)

"DecisionTree" use a decision tree

"GradientBoostedTrees" use an ensemble of trees trained with gradient boosting

"LinearRegression" use a linear combination of features

"NearestNeighbors" use nearest-neighbor examples

"NeuralNetwork" use artificial neural networks

"RandomForest" use BreimanCutler ensembles of decision trees

"GaussianProcess" use a Gaussian process prior over functions

Clustering (FindClusters)

"Agglomerate" single linkage clustering algorithm

"DBSCAN" density-based spatial clustering of applications with noise

"GaussianMixture" use a mixture of Gaussian (normal) distributions

"JarvisPatrick" JarvisPatrick clustering algorithm

"KMeans" k-means clustering algorithm

"KMedoids" partitioning around medoids

"MeanShift" mean-shift clustering algorithm

"NeighborhoodContraction" shift data points toward high-density regions

"SpanningTree" minimum spanning treebased clustering algorithm

"Spectral" spectral clustering algorithm

Distribution Modeling (LearnDistribution)

"ContingencyTable" discretize data and store each possible probability

"DecisionTree" use a decision tree

"GaussianMixture" use a mixture of Gaussian (normal) distributions

"KernelDensityEstimation" use a kernel mixture distribution

"Multinormal" use a multivariate normal (Gaussian) distribution

Dimensionality Reduction (DimensionReduction)

"Autoencoder" use a trainable autoencoder

"Hadamard" project data using a Hadamard matrix

"Isomap" isometric mapping

"LatentSemanticAnalysis" latent semantic analysis method

"Linear" automatically choose the best linear method

"LLE" locally linear embedding

"PrincipalComponentsAnalysis" principal components analysis method

"MultidimensionalScaling" metric multidimensional scaling

"TSNE" t-distributed stochastic neighbor embedding algorithm

"UMAP" uniform manifold approximation and projection

