ContingencyTable—Wolfram Language Documentation
METHOD
"ContingencyTable" (Machine Learning Method)
- Method for LearnDistribution.
- Use a table to store the probability of each possible outcome of a nominal vector.
Details & Suboptions
- A contingency table models the probability distribution of a nominal vector space by storing a probability value for each possible outcome.
- If the data is unidimensional, the distribution corresponds to a categorical distribution.
- The following options can be given:
- "AdditiveSmoothing" Automatic value to be added to each count
- If the data contains numerical values, they are discretized. The resulting distribution is still a valid distribution in the original space.
- Information[LearnedDistribution[…],"MethodOption"] can be used to extract the values of options chosen by the automation system.
- LearnDistribution[…,FeatureExtractor->"Minimal"] can be used to remove most preprocessing and directly access the method.
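Taking the option description above at face value, a minimal sketch of the usual additive-smoothing estimate is shown below; the function name and counts are made up for illustration and this is not the documented implementation. Adding a smoothing value a to every count pulls the estimated probabilities toward uniform:

    smoothedProbabilities[counts_Association, a_] :=
      (counts + a)/(Total[counts] + a*Length[counts])

    smoothedProbabilities[<|"A" -> 3, "B" -> 1|>, 0]   (* <|"A" -> 3/4, "B" -> 1/4|> *)
    smoothedProbabilities[<|"A" -> 3, "B" -> 1|>, 1]   (* <|"A" -> 2/3, "B" -> 1/3|> *)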
Examples
Basic Examples (3)
Train a contingency-table distribution on a nominal dataset:
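The original input cell is not preserved here; a plausible version, with an illustrative dataset and variable name, is:

    dist = LearnDistribution[{"A", "A", "A", "B", "A", "B"},
      Method -> "ContingencyTable"]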
Look at the distribution Information:
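Continuing with the dist defined in the previous sketch:

    Information[dist]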
Obtain options information:
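The method options chosen by the automation system can be read back with the "MethodOption" property mentioned in the details:

    Information[dist, "MethodOption"]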
Obtain an option value directly:
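A single option can presumably also be queried by name (the property name below is assumed to match the suboption name):

    Information[dist, "AdditiveSmoothing"]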
Compute the probabilities for the values "A" and "B":
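With the learned distribution, PDF gives the probability of each nominal value:

    PDF[dist, "A"]
    PDF[dist, "B"]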
Generate new samples:
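Sampling uses RandomVariate; the sample count is arbitrary:

    RandomVariate[dist, 10]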
Train a contingency-table distribution on a numeric dataset:
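A possible version of this step, with made-up numeric data (numerical values are discretized internally, per the details above):

    numericData = {1.2, 2.3, 1.5, 1.7, 2.1, 1.4, 2.8, 1.1, 1.9, 2.5};
    dist2 = LearnDistribution[numericData, Method -> "ContingencyTable"]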
Look at the distribution Information:
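As before:

    Information[dist2]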
Compute the probability density for a new example:
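The density at a new point (the query value is arbitrary):

    PDF[dist2, 1.6]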
Plot the PDF along with the training data:
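One way to reproduce this kind of plot (the plot range and styling are guesses):

    Show[
      Plot[PDF[dist2, x], {x, 0.5, 3.5}, Filling -> Axis],
      ListPlot[Thread[{numericData, 0}], PlotStyle -> Red]
    ]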
Generate and visualize new samples:
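For example:

    samples = RandomVariate[dist2, 500];
    Histogram[samples]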
Train a contingency-table distribution on a two-dimensional dataset:
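A sketch using synthetic correlated data (the dataset is illustrative):

    data2D = RandomVariate[BinormalDistribution[0.7], 200];
    dist3 = LearnDistribution[data2D, Method -> "ContingencyTable"]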
Plot the PDF along with the training data:
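A contour plot of the learned density with the training points overlaid (the plot ranges are guesses):

    Show[
      ContourPlot[PDF[dist3, {x, y}], {x, -3, 3}, {y, -3, 3}],
      ListPlot[data2D, PlotStyle -> Red]
    ]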
Use SynthesizeMissingValues to impute missing values using the learned distribution:
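A hedged sketch, assuming SynthesizeMissingValues accepts the learned distribution as its second argument (the incomplete examples are made up):

    SynthesizeMissingValues[{{1.4, Missing[]}, {Missing[], -0.5}}, dist3]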
Options (1)
"AdditiveSmoothing" (1)
Train a contingency-table distribution on a nominal dataset without any smoothing:
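Disabling smoothing presumably means passing the suboption with value 0 (dataset and variable name are illustrative):

    distNoSmoothing = LearnDistribution[{"A", "A", "A", "B"},
      Method -> {"ContingencyTable", "AdditiveSmoothing" -> 0}]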
Compute the probabilities for the values "A" and "B":
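Without smoothing, the probabilities should simply be the empirical frequencies:

    PDF[distNoSmoothing, "A"]   (* empirical frequency of "A", i.e. 0.75 for the data above *)
    PDF[distNoSmoothing, "B"]   (* 0.25 *)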
Compare with the probabilities obtained after adding 1 and 10 counts to each outcome:
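Larger smoothing values pull the two probabilities toward each other; the comparison below trains one distribution per smoothing value (a sketch, with the same illustrative dataset):

    Table[
      With[{d = LearnDistribution[{"A", "A", "A", "B"},
          Method -> {"ContingencyTable", "AdditiveSmoothing" -> s}]},
        {s, PDF[d, "A"], PDF[d, "B"]}],
      {s, {1, 10}}]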