
benhamner/Metrics: Machine learning evaluation metrics, implemented in Python, R, Haskell, and MATLAB

Note: the current releases of this toolbox are beta releases, intended to test publishing to Haskell's, Python's, and R's package repositories.

Metrics provides implementations of various supervised machine learning evaluation metrics in the following languages:

- Python
- R
- Haskell
- MATLAB / Octave

For more detailed installation instructions, see the README for each implementation.
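As a quick orientation, here is a minimal usage sketch for the Python implementation. It assumes the package is the `ml_metrics` distribution from PyPI and that `rmse` and `mapk` are exposed at the top level; check the Python README if those assumptions don't hold.

```python
# Minimal usage sketch. Assumptions: the Python implementation installs from
# PyPI as `ml_metrics` (pip install ml_metrics) and exposes `rmse` and `mapk`
# at the top level.
import ml_metrics as metrics

# Root Mean Squared Error between actual and predicted regression targets
print(metrics.rmse([1.0, 2.0, 3.0], [1.1, 1.9, 3.2]))

# Mean Average Precision at K: per-query lists of relevant items (actual)
# and ranked predictions (predicted), truncated at k
print(metrics.mapk([[1, 2, 3]], [[1, 4, 2]], k=3))
```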

| Evaluation Metric | Python | R | Haskell | MATLAB / Octave |
|---|:---:|:---:|:---:|:---:|
| Absolute Error (AE) | ✓ | ✓ | ✓ | ✓ |
| Average Precision at K (APK, AP@K) | ✓ | ✓ | ✓ | ✓ |
| Area Under the ROC (AUC) | ✓ | ✓ | ✓ | ✓ |
| Classification Error (CE) | ✓ | ✓ | ✓ | ✓ |
| F1 Score (F1) | | | | ✓ |
| Gini | | | | ✓ |
| Levenshtein | ✓ | | ✓ | ✓ |
| Log Loss (LL) | ✓ | ✓ | ✓ | ✓ |
| Mean Log Loss (LogLoss) | ✓ | ✓ | ✓ | ✓ |
| Mean Absolute Error (MAE) | ✓ | ✓ | ✓ | ✓ |
| Mean Average Precision at K (MAPK, MAP@K) | ✓ | ✓ | ✓ | ✓ |
| Mean Quadratic Weighted Kappa | ✓ | ✓ | | ✓ |
| Mean Squared Error (MSE) | ✓ | ✓ | ✓ | ✓ |
| Mean Squared Log Error (MSLE) | ✓ | ✓ | ✓ | ✓ |
| Normalized Gini | | | | ✓ |
| Quadratic Weighted Kappa | ✓ | ✓ | | ✓ |
| Relative Absolute Error (RAE) | | | | ✓ |
| Root Mean Squared Error (RMSE) | ✓ | ✓ | ✓ | ✓ |
| Relative Squared Error (RSE) | | | | ✓ |
| Root Relative Squared Error (RRSE) | | | | ✓ |
| Root Mean Squared Log Error (RMSLE) | ✓ | ✓ | ✓ | ✓ |
| Squared Error (SE) | ✓ | ✓ | ✓ | ✓ |
| Squared Log Error (SLE) | ✓ | ✓ | ✓ | ✓ |
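To make one of the less self-explanatory entries concrete, here is a short sketch of Average Precision at K and Mean Average Precision at K using their standard definitions; the repo's implementations may differ in edge-case handling.

```python
# Sketch of AP@K / MAP@K from their standard definitions; the repo's
# implementations may differ in edge-case handling.

def apk(actual, predicted, k=10):
    """Average Precision at K: `actual` holds the relevant items, `predicted`
    is a ranked list; only the top k predictions are scored."""
    predicted = predicted[:k]
    score, num_hits = 0.0, 0
    for i, p in enumerate(predicted):
        if p in actual and p not in predicted[:i]:  # score each item once
            num_hits += 1
            score += num_hits / (i + 1.0)  # precision at rank i + 1
    if not actual:
        return 0.0
    return score / min(len(actual), k)


def mapk(actual, predicted, k=10):
    """Mean of AP@K across queries (paired lists in `actual`/`predicted`)."""
    return sum(apk(a, p, k) for a, p in zip(actual, predicted)) / len(actual)


print(apk([1, 2, 3], [1, 4, 2], k=3))  # 0.5555... (hits at ranks 1 and 3)
```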

HIGHER LEVEL TRANSFORMATIONS TO HANDLE (see the sketch at the end of this section)

PROPERTIES METRICS CAN HAVE

(Nonexhaustive and to be added in the future)
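As a hypothetical illustration of the kind of higher-level transformation meant above, the sketch below lifts any base metric to a group-by / reduce: compute the metric within each group, then average the per-group scores. The helper name `groupby_mean_metric` and its signature are illustrative assumptions, not code from this repo.

```python
# Hypothetical sketch of a group-by / reduce transformation over a base
# metric. `groupby_mean_metric` is illustrative, not part of this repo.
from collections import defaultdict

def mse(actual, predicted):
    """Mean Squared Error over paired lists."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

def groupby_mean_metric(metric, groups, actual, predicted):
    """Apply `metric` within each group, then average the per-group scores."""
    buckets = defaultdict(lambda: ([], []))
    for g, a, p in zip(groups, actual, predicted):
        buckets[g][0].append(a)
        buckets[g][1].append(p)
    scores = [metric(a, p) for a, p in buckets.values()]
    return sum(scores) / len(scores)

# Per-group MSE: group "x" scores 0.0, group "y" scores 4.0 -> mean 2.0
print(groupby_mean_metric(mse, ["x", "x", "y"], [1, 2, 3], [1, 2, 5]))
```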

