This project automates the process of selecting the best models, prompts, or inference parameters for a given use-case, allowing you to iterate over their combinations and to visually inspect the results.
It assumes Ollama is installed and serving endpoints, either on localhost or on a remote server.
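If you want to confirm that your Ollama server is reachable before running experiments, a quick call to its HTTP API is enough. Below is a minimal TypeScript sketch (not part of this project's code); the URL and the `/api/tags` endpoint are the standard Ollama defaults, so adjust the address if you use a remote server:

```typescript
// Minimal sketch (not part of this project's code): confirm that an Ollama
// server is reachable and list the models it serves. Requires Node 18+ for
// the built-in fetch. Swap the URL for your remote server if you use one.
const OLLAMA_URL = "http://localhost:11434";

async function listModels(): Promise<string[]> {
  // GET /api/tags returns the models installed on the Ollama server.
  const res = await fetch(`${OLLAMA_URL}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama did not respond correctly: HTTP ${res.status}`);
  }
  const data = await res.json();
  return data.models.map((m: { name: string }) => m.name);
}

listModels()
  .then((models) => console.log("Available models:", models))
  .catch((err) => console.error("Could not reach Ollama:", err));
```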
Here's what an experiment for a simple prompt, tested on 3 different models, looks like:
(For a more in-depth look at an evaluation process assisted by this tool, please check https://dezoito.github.io/2023/12/27/rust-ollama-grid-search.html).
Check the project's releases page, or the links on the sidebar.
Technically, the term "grid search" refers to iterating over a series of different model hyperparameters to optimize model performance, but that usually means parameters like `batch_size`, `learning_rate`, or `number_of_epochs`, which are more commonly used in training.
But the concept here is similar:
Let's define a selection of models, a prompt, and some parameter combinations:
The prompt will be submitted once for each parameter value, for each of the selected models, generating a set of responses.
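To make the analogy concrete, here is a hedged TypeScript sketch of that loop, written against Ollama's public `/api/generate` endpoint. The model names, prompt, and temperature grid below are placeholders for illustration, not defaults used by this app:

```typescript
// Illustrative grid-search loop (not this app's internal implementation):
// submit one prompt once per (model, temperature) combination and collect
// the responses for side-by-side comparison. Requires Node 18+ for fetch.
const OLLAMA_URL = "http://localhost:11434"; // or your remote Ollama server

const models = ["llama3.1:8b", "mistral:7b", "gemma2:9b"]; // example names
const temperatures = [0.2, 0.7, 1.0];                      // example grid
const prompt = "Explain the concept of grid search in one short paragraph.";

async function generate(model: string, temperature: number): Promise<string> {
  const res = await fetch(`${OLLAMA_URL}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      prompt,
      stream: false,            // wait for the complete answer
      options: { temperature }, // other Ollama options (top_k, top_p, ...) fit here too
    }),
  });
  const data = await res.json();
  return data.response;
}

async function main() {
  // One request per combination, exactly the grid the UI iterates over.
  for (const model of models) {
    for (const temperature of temperatures) {
      const answer = await generate(model, temperature);
      console.log(`--- ${model} @ temperature=${temperature} ---\n${answer}\n`);
    }
  }
}

main().catch(console.error);
```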
Similarly, you can perform A/B tests by selecting different models and comparing the results for the same prompt/parameter combination, or by testing different prompts under similar configurations:
Comparing the results of different prompts for the same model
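Outside the UI, the same kind of A/B comparison can be scripted directly against Ollama. A minimal sketch, assuming two hypothetical prompt variants and a fixed example model and temperature:

```typescript
// Illustrative A/B test (not this app's code): run two prompt variants
// against the same model and parameters, then compare the answers.
const OLLAMA_URL = "http://localhost:11434";
const model = "llama3.1:8b";          // example model name
const options = { temperature: 0.7 }; // keep parameters fixed across variants

const prompts: Record<string, string> = {
  A: "Summarize the plot of Dracula in two sentences.",
  B: "In exactly two sentences, summarize the plot of Dracula for a teenager.",
};

async function main() {
  for (const [label, prompt] of Object.entries(prompts)) {
    const res = await fetch(`${OLLAMA_URL}/api/generate`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, prompt, stream: false, options }),
    });
    const data = await res.json();
    console.log(`Prompt ${label}:\n${data.response}\n`);
  }
}

main().catch(console.error);
```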
You can save and manage your prompts (we want to make prompts compatible with Open WebUI)
You can autocomplete prompts by typing "/" (also inspired by Open WebUI):
You can list, inspect, or download your experiments:
For obvious bugs and spelling mistakes, please go ahead and submit a PR.
If you want to propose a new feature, change existing functionality, or suggest anything more complex, please open an issue for discussion before starting work on a PR.
The development notes provide setup instructions, sequence diagrams, and workflow charts that should make it easier to understand the project and get started.
The following works and theses have cited this repository:
Inouye, D., Lindo, L., Lee, R., & Allen, E. (2024). Applied Auto-tuning on LoRA Hyperparameters. Computer Science and Engineering Senior Theses, Santa Clara University. https://scholarcommons.scu.edu/cgi/viewcontent.cgi?article=1271&context=cseng_senior
Huge thanks to @FabianLars, @peperroni21 and @TomReidNZ.