OpenLLM lets developers run any open-source LLM as an OpenAI-compatible API endpoint with a single command.
OpenLLM supports a wide range of open-source LLMs as well as serving users' own fine-tuned LLMs. Use the openllm model
command to see all available models that are pre-optimized for OpenLLM.
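Because the server speaks the OpenAI protocol, it can also be queried with the plain openai Python client. The sketch below is an assumption-laden illustration, not part of the OpenLLM docs: it assumes a server is already running on http://localhost:3000, that it exposes the standard /v1/models route, and that the "na" API key is only a placeholder (as in the wrapper example further down).

from openai import OpenAI

# Point the stock OpenAI client at the local OpenLLM server (assumed running).
client = OpenAI(base_url="http://localhost:3000/v1", api_key="na")

# Discover the served model via the standard /v1/models route (assumed available).
model_id = client.models.list().data[0].id

response = client.chat.completions.create(
    model=model_id,
    messages=[{"role": "user", "content": "Hello, what can you do?"}],
)
print(response.choices[0].message.content)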
There is an OpenLLM wrapper that supports interacting with a running OpenLLM server through its OpenAI-compatible endpoint:
from langchain_community.llms import OpenLLM

# Point the wrapper at the OpenAI-compatible endpoint of a running OpenLLM server.
llm = OpenLLM(base_url="http://localhost:3000/v1", api_key="na")

llm.invoke("What is the difference between a duck and a goose? And why are there so many geese in Canada?")