A lightweight, no-frills RESTful API designed for managing knowledge bases and files stored in vector databases: no GPU, internet, or cloud services required! LocalRecall provides a simple and generic abstraction layer to handle knowledge retrieval, ideal for AI agents and chatbots to manage both long-term and short-term memory seamlessly.
Currently, LocalRecall is batteries included and supports a local vector store powered by Chromem, with plans to add additional vector stores such as Milvus and Qdrant. It can easily integrate with LocalAI, LocalAGI, and other agent frameworks, offering an intuitive web UI for convenient file management, including support for raw text inputs.
LocalAI is now part of a comprehensive suite of AI tools designed to work together:
LocalAI is the free, Open Source OpenAI alternative. It acts as a drop-in replacement REST API compatible with the OpenAI API specifications for local AI inferencing, and does not require a GPU.
A powerful Local AI agent management platform that serves as a drop-in replacement for OpenAI's Responses API, enhanced with advanced agentic capabilities.
```bash
git clone https://github.com/mudler/LocalRecall.git
cd LocalRecall
```
Your web UI will be available at http://localhost:8080.
Build and run using Docker:
```bash
docker build -t localrecall .
docker run -ti -v $PWD/state:/state \
  -e COLLECTION_DB_PATH=/state/db \
  -e EMBEDDING_MODEL=granite-embedding-107m-multilingual \
  -e FILE_ASSETS=/state/assets \
  -e OPENAI_API_KEY=sk-1234567890 \
  -e OPENAI_BASE_URL=http://localai:8080 \
  -p 8080:8080 localrecall

# Or use the images already built by the CI:
docker run -ti -v $PWD/state:/state \
  -e COLLECTION_DB_PATH=/state/db \
  -e EMBEDDING_MODEL=granite-embedding-107m-multilingual \
  -e FILE_ASSETS=/state/assets \
  -e OPENAI_API_KEY=sk-1234567890 \
  -e OPENAI_BASE_URL=http://localai:8080 \
  -p 8080:8080 quay.io/mudler/localrecall
```
or with Docker compose
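A minimal `docker-compose.yml` sketch, reusing the placeholder values from the `docker run` example above (the `localai` hostname and the embedding model are assumptions you should adapt to your setup):

```yaml
services:
  localrecall:
    image: quay.io/mudler/localrecall
    ports:
      - "8080:8080"
    volumes:
      - ./state:/state
    environment:
      - COLLECTION_DB_PATH=/state/db
      - EMBEDDING_MODEL=granite-embedding-107m-multilingual
      - FILE_ASSETS=/state/assets
      - OPENAI_API_KEY=sk-1234567890
      - OPENAI_BASE_URL=http://localai:8080
```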
LocalRecall uses environment variables to configure its behavior. These variables allow you to customize paths, models, and integration settings without modifying the code.
| Variable | Description |
|---|---|
| `COLLECTION_DB_PATH` | Path to the vector database directory where collections are stored. |
| `EMBEDDING_MODEL` | Name of the embedding model used for vectorization (e.g., `granite-embedding-107m-multilingual`). |
| `FILE_ASSETS` | Directory path to store and retrieve uploaded file assets. |
| `OPENAI_API_KEY` | API key for embedding services (such as LocalAI or OpenAI-compatible APIs). |
| `OPENAI_BASE_URL` | Base URL for the embedding model API (commonly `http://localai:8080`). |
| `LISTENING_ADDRESS` | Address the server listens on (default: `:8080`). Useful for deployments on custom ports or network interfaces. |
| `VECTOR_ENGINE` | Vector database engine to use (`chromem` by default; support for others like Milvus and Qdrant is planned). |
| `MAX_CHUNKING_SIZE` | Maximum size (in characters) for breaking documents into chunks. Affects performance and accuracy. |
| `API_KEYS` | Comma-separated list of API keys for securing access to the REST API (optional). |
| `GIT_PRIVATE_KEY` | Base64-encoded SSH private key for accessing private Git repositories (optional). |
These variables can be passed directly when running the binary or inside your Docker container for easy configuration.
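To illustrate what `MAX_CHUNKING_SIZE` controls, here is a rough sketch of character-based chunking that breaks on word boundaries. This is an illustration only, not LocalRecall's actual chunking implementation:

```python
def chunk_text(text: str, max_size: int) -> list[str]:
    """Split text into chunks of at most max_size characters,
    breaking on whitespace where possible."""
    chunks, current = [], ""
    for word in text.split():
        if current and len(current) + 1 + len(word) > max_size:
            chunks.append(current)
            current = word
        else:
            current = f"{current} {word}" if current else word
    if current:
        chunks.append(current)
    return chunks

doc = "LocalRecall stores document chunks in a local vector database."
for chunk in chunk_text(doc, 20):
    print(chunk)
```

A smaller chunk size yields more precise matches but more vector entries per document; a larger one keeps more context per chunk at the cost of retrieval granularity.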
You can use an `.env` file to set the variables. The Docker Compose file is configured to use an `.env` file in the root of the project if available.
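An illustrative `.env` file with the same placeholder values used in the Docker examples above:

```env
COLLECTION_DB_PATH=/state/db
EMBEDDING_MODEL=granite-embedding-107m-multilingual
FILE_ASSETS=/state/assets
OPENAI_API_KEY=sk-1234567890
OPENAI_BASE_URL=http://localai:8080
LISTENING_ADDRESS=:8080
VECTOR_ENGINE=chromem
```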
Base URL: `http://localhost:8080/api`
```bash
curl -X POST $BASE_URL/collections \
  -H "Content-Type: application/json" \
  -d '{"name":"myCollection"}'
```
```bash
curl -X POST $BASE_URL/collections/myCollection/upload \
  -F "file=@/path/to/file.txt"
```
```bash
curl -X GET $BASE_URL/collections
```
```bash
curl -X GET $BASE_URL/collections/myCollection/entries
```
```bash
curl -X POST $BASE_URL/collections/myCollection/search \
  -H "Content-Type: application/json" \
  -d '{"query":"search term", "max_results":5}'
```
```bash
curl -X POST $BASE_URL/collections/myCollection/reset
```
```bash
curl -X DELETE $BASE_URL/collections/myCollection/entry/delete \
  -H "Content-Type: application/json" \
  -d '{"entry":"file.txt"}'
```
```bash
curl -X POST $BASE_URL/collections/myCollection/sources \
  -H "Content-Type: application/json" \
  -d '{"url":"https://example.com", "update_interval":30}'
```
The `update_interval` is specified in minutes. If not provided, it defaults to 60 minutes.
External sources support various URL types:
For private Git repositories, set the `GIT_PRIVATE_KEY` environment variable with a base64-encoded SSH private key:
```bash
# Encode your private key
export GIT_PRIVATE_KEY=$(cat /path/to/private_key | base64 -w 0)
```
```bash
curl -X DELETE $BASE_URL/collections/myCollection/sources \
  -H "Content-Type: application/json" \
  -d '{"url":"https://example.com"}'
```
External sources are automatically monitored and updated in the background. The content is periodically fetched and added to the collection, making it searchable through the regular search endpoint.
Released under the MIT License.
We welcome contributions! Please feel free to: