This minimalist UI is a simple interface for Ollama models, letting you chat with your models, save conversations, and switch between them easily. The tool is built with React, Next.js, and Tailwind CSS, with LangChain.js and Ollama providing the magic behind the scenes.
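As a rough sketch of the conversation handling described above (all names here are hypothetical; the app's actual implementation lives in its React state and may differ):

```typescript
// Hypothetical sketch of saving conversations and switching between them.
type Message = { role: "user" | "assistant"; content: string };

class ConversationStore {
  private conversations = new Map<string, Message[]>();
  private active: string | null = null;

  // Save (or update) a conversation under a given id.
  save(id: string, messages: Message[]): void {
    this.conversations.set(id, [...messages]);
  }

  // Switch to another saved conversation and return its messages.
  switchTo(id: string): Message[] {
    if (!this.conversations.has(id)) {
      throw new Error(`unknown conversation: ${id}`);
    }
    this.active = id;
    return this.conversations.get(id)!;
  }

  // Id of the currently active conversation, if any.
  current(): string | null {
    return this.active;
  }

  // List the ids of all saved conversations.
  list(): string[] {
    return [...this.conversations.keys()];
  }
}
```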
Download and run Ollama on your machine with `ollama serve` or `ollama run <model-name>` (it will run at http://localhost:11434/).
Open a new terminal, navigate to the root of this project, and install the dependencies with `npm install`.
Also check your Node.js version with `node -v`. If it is less than 14.0.1, you can update it as follows: install n (a Node.js version manager) using npm with `npm install -g n`, use n to install a specific Node.js version (e.g. `n 14.0.1`), then verify the Node.js version with `node -v`.
You can configure the Ollama base URL by copying `.env.example` to `.env.local` and setting the environment variable `NEXT_PUBLIC_OLLAMA_BASEURL`. If not set, the base URL will default to http://localhost:11434.
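A minimal sketch of that fallback behavior (the helper name is hypothetical; the app's actual code may differ):

```typescript
// Hypothetical helper illustrating the base-URL fallback: use
// NEXT_PUBLIC_OLLAMA_BASEURL from .env.local when set, otherwise
// fall back to Ollama's default local address.
function resolveOllamaBaseUrl(env: Record<string, string | undefined>): string {
  return env.NEXT_PUBLIC_OLLAMA_BASEURL ?? "http://localhost:11434";
}
```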
Start the development server with `npm run dev` (it should be available in your web browser at http://localhost:3000). If you encounter any issues, feel free to reach out!
This project is licensed under the MIT License. See the `LICENSE` file for details.