A simple chatbot web application built in Go, Python, Node.js, and Rust that connects to a local LLM service (llama.cpp) to provide AI-powered responses.
The application uses the following environment variables, defined in the .env file:

LLM_BASE_URL: The base URL of the LLM API
LLM_MODEL_NAME: The model name to use

To change these settings, simply edit the .env file in the root directory of the project.
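For reference, a .env file for this setup might look like the lines below. The values are illustrative assumptions only; the actual base URL and model name depend on how your local llama.cpp service is exposed.

LLM_BASE_URL=http://localhost:12434/engines/llama.cpp/v1
LLM_MODEL_NAME=ai/llama3.2:1B-Q8_0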
Clone the repository:
git clone https://github.com/docker/hello-genai
cd hello-genai
Run the application using the script:
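The script name is not shown above. Assuming the repository follows the common convention of a run.sh script in the project root, this step would be:

./run.sh

Check the repository root for the actual script name if this differs.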
Open your browser and visit the following links:
http://localhost:8080 for the GenAI Application in Go
http://localhost:8081 for the GenAI Application in Python
http://localhost:8082 for the GenAI Application in Node
http://localhost:8083 for the GenAI Application in Rust
If you're using a different LLM server configuration, you may need to modify the .env file.
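To illustrate how these settings are consumed, here is a minimal Go sketch (not the project's actual code) that reads LLM_BASE_URL and LLM_MODEL_NAME and sends an OpenAI-style chat completion request to the local llama.cpp server. The /chat/completions endpoint path and payload shape are assumptions based on llama.cpp's OpenAI-compatible API.

package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

func main() {
	// Read the settings that the .env file provides (values here are whatever
	// your environment defines; the examples above are illustrative only).
	baseURL := os.Getenv("LLM_BASE_URL")
	model := os.Getenv("LLM_MODEL_NAME")

	// Build an OpenAI-compatible chat completion payload.
	payload := map[string]any{
		"model": model,
		"messages": []map[string]string{
			{"role": "user", "content": "Hello!"},
		},
	}
	body, err := json.Marshal(payload)
	if err != nil {
		fmt.Fprintln(os.Stderr, "encoding request failed:", err)
		os.Exit(1)
	}

	// Post the request to the local LLM service.
	resp, err := http.Post(baseURL+"/chat/completions", "application/json", bytes.NewReader(body))
	if err != nil {
		fmt.Fprintln(os.Stderr, "request failed:", err)
		os.Exit(1)
	}
	defer resp.Body.Close()

	// Print the raw JSON response from the model.
	var out map[string]any
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		fmt.Fprintln(os.Stderr, "decoding response failed:", err)
		os.Exit(1)
	}
	fmt.Println(out)
}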