qusaismael/localllm: Your LLM, Your Data, Your GUI.

A simple Flask-based web GUI for local inference with large language models (LLMs), using Ollama for model serving. The project is currently in its alpha phase and open to contributions. Created by @qusaismael.

If you'd like to support this project, consider donating via PayPal.

System Requirements & Recommendations

Installation

  1. Clone Repository

    git clone https://github.com/qusaismael/localllm.git
    cd localllm
  2. Setup Virtual Environment

    # Linux/macOS
    python3 -m venv venv
    source venv/bin/activate
    
    # Windows
    python -m venv venv
    venv\Scripts\activate
  3. Install Dependencies

    pip install -r requirements.txt
  4. Configure Ollama
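
    Ollama must be installed and its server running (it listens on port 11434 by default). As a minimal sketch, pull at least one model and confirm it is available; the model name here is just an example:

    # Download a model for Ollama to serve
    ollama pull llama3.2

    # List installed models to confirm
    ollama list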

Usage

  1. Start Server
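
    Assuming app.py is the project's entry point (it is the file referenced in the troubleshooting notes below):

    # Launch the Flask development server
    python app.py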

    Access at http://localhost:5000

  2. First-Time Setup

  3. Basic Operations

⚠️ Important Security Considerations:
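
Flask's built-in development server is not hardened for production use, so treat this app as a local tool: keep it bound to your own machine and do not expose port 5000 to untrusted networks. A minimal sketch of a loopback-only binding, assuming the usual Flask app object in app.py:

    # Hypothetical excerpt from app.py: serve only on localhost so the
    # GUI is unreachable from other machines on the network.
    if __name__ == "__main__":
        app.run(host="127.0.0.1", port=5000, debug=False)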

Common Issues:

  1. "Model not found" error

  2. Port conflict Modify PORT variable in app.py

  3. Slow responses

  4. Windows path issues Update OLLAMA_PATH in app.py to your installation path
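
For issues 2 and 4, the fix is a one-line edit near the top of app.py. A hypothetical sketch of those variables (5000 matches the default URL above; the Windows path is a placeholder for your actual install location):

    # Port the Flask GUI listens on; pick a free one if 5000 is taken
    PORT = 5000

    # Windows only: full path to the Ollama executable
    OLLAMA_PATH = r"C:\path\to\ollama.exe"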

Alpha Release
Current version: 0.1.0

Known Limitations:

Contributing

Welcome! Please follow these steps:

  1. Fork the repository
  2. Create a feature branch
  3. Submit a PR with a description of your changes (see the sketch below)
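
A typical version of that workflow (the branch name and commit message are placeholders):

    git checkout -b feature/my-change
    git commit -am "Short description of the change"
    git push origin feature/my-change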

Development Setup:

pip install -r requirements-dev.txt  # install development dependencies
pre-commit install                   # set up the pre-commit git hooks

Guidelines:

MIT License - See LICENSE for details

Created by @qusaismael
Open Source • Contributions Welcome!

