
mrdjohnson/llm-x: LLM X, the easiest third-party local LLM UI for the web!

Chrome Extension | Web/Mobile app

LLM X does not make any external API calls (go ahead, check your network tab and see the Fetch section). Your chats and image generations are 100% private. This site/app works completely offline.

LLM X (web app) will not connect to a server that is not secure. This means you can use LLM X on localhost (considered a secure context), but if you're trying to use llm-x over a network, the server must be served over HTTPS or it will not work.
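The rule above mirrors the browser's secure-context definition. As a sketch (not code from the llm-x codebase), a check like this captures which origins qualify: HTTPS origins plus localhost and loopback addresses.

```typescript
// Sketch of the browser's secure-context rule that LLM X relies on.
// https origins are secure; plain http is only secure for localhost/loopback.
function isSecureOrigin(url: string): boolean {
  const { protocol, hostname } = new URL(url);
  if (protocol === "https:") return true;
  return (
    protocol === "http:" &&
    (hostname === "localhost" ||
      hostname === "127.0.0.1" ||
      hostname === "[::1]")
  );
}
```

So `http://localhost:3000` works, while `http://192.168.1.20:11434` (an Ollama server on your LAN) does not; in a real page you can also just read `window.isSecureContext`.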

Guides:

- Prerequisites for application
- How to use web client (no install)
- Prerequisites for web client
- Prerequisites for chrome extension
- How to use from project source
- Prerequisites for project source

Demos (captions of the original screenshots/videos):

- Chrome extension mode with Google's on-device Gemini Nano
- Chrome extension mode with Ollama's llama3.2-vision
- Running Ollama and LM Studio at the same time
- A conversation about the logo
- An image generation example
- The omnibar and code rendering
- Code rendering in the light theme
- A response about a cat
- LaTeX support
- Another logo response

What is this? A ChatGPT-style UI for the niche group of folks who run Ollama (think of it as an offline ChatGPT server) locally. Supports sending and receiving images and text! Works offline through PWA (Progressive Web App) standards (it's not dead!).

Why do this? I have been interested in LLM UIs for a while now, and this seemed like a good intro application. I've also been introduced to a lot of modern technologies thanks to this project; it's been fun!

Why so many buzzwords? I couldn't help but be cool 😎

Tech stack (thank-yous):

Logic helpers:

UI Helpers:

Project setup helpers:

Inspiration: the ollama-ui project, which allows users to connect to Ollama via a web app.

Perplexity.ai: Perplexity has made some amazing advancements in the LLM UI space, and I have been very interested in getting to that point. Hopefully this starter project lets me get closer to doing something similar!

Getting started with local development

(Please note the minimum engine requirements in the package.json.)

Clone the project and run `yarn` in the root directory.

`yarn dev` starts a local instance and opens a browser tab under https:// (for PWA reasons).
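Putting the steps above together (assuming a standard clone; the repository URL is taken from the project page):

```shell
# Clone the repository and install dependencies (yarn is the package manager used here)
git clone https://github.com/mrdjohnson/llm-x.git
cd llm-x
yarn

# Start a local dev instance; it opens a browser tab over https:// for PWA support
yarn dev
```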

Thanks to TypeScript, Prettier, and Biome, LLM-X should be ready for all contributors!

