pfrankov/obsidian-local-gpt: Local Ollama and OpenAI-like GPT's assistance for maximum privacy and offline access

Local GPT plugin for Obsidian


Demo (no speedup): MacBook Pro 13, M1, 16GB, Ollama, orca-mini.

The plugin lets you open a context menu on selected text and pick an AI assistant action.
The most casual AI assistant for Obsidian.

Also works with images

Demo (no speedup): MacBook Pro 13, M1, 16GB, Ollama, bakllava.

It can also use context from links, backlinks, and even PDF files (RAG).

How to use (Ollama)

1. Install an embedding model (see the command-line sketch after this list).

2. Select the embedding provider in the plugin's settings and prefer the largest model with the largest context window your hardware can handle.
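
As a minimal sketch for step 1, assuming Ollama is already installed and its local server is running, an embedding model can be pulled with the Ollama CLI. The specific model names below (nomic-embed-text, bge-m3) are examples of embedding models from the Ollama library, not requirements of the plugin; pick one that fits your language and hardware.

    # Example only: assumes the Ollama CLI is installed and the local server is running.
    ollama pull nomic-embed-text   # compact, English-focused embedding model
    ollama pull bge-m3             # multilingual embedding model

Once pulled, the model should become available to the Ollama provider you configure later through the AI Providers plugin.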

You can also add your own actions, share your best ones, or get new ones from the community.

1. Install the Local GPT plugin

Obsidian plugin store (recommended)

This plugin is available in the Obsidian community plugin store: https://obsidian.md/plugins?id=local-gpt

You can also install this plugin via BRAT: pfrankov/obsidian-local-gpt

2. Install AI Providers Plugin

You also need to install the AI Providers plugin, which is used to configure AI providers, from the plugin store: https://obsidian.md/plugins?id=ai-providers

3. Configure AI Providers

Follow the instructions in the AI Providers plugin.

Configure Obsidian hotkey
  1. Open Obsidian Settings
  2. Go to Hotkeys
  3. Filter "Local" and you should see "Local GPT: Show context menu"
  4. Click on the + icon and press a hotkey (e.g. ⌘ + M)
My other Obsidian plugins
