No speedup. MacBook Pro 13, M1, 16GB, Ollama, orca-mini.
The plugin lets you open a context menu on selected text and pick an AI assistant action.
The most casual AI assistant for Obsidian.
Also works with images.
No speedup. MacBook Pro 13, M1, 16GB, Ollama, bakllava.
It can also use context from links, backlinks, and even PDF files (RAG).
1. Install an embedding model:
ollama pull nomic-embed-text (fastest)
ollama pull bge-m3 (slower, but more accurate)
2. Select the embedding provider in the plugin's settings and try to use the largest model with the largest context window.
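The embedding model powers the RAG step above: your notes are turned into vectors, and the most similar ones are retrieved as context for the prompt. A minimal sketch of that retrieval idea, using hand-made toy vectors instead of a real embedding model (this is an illustration, not the plugin's actual code):

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Pretend these vectors came from an embedding model such as nomic-embed-text
# (real embeddings have hundreds of dimensions; 3 is enough to show the idea).
notes = {
    "meeting notes": [0.9, 0.1, 0.0],
    "recipe": [0.0, 0.2, 0.9],
}
query = [0.8, 0.2, 0.1]  # embedding of the user's question

# The best-matching note would be injected into the LLM prompt as context.
best = max(notes, key=lambda name: cosine_similarity(query, notes[name]))
print(best)
```

A larger model with a larger context window helps here because more retrieved text fits into the prompt at once.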
You can also add your own actions, share the best ones, or pick one up from the community.
This plugin is available in the Obsidian community plugin store: https://obsidian.md/plugins?id=local-gpt
You can also install this plugin via BRAT: pfrankov/obsidian-local-gpt
You also need to install the AI Providers plugin from the community plugin store to configure AI providers: https://obsidian.md/plugins?id=ai-providers
3. Configure AI Providers: follow the instructions in the AI Providers plugin.
4. Configure an Obsidian hotkey: click the + icon and press your hotkey (e.g. ⌘ + M).
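If you prefer editing files, Obsidian stores hotkeys in .obsidian/hotkeys.json inside your vault. A sketch of such an entry; the command id local-gpt:context-menu is an assumption here, so check the real id in Settings → Hotkeys ("Mod" maps to ⌘ on macOS and Ctrl elsewhere):

```json
{
  "local-gpt:context-menu": [
    { "modifiers": ["Mod"], "key": "M" }
  ]
}
```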