Alpaca is an Ollama client where you can manage and chat with multiple models. Alpaca provides an easy and beginner-friendly way of interacting with local AI; everything is open source and powered by Ollama.
You can also use third-party AI providers such as Gemini, ChatGPT, and more!
Warning
This project is not affiliated with Ollama in any way. I'm not responsible for any damage to your device or software caused by running code given by any AI models.
Important
Please be aware that the GNOME Code of Conduct applies to Alpaca; read it before interacting with this repository.
Warning
AI-generated issues and PRs will be denied, and repeated offenses will result in a ban from the repository. AI can be a useful tool, but I don't want Alpaca to be vibe-developed. Thanks.
Note
Available since Alpaca 6.0.0
Note
It uses the default model from the most recently used instance to generate the messages.
Quick Ask is a mini mode you can use to have a quick, temporary chat that isn't saved as a full chat.
Great for asking something, getting a response, and moving on with your work.
flatpak run com.jeffser.Alpaca --quick-ask
Note
Available since Alpaca 7.0.0
Live Chat is a great way of having a live conversation with your models as if you were in a call with them.
Great for natural conversations with models and roleplay.
flatpak run com.jeffser.Alpaca --live-chat
You can add the respective command as a keyboard shortcut in your system settings to quickly access Alpaca at any time!
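As a rough sketch (assuming a GNOME desktop; the custom0 slot and the <Super>a binding are arbitrary examples, and other desktops have their own shortcut settings), you could register the Quick Ask command from a terminal like this:

# Hypothetical example: bind Quick Ask to Super+A using GNOME's custom-keybinding schema.
# Note: the second command replaces the list of custom keybindings, so merge with any existing entries.
KEYPATH=/org/gnome/settings-daemon/plugins/media-keys/custom-keybindings/custom0/
gsettings set org.gnome.settings-daemon.plugins.media-keys custom-keybindings "['$KEYPATH']"
gsettings set org.gnome.settings-daemon.plugins.media-keys.custom-keybinding:$KEYPATH name 'Alpaca Quick Ask'
gsettings set org.gnome.settings-daemon.plugins.media-keys.custom-keybinding:$KEYPATH command 'flatpak run com.jeffser.Alpaca --quick-ask'
gsettings set org.gnome.settings-daemon.plugins.media-keys.custom-keybinding:$KEYPATH binding '<Super>a'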
Want to add a language? Visit this discussion to get started!
If you want to package Alpaca using a different packaging method, please read this wiki page.