LocalAI is designed to bridge the gap where traditional tools fall short—offering access to web content, local files, and on-the-go resource fetching.
If you value privacy but still want the power of LLMs, LocalAI is for you. There’s no need to install applications, deal with configurations, or handle version compatibility issues. It works seamlessly with Chrome, Firefox, and their derivatives.
You’ll need sufficiently powerful hardware, preferably with a GPU.
A local AI tool; these are easy to set up, usually just download and run:
> [!WARNING]
> The following API providers are deprecated for now and will only be reinstated upon request.
- LM Studio (preferred)
- Mozilla LlamaFile
You’ll also need the browser extension.
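Once the local tool is running, it can help to confirm its API is reachable before connecting the extension. The sketch below is a minimal check, assuming the tool exposes an OpenAI-compatible endpoint; the base URL shown is LM Studio's usual default (`http://localhost:1234/v1`) and may differ for your setup or for other tools.

```ts
// Minimal connectivity check for a local OpenAI-compatible server.
// Assumption: LM Studio's local server typically listens on port 1234;
// adjust BASE_URL to whatever your tool reports in its settings.
const BASE_URL = "http://localhost:1234/v1";

async function listLocalModels(): Promise<void> {
  // GET /models is part of the OpenAI-compatible API surface that
  // local tools such as LM Studio commonly expose.
  const response = await fetch(`${BASE_URL}/models`);
  if (!response.ok) {
    throw new Error(`Local server responded with status ${response.status}`);
  }
  const body = await response.json();
  console.log("Models available locally:", body);
}

listLocalModels().catch((err) =>
  console.error("Could not reach the local AI tool:", err),
);
```

If this prints a model list, the local tool is up and the extension should be able to talk to it.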
It’s easy—give it a try and let us know your thoughts!
> [!IMPORTANT]
> If you are a Firefox user, please read this.
Here is the documentation, where you can get a deeper understanding.