# R Shiny Interface for Chatting with LLMs Offline via Ollama
Experience seamless, private, and offline AI conversations right on your machine! `shiny.ollama` provides a user-friendly R Shiny interface for interacting with LLMs locally, powered by Ollama.
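For context, a Shiny front end like this one talks to Ollama's local REST API. The sketch below is not the package's internal code; it assumes the default endpoint `http://localhost:11434`, the `httr2` package, and a locally pulled model named `llama3` (swap in whichever model you have):

```r
library(httr2)

# Send one non-streaming chat request to a local Ollama server
resp <- request("http://localhost:11434/api/chat") |>
  req_body_json(list(
    model = "llama3",  # assumed model name; must already be pulled
    messages = list(list(role = "user", content = "Hello!")),
    stream = FALSE
  )) |>
  req_perform()

# The assistant's reply is in the "message" field of the JSON response
resp_body_json(resp)$message$content
```

The Shiny app wraps this kind of request/response loop in a chat UI so you never have to build the payloads yourself.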
**Important:** `shiny.ollama` requires Ollama to be installed on your system. Without it, this package will not function. Follow the installation guide below to set up Ollama first.
## Installation

From CRAN (stable version `0.1.1`):

```r
install.packages("shiny.ollama")
```

From GitHub (latest development version `0.1.2`):

```r
# Install devtools if not already installed
install.packages("devtools")
devtools::install_github("ineelhere/shiny.ollama")
```
## Usage

Launch the Shiny app in R with:

```r
library(shiny.ollama)

# Start the application
shiny.ollama::run_app()
```

## Core Features (All Versions)
## Development Version (0.1.2)

Customize your LLM interaction with adjustable parameters:
## Ollama Installation Guide

To use this package, install Ollama first:

If successful, the version number will be displayed.
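Before launching the app, you can also confirm from R that a local Ollama server is actually up. This is a hedged sketch in base R: it assumes Ollama's default address `http://localhost:11434`, whose root endpoint returns the plain-text message `Ollama is running`:

```r
# Check whether a local Ollama server responds (default port 11434 assumed)
ollama_running <- tryCatch({
  con <- url("http://localhost:11434/")
  txt <- readLines(con, warn = FALSE)
  close(con)
  identical(txt, "Ollama is running")
}, error = function(e) FALSE)

if (!ollama_running) {
  message("Ollama does not appear to be running; install or start it first.")
}
```

If this reports that Ollama is not running, start the Ollama service (or reinstall it) before calling `shiny.ollama::run_app()`.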
This R package is an independent, passion-driven open-source initiative, released under the Apache License 2.0. It is not affiliated with, owned by, funded by, or influenced by any external organization. The project is dedicated to fostering a community of developers who share a love for coding and collaborative innovation.
Contributions, feedback, and feature requests are always welcome!
Stay tuned for more updates. 🚀