This project shows how to build a chat application with an LLM using LangChain and Blazor.
To run the chat with the default configuration you will need Docker
and an Ollama container. Follow the installation steps for the Ollama container. When everything is ready, pull the latest Mistral model:
docker exec -it ollama ollama pull mistral:latest
This will take a few minutes, depending on your internet speed.
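If you still need to set up the Ollama container itself, a minimal Docker Compose sketch looks like the following. This is an assumption based on the official `ollama/ollama` image (port 11434, models stored under `/root/.ollama`); adapt it to however you installed Ollama:

```yaml
# Minimal Ollama container setup (assumes the official ollama/ollama image).
services:
  ollama:
    image: ollama/ollama
    container_name: ollama        # matches the "docker exec -it ollama ..." command above
    ports:
      - "11434:11434"             # Ollama's default HTTP API port
    volumes:
      - ollama:/root/.ollama      # persist downloaded models across restarts

volumes:
  ollama:
```

Start it with `docker compose up -d`, then run the pull command above.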
Now clone the project into any folder you like:
git clone https://github.com/TesAnti/LangChainChat.git
Open it with the editor of your choice.
To customize the chat, open the LangChainConfigExtensions.cs
file. There you can change your model provider and chain, and add more models.
You can change the provider from local Ollama to ChatGPT or, pretty much, any model supported by LangChain. For more information, see the LangChain wiki.
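As a rough illustration, swapping the provider might look like the sketch below. The type names here (`OllamaProvider`, `OllamaChatModel`, `OpenAiProvider`, `OpenAiChatModel`) are assumptions modeled on the LangChain .NET provider packages; the actual API is whatever LangChainConfigExtensions.cs uses, so treat this as hypothetical:

```csharp
// Hypothetical sketch -- check LangChainConfigExtensions.cs for the real API.
using LangChain.Providers.Ollama;
using LangChain.Providers.OpenAI;

// Default: the local Ollama container with the mistral model pulled earlier.
var ollamaProvider = new OllamaProvider();            // assumes http://localhost:11434
var chatModel = new OllamaChatModel(ollamaProvider, id: "mistral:latest");

// Alternative: point the same chain at an OpenAI-backed model instead.
// var openAiProvider = new OpenAiProvider(apiKey: "YOUR_API_KEY");
// var chatModel = new OpenAiChatModel(openAiProvider, id: "gpt-3.5-turbo");
```

The rest of the chain stays the same; only the model construction changes.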