langchain-ai/open-canvas: 📃 A better UX for chat, writing content, and coding with LLMs.

TRY IT OUT HERE

Open Canvas is an open source web application for collaborating with agents to better write documents. It is inspired by OpenAI's "Canvas", but with a few key differences.

  1. Open Source: All the code, from the frontend to the content generation agent to the reflection agent, is open source and MIT licensed.
  2. Built-in memory: Open Canvas ships out of the box with a reflection agent which stores style rules and user insights in a shared memory store. This allows Open Canvas to remember facts about you across sessions.
  3. Start from existing documents: Open Canvas allows you to start with a blank text or code editor in the language of your choice, so you can begin the session with your existing content instead of being forced to start with a chat interaction. We believe this is an ideal UX because you will often already have some content to start with and want to iterate on top of it.

This guide will cover how to set up and run Open Canvas locally. If you prefer a YouTube video guide, check out this video.

Open Canvas requires the following API keys and external services:

First, clone the repository:

git clone https://github.com/langchain-ai/open-canvas.git
cd open-canvas

Next, install the dependencies:
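
The install command itself isn't captured in this snapshot; assuming yarn (the dev commands later in this guide use yarn), installing from the root of the repository would look like:

# Install all workspace dependencies from the repository root
yarn install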

After installing dependencies, copy both .env.example files (one in the root of the project and one in apps/web) to .env and set the required values:

# The root `.env` file will be read by the LangGraph server for the agents.
cp .env.example .env
# The `apps/web/.env` file will be read by the frontend.
cd apps/web/
cp .env.example .env

Then, set up authentication with Supabase.

After creating a Supabase account, visit your dashboard and create a new project.

Next, navigate to the Project Settings page inside your project, and then to the API tab. Copy the Project URL and anon public project API key. Paste them into the NEXT_PUBLIC_SUPABASE_URL and NEXT_PUBLIC_SUPABASE_ANON_KEY environment variables in the apps/web/.env file.
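
For reference, those two entries in apps/web/.env would look something like the snippet below (the values are placeholders, not real credentials):

# Found under Project Settings -> API in the Supabase dashboard
NEXT_PUBLIC_SUPABASE_URL="https://your-project-ref.supabase.co"
NEXT_PUBLIC_SUPABASE_ANON_KEY="your-anon-public-api-key"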

After this, navigate to the Authentication page, and then the Providers tab. Make sure Email is enabled (also ensure you've enabled Confirm Email). You may also enable GitHub and/or Google if you'd like to use those for authentication (see these pages for documentation on how to set up each provider: GitHub, Google).

To verify authentication works, run yarn dev and visit localhost:3000. This should redirect you to the login page. From here, you can either log in with Google or GitHub, or, if you did not configure these providers, navigate to the signup page and create a new account with an email and password. This should then redirect you to a confirmation page, and after confirming your email you should be redirected to the home page.
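
If it helps, that check might look like this (the frontend dev script is run from apps/web):

cd apps/web
yarn dev
# then open http://localhost:3000 in your browser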

The first step to running Open Canvas locally is to build the application. This is because Open Canvas uses a monorepo setup and requires workspace dependencies to be built so other packages/apps can access them.

Run the following command from the root of the repository:
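
The command itself isn't captured in this snapshot; assuming the standard yarn workspace setup used throughout this guide, the build step would be:

# Build all workspace packages so apps can resolve them
yarn build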

Now we'll cover how to set up and run the LangGraph server locally.

Navigate to apps/agents and run yarn dev (this runs npx @langchain/langgraph-cli dev --port 54367).

Ready!
- 🚀 API: http://localhost:54367
- 🎨 Studio UI: https://smith.langchain.com/studio?baseUrl=http://localhost:54367

After your LangGraph server is running, execute the following command inside apps/web to start the Open Canvas frontend:
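
The exact command isn't captured in this snapshot; as with the authentication check above, it is presumably the package's dev script:

cd apps/web
yarn dev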

On initial load, compilation may take a little bit of time.

Then, open localhost:3000 with your browser and start interacting!

Open Canvas is designed to be compatible with any LLM. The current deployment has the following models configured:

If you'd like to add a new model, follow these simple steps:

  1. Add to or update the model provider variables in packages/shared/src/models.ts.
  2. Install the necessary package for the provider (e.g. @langchain/anthropic) inside apps/agents.
  3. Update the getModelConfig function in apps/agents/src/agent/utils.ts to include an if statement for your new model name and provider (a rough sketch follows this list).
  4. Manually test by checking you can:
    • 4a. Generate a new artifact
    • 4b. Generate a followup message (happens automatically after generating an artifact)
    • 4c. Update an artifact via a message in chat
    • 4d. Update an artifact via a quick action
    • 4e. Repeat for text/code (ensure both work)
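
As a rough sketch of step 3, an added branch in getModelConfig for a hypothetical provider might look like the following. The provider name, model prefix, and returned fields are illustrative assumptions; mirror the existing cases in apps/agents/src/agent/utils.ts for the real shape of the config object.

// Sketch only: "my-provider" and the config fields below are hypothetical placeholders.
// Add a branch like this inside getModelConfig in apps/agents/src/agent/utils.ts.
if (modelName.startsWith("my-provider/")) {
  return {
    provider: "my-provider",
    modelName: modelName.replace("my-provider/", ""),
    // Read the provider's API key from the environment (variable name is an assumption).
    apiKey: process.env.MY_PROVIDER_API_KEY,
  };
}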

Open Canvas supports calling local LLMs running on Ollama. This is not enabled in the hosted version of Open Canvas, but you can use this in your own local/deployed Open Canvas instance.

To use a local Ollama model, first ensure you have Ollama installed and have pulled a model that supports tool calling (the default model is llama3.3).

Next, start the Ollama server by running ollama run llama3.3.

Then, set the NEXT_PUBLIC_OLLAMA_ENABLED environment variable to true, and the OLLAMA_API_URL environment variable to the URL of your Ollama server (this defaults to http://host.docker.internal:11434; if you did not set a custom port when starting your Ollama server, you should not need to set this variable).
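
Putting that together, the relevant .env entries might look like this (the URL shown is simply the documented default):

NEXT_PUBLIC_OLLAMA_ENABLED="true"
OLLAMA_API_URL="http://host.docker.internal:11434"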

Note

Open source LLMs are typically not as good at instruction following as proprietary models like GPT-4o or Claude Sonnet. Because of this, you may experience errors or unexpected behavior when using local LLMs.

Below are some common issues you may run into when running Open Canvas yourself:

Below is a list of features we'd like to add to Open Canvas in the near future:

Do you have a feature request? Please open an issue!

We'd like to continue developing and improving Open Canvas, and want your help!

To start, there are a handful of GitHub issues with feature requests outlining improvements and additions to make the app's UX even better. There are three main labels:

If you have questions about contributing, please reach out to me via email: brace(at)langchain(dot)dev. For general bugs/issues with the code, please open an issue on GitHub.

