Workflows

A Workflow in LlamaIndex is a lightweight, event-driven abstraction used to chain together several events. Workflows are made up of handlers, with each one responsible for processing specific event types and emitting new events.

Workflows are designed to be flexible and can be used to build agents, RAG flows, extraction flows, or anything else you want to implement.

Let's explore a simple workflow example where a joke is generated and then critiqued and iterated on:

import { openai } from "@llamaindex/openai";
import {
  createStatefulMiddleware,
  createWorkflow,
  workflowEvent,
} from "@llamaindex/workflow";

// Create LLM instance
const llm = openai({ model: "gpt-4.1-mini" });

// Define our workflow events
const startEvent = workflowEvent<string>(); // Input topic for joke
const jokeEvent = workflowEvent<{ joke: string }>(); // Intermediate joke
const critiqueEvent = workflowEvent<{ joke: string; critique: string }>(); // Intermediate critique
const resultEvent = workflowEvent<{ joke: string; critique: string }>(); // Final joke + critique

// Create our workflow
const { withState, getContext } = createStatefulMiddleware(() => ({
  numIterations: 0,
  maxIterations: 3,
}));
const jokeFlow = withState(createWorkflow());

// Define handlers for each step
jokeFlow.handle([startEvent], async (event) => {
  // Prompt the LLM to write a joke
  const prompt = `Write your best joke about ${event.data}. Write the joke between <joke> and </joke> tags.`;
  const response = await llm.complete({ prompt });

  // Parse the joke from the response
  const joke =
    response.text.match(/<joke>([\s\S]*?)<\/joke>/)?.[1]?.trim() ??
    response.text;
  return jokeEvent.with({ joke: joke });
});

jokeFlow.handle([jokeEvent], async (event) => {
  // Prompt the LLM to critique the joke
  const prompt = `Give a thorough critique of the following joke. If the joke needs improvement, put "IMPROVE" somewhere in the critique: ${event.data.joke}`;
  const response = await llm.complete({ prompt });

  // If the critique includes "IMPROVE", keep iterating, else, return the result
  if (response.text.includes("IMPROVE")) {
    return critiqueEvent.with({
      joke: event.data.joke,
      critique: response.text,
    });
  }

  return resultEvent.with({ joke: event.data.joke, critique: response.text });
});

jokeFlow.handle([critiqueEvent], async (event) => {
  // Keep track of the number of iterations
  const state = getContext().state;
  state.numIterations++;

  // Write a new joke based on the previous joke and critique
  const prompt = `Write a new joke based on the following critique and the original joke. Write the joke between <joke> and </joke> tags.\n\nJoke: ${event.data.joke}\n\nCritique: ${event.data.critique}`;
  const response = await llm.complete({ prompt });

  // Parse the joke from the response
  const joke =
    response.text.match(/<joke>([\s\S]*?)<\/joke>/)?.[1]?.trim() ??
    response.text;

  // If we've done less than the max number of iterations, keep iterating
  // else, return the result
  if (state.numIterations < state.maxIterations) {
    return jokeEvent.with({ joke: joke });
  }

  return resultEvent.with({ joke: joke, critique: event.data.critique });
});

// Usage
async function main() {
  const { stream, sendEvent } = jokeFlow.createContext();
  sendEvent(startEvent.with("pirates"));

  let result: { joke: string; critique: string } | undefined;

  for await (const event of stream) {
    // console.log(event.data);  optionally log the event data
    if (resultEvent.include(event)) {
      result = event.data;
      break; // Stop when we get the final result
    }
  }

  console.log(result);
}

main().catch(console.error);

There are a few moving pieces here, so let's go through this step by step.

Defining Workflow Events
const startEvent = workflowEvent<string>(); // Input topic for joke
const jokeEvent = workflowEvent<{ joke: string }>(); // Intermediate joke
const critiqueEvent = workflowEvent<{ joke: string; critique: string }>(); // Intermediate critique
const resultEvent = workflowEvent<{ joke: string; critique: string }>(); // Final joke + critique

Events are defined using the workflowEvent function and carry arbitrary data described by a generic type parameter. In this example, we have four events:

- startEvent: the topic to write a joke about (the workflow input)
- jokeEvent: an intermediate joke to be critiqued
- critiqueEvent: an intermediate joke together with a critique requesting improvement
- resultEvent: the final joke and its critique
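To make the event API concrete, here is a small standalone sketch (not part of the tutorial flow) using only calls that appear in this example: .with() wraps typed data in an event instance, and .include() checks at runtime whether an event belongs to a given type.

// A quick illustration of the event API used throughout this tutorial
const draft = jokeEvent.with({ joke: "Why did the llama cross the road?" });

if (jokeEvent.include(draft)) {
  // Inside this check we know the event carries { joke: string } data
  console.log(draft.data.joke);
}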

Setting up the Workflow with Stateful Middleware
const { withState, getContext } = createStatefulMiddleware(() => ({
  numIterations: 0,
  maxIterations: 3,
}));
const jokeFlow = withState(createWorkflow());

Our workflow is created with the createWorkflow() function and enhanced with the withState middleware. This middleware provides shared state across all handlers, which in this case tracks:

- numIterations: how many improvement iterations have run so far (starting at 0)
- maxIterations: the maximum number of iterations allowed (3 here)

This state is accessible inside any handler via getContext().state.
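Because the state is produced by a factory function, a reasonable reading (worth verifying against the Workflows documentation) is that each call to createContext() creates its own copy of this state, so separate runs do not share iteration counters. A minimal sketch under that assumption:

// Two independent runs of the same workflow; each context is assumed to get
// its own { numIterations, maxIterations } state from the factory above.
const runA = jokeFlow.createContext();
const runB = jokeFlow.createContext();

runA.sendEvent(startEvent.with("pirates"));
runB.sendEvent(startEvent.with("ninjas"));
// Handlers in each run read and update their own state via getContext().state.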

Adding Handlers with Loops

We have three key handlers in our workflow:

  1. The first handler processes the startEvent, generates an initial joke, and emits a jokeEvent:
jokeFlow.handle([startEvent], async (event) => {
  // Prompt the LLM to write a joke
  const prompt = `Write your best joke about ${event.data}. Write the joke between <joke> and </joke> tags.`;
  const response = await llm.complete({ prompt });
 
  // Parse the joke from the response
  const joke =
    response.text.match(/<joke>([\s\S]*?)<\/joke>/)?.[1]?.trim() ??
    response.text;
  return jokeEvent.with({ joke: joke });
});
  2. The second handler processes the jokeEvent, critiques the joke, and either emits a critiqueEvent requesting an improved joke (when the critique contains "IMPROVE") or emits a resultEvent with the final joke and critique:
jokeFlow.handle([jokeEvent], async (event) => {
  // Prompt the LLM to critique the joke
  const prompt = `Give a thorough critique of the following joke. If the joke needs improvement, put "IMPROVE" somewhere in the critique: ${event.data.joke}`;
  const response = await llm.complete({ prompt });
 
  // If the critique includes "IMPROVE", keep iterating, else, return the result
  if (response.text.includes("IMPROVE")) {
    return critiqueEvent.with({
      joke: event.data.joke,
      critique: response.text,
    });
  }
 
  return resultEvent.with({ joke: event.data.joke, critique: response.text });
});
  3. The third handler processes the critiqueEvent, writes an improved joke based on the critique, and either emits a new jokeEvent to keep iterating or emits a resultEvent once the maximum number of iterations has been reached:
jokeFlow.handle([critiqueEvent], async (event) => {
  // Keep track of the number of iterations
  const state = getContext().state;
  state.numIterations++;
 
  // Write a new joke based on the previous joke and critique
  const prompt = `Write a new joke based on the following critique and the original joke. Write the joke between <joke> and </joke> tags.\n\nJoke: ${event.data.joke}\n\nCritique: ${event.data.critique}`;
  const response = await llm.complete({ prompt });
 
  // Parse the joke from the response
  const joke =
    response.text.match(/<joke>([\s\S]*?)<\/joke>/)?.[1]?.trim() ??
    response.text;
 
  // If we've done less than the max number of iterations, keep iterating
  // else, return the result
  if (state.numIterations < state.maxIterations) {
    return jokeEvent.with({ joke: joke });
  }
 
  return resultEvent.with({ joke: joke, critique: event.data.critique });
});

Running the Workflow
async function main() {
  const { stream, sendEvent } = jokeFlow.createContext();
  sendEvent(startEvent.with("pirates"));

  let result: { joke: string; critique: string } | undefined;

  for await (const event of stream) {
    // console.log(event.data);  optionally log the event data
    if (resultEvent.include(event)) {
      result = event.data;
      break; // Stop when we get the final result
    }
  }
  
  console.log(result);
}

To run the workflow, we:

  1. Create a workflow context with createContext()
  2. Trigger the initial event with sendEvent()
  3. Listen to the event stream and process events as they arrive
  4. Use include() to check if an event is of a specific type
  5. Break the loop when we receive our final result

Using Stream Utilities

The stream returned by createContext contains utility functions to make working with event streams easier:

// Create a workflow context and send the initial event
const { stream, sendEvent } = jokeFlow.createContext();
sendEvent(startEvent.with("pirates"));

// Collect all events until we get a resultEvent
const allEvents = await stream.until(resultEvent).toArray(); 

// The last event will be the resultEvent
const finalEvent = allEvents.at(-1);
console.log(finalEvent?.data); // Output the joke and critique

The stream utilities make it easier to work with the asynchronous event flow. In this example, we use:

- until(resultEvent) to read events from the stream until the final result arrives
- toArray() to collect those events into an array so we can inspect them

You can combine these utilities with other stream operators like filter and map to create powerful processing pipelines.
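For example, here is a minimal sketch that collects every intermediate joke draft produced before the final result. It assumes filter takes a predicate and map a transform function, as is typical for stream APIs; check the Workflows documentation for the exact operator signatures.

// Assumed signatures: filter(predicate) and map(transform), as mentioned above
const { stream, sendEvent } = jokeFlow.createContext();
sendEvent(startEvent.with("pirates"));

const drafts = await stream
  .until(resultEvent)                           // stop once the final result arrives
  .filter((event) => jokeEvent.include(event))  // keep only intermediate jokes
  .map((event) => event.data)                   // unwrap the event payloads
  .toArray();

console.log(drafts); // one { joke } payload per draft the workflow produced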

To learn more about workflows, check out the Workflows documentation.

