
Robocorp Toolkit

This notebook covers how to get started with Robocorp Action Server action toolkit and LangChain.

Robocorp is the easiest way to extend the capabilities of AI agents, assistants and copilots with custom actions.

Installation

First, see the Robocorp Quickstart on how to set up the Action Server and create your actions.

In your LangChain application, install the langchain-robocorp package:


%pip install --upgrade --quiet langchain-robocorp
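The usage example later in this guide also relies on an OpenAI chat model via langchain-openai; if that package is not already in your environment (an assumption about your setup, it is not part of this toolkit), it can be installed the same way:

%pip install --upgrade --quiet langchain-openai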

When you create a new Action Server following the quickstart above, it creates a directory of files, including action.py.

We can add Python functions as actions, as shown here. Let's add a dummy function to action.py:

from robocorp.actions import action


@action
def get_weather_forecast(city: str, days: int, scale: str = "celsius") -> str:
    """
    Returns weather conditions forecast for a given city.

    Args:
        city (str): Target city to get the weather conditions for
        days (int): Number of days of forecast to return
        scale (str): Temperature scale to use, should be one of "celsius" or "fahrenheit"

    Returns:
        str: The requested weather conditions forecast
    """
    return "75F and sunny :)"

We then start the server:
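Assuming the action-server CLI installed in the quickstart, this is typically run in the project directory:

action-server start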

And we can see:

Found new action: get_weather_forecast

Test locally by going to the server running at http://localhost:8080 and using the UI to run the function.

Environment Setup

Optionally you can set the following environment variables:
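For example, to enable LangSmith tracing of the agent runs (assumed here as a typical optional setup, not required by the toolkit):

export LANGCHAIN_TRACING_V2=true
export LANGCHAIN_API_KEY=<your-langsmith-api-key>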

Usage

We started the local Action Server above, running on http://localhost:8080.

from langchain.agents import AgentExecutor, OpenAIFunctionsAgent
from langchain_core.messages import SystemMessage
from langchain_openai import ChatOpenAI
from langchain_robocorp import ActionServerToolkit


# Chat model used by the agent
llm = ChatOpenAI(model="gpt-4", temperature=0)


# Expose the Action Server actions as LangChain tools
toolkit = ActionServerToolkit(url="http://localhost:8080", report_trace=True)
tools = toolkit.get_tools()


# Build an OpenAI functions agent that can call those tools
system_message = SystemMessage(content="You are a helpful assistant")
prompt = OpenAIFunctionsAgent.create_prompt(system_message)
agent = OpenAIFunctionsAgent(llm=llm, prompt=prompt, tools=tools)

executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

executor.invoke("What is the current weather today in San Francisco in fahrenheit?")


> Entering new AgentExecutor chain...

Invoking: `robocorp_action_server_get_weather_forecast` with `{'city': 'San Francisco', 'days': 1, 'scale': 'fahrenheit'}`


"75F and sunny :)"The current weather today in San Francisco is 75F and sunny.

> Finished chain.
{'input': 'What is the current weather today in San Francisco in fahrenheit?',
'output': 'The current weather today in San Francisco is 75F and sunny.'}
Single input tools

By default, toolkit.get_tools() returns the actions as structured tools.

To return single-input tools, pass a chat model to be used for processing the inputs:


toolkit = ActionServerToolkit(url="http://localhost:8080")
tools = toolkit.get_tools(llm=llm)
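As a minimal sketch (assuming the Action Server above is still running), a single-input tool then accepts one free-form string, which the supplied chat model maps to the action's structured arguments:

# Hypothetical invocation; the tool order depends on the actions your server exposes
tools[0].invoke("3 day weather forecast for San Francisco in fahrenheit")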
