LangChain tools that use Prolog rules to generate answers.

Overview
The PrologTool class allows the creation of LangChain tools that answer queries by evaluating Prolog rules.
Setup
Let's use the following Prolog rules in the file family.pl:
parent(john, bianca, mary).
parent(john, bianca, michael).
parent(peter, patricia, jennifer).
partner(X, Y) :- parent(X, Y, _).
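To make the semantics of these facts concrete, here is a pure-Python sketch (not part of langchain-prolog) that mimics how Prolog resolves a parent/3 query, with None standing in for an unbound variable:

```python
# Illustrative only: the parent/3 facts from family.pl as Python tuples.
FACTS = [
    ("john", "bianca", "mary"),
    ("john", "bianca", "michael"),
    ("peter", "patricia", "jennifer"),
]

def query_parent(men=None, women=None, child=None):
    """Return all parent/3 facts matching the bound arguments.

    None behaves like a Prolog variable: it matches anything.
    """
    return [
        {"men": m, "women": w, "child": c}
        for (m, w, c) in FACTS
        if (men is None or men == m)
        and (women is None or women == w)
        and (child is None or child == c)
    ]

# parent(john, X, Y) yields two solutions, one per child.
print(query_parent(men="john"))
```

This is only a model of the query semantics; the actual tool delegates resolution to a Prolog engine.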
from langchain_prolog import PrologConfig, PrologRunnable, PrologTool
TEST_SCRIPT = "family.pl"
Instantiation
First create the Prolog tool:
schema = PrologRunnable.create_schema("parent", ["men", "women", "child"])
config = PrologConfig(
rules_file=TEST_SCRIPT,
query_schema=schema,
)
prolog_tool = PrologTool(
prolog_config=config,
name="family_query",
description="""
Query family relationships using Prolog.
parent(X, Y, Z) implies only that Z is a child of X and Y.
Input can be a query string like 'parent(john, X, Y)' or 'john, X, Y'.
You have to specify 3 parameters: men, women, child. Do not use quotes.
""",
)
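To see how the named schema parameters relate to a Prolog query, here is a hypothetical helper (not the actual langchain-prolog implementation, which may differ) that renders bound arguments as atoms and unbound (None) arguments as Prolog variables:

```python
# Hypothetical sketch: how named arguments could map to a Prolog query string.
def to_prolog_query(predicate, args):
    """Render a predicate call; bound values become atoms,
    None slots become capitalized Prolog variables."""
    rendered = [
        value if value is not None else name.capitalize()
        for name, value in args.items()
    ]
    return f"{predicate}({', '.join(rendered)})"

print(to_prolog_query("parent", {"men": "john", "women": None, "child": None}))
# -> parent(john, Women, Child)
```

This mirrors the variable names (Women, Child) that appear in the tool's solutions below.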
Invocation

Using a Prolog tool with an LLM and function calling
from dotenv import find_dotenv, load_dotenv
load_dotenv(find_dotenv(), override=True)
from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI
To use the tool, bind it to the LLM model:
llm = ChatOpenAI(model="gpt-4o-mini")
llm_with_tools = llm.bind_tools([prolog_tool])
and then query the model:
query = "Who are John's children?"
messages = [HumanMessage(query)]
response = llm_with_tools.invoke(messages)
The LLM will respond with a tool call request:
messages.append(response)
response.tool_calls[0]
{'name': 'family_query',
'args': {'men': 'john', 'women': None, 'child': None},
'id': 'call_gH8rWamYXITrkfvRP2s5pkbF',
'type': 'tool_call'}
The tool takes this request and queries the Prolog database:
tool_msg = prolog_tool.invoke(response.tool_calls[0])
The tool returns a list with all the solutions for the query:
messages.append(tool_msg)
tool_msg
ToolMessage(content='[{"Women": "bianca", "Child": "mary"}, {"Women": "bianca", "Child": "michael"}]', name='family_query', tool_call_id='call_gH8rWamYXITrkfvRP2s5pkbF')
We then pass this to the LLM, which answers the original query using the tool response:
answer = llm_with_tools.invoke(messages)
print(answer.content)
John has two children: Mary and Michael, with Bianca as their mother.
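The ToolMessage content shown above is a JSON-encoded list of solutions, so it can also be processed directly. A minimal sketch, using the literal string from the example:

```python
import json

# The JSON-encoded solution list returned by the family_query tool above.
content = '[{"Women": "bianca", "Child": "mary"}, {"Women": "bianca", "Child": "michael"}]'

solutions = json.loads(content)
children = [s["Child"] for s in solutions]
print(children)  # -> ['mary', 'michael']
```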
Chaining

Using a Prolog tool with an agent
To use the Prolog tool with an agent, pass it to the agent's constructor:
from langgraph.prebuilt import create_react_agent
agent_executor = create_react_agent(llm, [prolog_tool])
The agent takes the query and uses the Prolog tool if needed:
messages = agent_executor.invoke({"messages": [("human", query)]})
Then the agent receives the tool response and generates the answer:
messages["messages"][-1].pretty_print()
================================== Ai Message ==================================
John has two children: Mary and Michael, with Bianca as their mother.
API reference
See https://langchain-prolog.readthedocs.io/en/latest/modules.html for details.