
Migrating from LLMChain

LLMChain combined a prompt template, LLM, and output parser into a class.

Some advantages of switching to the LCEL implementation are clearer visibility into the chain's contents and parameters, native streaming support (demonstrated below), and easier access to the raw message output when you need it.

%pip install --upgrade --quiet langchain-openai

import os
from getpass import getpass

if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass()

Legacy
from langchain.chains import LLMChain
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

prompt = ChatPromptTemplate.from_messages(
    [("user", "Tell me a {adjective} joke")],
)

# LLMChain bundles the prompt, the model, and a default StrOutputParser
legacy_chain = LLMChain(llm=ChatOpenAI(), prompt=prompt)

legacy_result = legacy_chain({"adjective": "funny"})
legacy_result
{'adjective': 'funny',
 'text': "Why couldn't the bicycle stand up by itself?\n\nBecause it was two tired!"}

Note that LLMChain by default returned a dict containing both the input and the output from StrOutputParser, so to extract just the generated text you need to access the "text" key:

legacy_result["text"]
"Why couldn't the bicycle stand up by itself?\n\nBecause it was two tired!"

LCEL
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

prompt = ChatPromptTemplate.from_messages(
    [("user", "Tell me a {adjective} joke")],
)

# Compose prompt -> model -> output parser with LCEL's pipe operator
chain = prompt | ChatOpenAI() | StrOutputParser()

chain.invoke({"adjective": "funny"})
'Why was the math book sad?\n\nBecause it had too many problems.'
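
One of the advantages noted above is streaming: because the LCEL chain is a standard Runnable, you can stream the parsed output directly instead of wiring up callbacks. A minimal sketch, reusing the chain defined above (it assumes the OPENAI_API_KEY set up earlier):

for chunk in chain.stream({"adjective": "funny"}):
    # Each chunk is a piece of the parsed string output
    print(chunk, end="", flush=True)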

If you'd like to mimic LLMChain's packaging of the input and output into a single dict, you can use RunnablePassthrough.assign:

from langchain_core.runnables import RunnablePassthrough

# Pass the input dict through and add the chain's string output under "text"
outer_chain = RunnablePassthrough().assign(text=chain)

outer_chain.invoke({"adjective": "funny"})
{'adjective': 'funny',
 'text': 'Why did the scarecrow win an award? Because he was outstanding in his field!'}
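
If you want the raw message output rather than the parsed string, you can drop the StrOutputParser from the composition. A minimal sketch under the same setup as above:

# Without an output parser, invoking returns an AIMessage;
# the generated text is available on its .content attribute.
raw_chain = prompt | ChatOpenAI()

raw_chain.invoke({"adjective": "funny"}).content
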
Next steps

See this tutorial for more detail on building with prompt templates, LLMs, and output parsers.

Check out the LCEL conceptual docs for more background information.

