LangChain is a framework for developing applications powered by large language models (LLMs).
LangChain simplifies every stage of the LLM application lifecycle, from development through productionization and deployment.
LangChain implements a standard interface for large language models and related technologies, such as embedding models and vector stores, and integrates with hundreds of providers. See the integrations page for more.
pip install -qU "langchain[google-genai]"
import getpass
import os
if not os.environ.get("GOOGLE_API_KEY"):
    os.environ["GOOGLE_API_KEY"] = getpass.getpass("Enter API key for Google Gemini: ")
from langchain.chat_models import init_chat_model
model = init_chat_model("gemini-2.5-flash", model_provider="google_genai")
model.invoke("Hello, world!")
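The same standard interface extends to embedding models and vector stores. The sketch below is illustrative only: it assumes the langchain-google-genai package installed above, and the embedding model name may differ for your account.

from langchain_core.vectorstores import InMemoryVectorStore
from langchain_google_genai import GoogleGenerativeAIEmbeddings

# Embed a few texts into an in-memory vector store, then run a similarity search.
embeddings = GoogleGenerativeAIEmbeddings(model="models/embedding-001")  # model name is an assumption
vector_store = InMemoryVectorStore(embeddings)
vector_store.add_texts([
    "LangChain provides a standard interface for chat and embedding models.",
    "Vector stores index embeddings for similarity search.",
])
docs = vector_store.similarity_search("What does LangChain standardize?", k=1)
print(docs[0].page_content)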
Note: These docs focus on the Python LangChain library. Head here for docs on the JavaScript LangChain library.
Architecture
The LangChain framework consists of multiple open-source libraries. Read more in the Architecture page.
langchain-core: Base abstractions for chat models and other components.
Integration packages (langchain-openai, langchain-anthropic, etc.): Important integrations have been split into lightweight packages that are co-maintained by the LangChain team and the integration developers.
langchain: Chains, agents, and retrieval strategies that make up an application's cognitive architecture.
langchain-community: Third-party integrations that are community maintained.
langgraph: Orchestration framework for combining LangChain components into production-ready applications with persistence, streaming, and other key features. See LangGraph documentation.
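As a rough sketch of how these layers relate, the imports below show where common building blocks live. Module paths reflect the 0.3-era package layout and assume langchain-google-genai and langgraph are installed; this is illustrative, not an exhaustive map.

from langchain_core.messages import HumanMessage            # langchain-core: base message abstractions
from langchain_google_genai import ChatGoogleGenerativeAI   # integration package for one provider
from langchain.chat_models import init_chat_model           # langchain: provider-agnostic helpers
from langgraph.graph import StateGraph                      # langgraph: orchestration layer

# The quickstart above could equally be written against the provider class directly.
model = ChatGoogleGenerativeAI(model="gemini-2.5-flash")
print(model.invoke([HumanMessage(content="Hello, world!")]).content)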
Tutorials
If you're looking to build something specific or are more of a hands-on learner, check out our tutorials section. This is the best place to get started.
Explore the full list of LangChain tutorials here, and check out other LangGraph tutorials here. To learn more about LangGraph, check out our first LangChain Academy course, Introduction to LangGraph, available here.
How-to guides
Here you'll find short answers to "How do I…?" types of questions. These how-to guides don't cover topics in depth; you'll find that material in the Tutorials and the API Reference. However, these guides will help you quickly accomplish common tasks using chat models, vector stores, and other common LangChain components.
Check out LangGraph-specific how-tos here.
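For instance, one common task covered there is streaming a chat model's output token by token. A minimal sketch, reusing the model from the quickstart above:

# Print chunks as they arrive instead of waiting for the full response.
for chunk in model.stream("Write a one-line summary of LangChain."):
    print(chunk.content, end="", flush=True)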
Conceptual guide
Introductions to all the key parts of LangChain you'll need to know! Here you'll find high-level explanations of all LangChain concepts.
For a deeper dive into LangGraph concepts, check out this page.
Integrations
LangChain is part of a rich ecosystem of tools that integrate with our framework and build on top of it. If you're looking to get up and running quickly with chat models, vector stores, or other LangChain components from a specific provider, check out our growing list of integrations.
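Because integrations share the standard interface, switching providers is mostly a matter of installing a different package. A hedged sketch, assuming langchain-openai is installed and OPENAI_API_KEY is set:

# pip install -qU "langchain[openai]"
from langchain.chat_models import init_chat_model

# Same helper as the quickstart, pointed at a different provider and model.
openai_model = init_chat_model("gpt-4o-mini", model_provider="openai")
openai_model.invoke("Hello from a different provider!")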
API reference
Head to the reference section for full documentation of all classes and methods in the LangChain Python packages.
Ecosystem
🦜🛠️ LangSmith
Trace and evaluate your language model applications and intelligent agents to help you move from prototype to production.
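Tracing is typically switched on through environment variables before running your application. A minimal sketch, assuming a LangSmith account and that these variable names match your LangSmith setup:

import getpass
import os

# With tracing enabled, subsequent model and chain invocations are logged to LangSmith.
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["LANGSMITH_API_KEY"] = getpass.getpass("Enter LangSmith API key: ")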
🦜🕸️ LangGraph
Build stateful, multi-actor applications with LLMs. Integrates smoothly with LangChain, but can be used without it. LangGraph powers production-grade agents, trusted by LinkedIn, Uber, Klarna, GitLab, and many more.
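A minimal LangGraph sketch of a single-node, stateful chat graph, reusing the chat model from the quickstart above (assumes langgraph is installed):

from langgraph.graph import END, START, MessagesState, StateGraph

def call_model(state: MessagesState):
    # Append the model's reply to the running message state.
    return {"messages": [model.invoke(state["messages"])]}

builder = StateGraph(MessagesState)
builder.add_node("model", call_model)
builder.add_edge(START, "model")
builder.add_edge("model", END)
graph = builder.compile()

result = graph.invoke({"messages": [("user", "Hello!")]})
print(result["messages"][-1].content)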
Additional resources
Versions
See what changed in v0.3, learn how to migrate legacy code, read up on our versioning policies, and more.
Security
Read up on security best practices to make sure you're developing safely with LangChain.
Contributing
Check out the developer's guide for guidelines on contributing and help getting your dev environment set up.