This will help you get started with the Outlines LLM. For detailed documentation of all Outlines features and configurations, head to the API reference.
Outlines is a library for constrained language generation. It allows you to use large language models (LLMs) with various backends while applying constraints to the generated output.
Overview
Setup
To access Outlines models, you'll need an internet connection to download the model weights from Hugging Face. Depending on the backend, you will also need to install the required dependencies (see the Outlines docs).
Credentials
There is no built-in authentication mechanism for Outlines.
Installation
The LangChain Outlines integration lives in the langchain-community package and requires the outlines library:
%pip install -qU langchain-community outlines
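Depending on the backend you plan to use, you may also need its runtime library. A sketch, assuming the llamacpp backend (other backends need vllm, mlx-lm, or transformers instead):
%pip install -qU llama-cpp-python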
Instantiation
Now we can instantiate our model object and generate completions:
from langchain_community.llms import Outlines

# Choose a backend; uncomment the one you want to use:
# model = Outlines(model="microsoft/Phi-3-mini-4k-instruct", backend="llamacpp")
# model = Outlines(model="microsoft/Phi-3-mini-4k-instruct", backend="vllm")
# model = Outlines(model="microsoft/Phi-3-mini-4k-instruct", backend="mlxlm")

# Or use the default Hugging Face transformers backend:
model = Outlines(model="microsoft/Phi-3-mini-4k-instruct")
Invocation
model.invoke("Hello how are you?")
Chaining
from langchain_core.prompts import PromptTemplate
prompt = PromptTemplate.from_template("How to say {input} in {output_language}:\n")
chain = prompt | model
chain.invoke(
    {
        "output_language": "German",
        "input": "I love programming.",
    }
)
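Because the chain is a standard LangChain Runnable, you can also run it over several inputs at once with batch. A minimal sketch with made-up inputs:
chain.batch(
    [
        {"output_language": "German", "input": "I love programming."},
        {"output_language": "French", "input": "Good morning."},
    ]
)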
Streaming
Outlines supports streaming of tokens:
for chunk in model.stream("Count to 10 in French:"):
    print(chunk, end="", flush=True)
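Streaming also works through the standard asynchronous Runnable interface. A minimal sketch using astream:
import asyncio

async def count_in_french():
    # astream yields chunks asynchronously, mirroring stream above.
    async for chunk in model.astream("Count to 10 in French:"):
        print(chunk, end="", flush=True)

asyncio.run(count_in_french())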
Constrained Generation
Outlines allows you to apply various constraints to the generated output:
Regex Constraint
Generated output can be constrained to match a regular expression, for example an IPv4 address pattern:
model.regex = r"((25[0-5]|2[0-4]\d|[01]?\d\d?)\.){3}(25[0-5]|2[0-4]\d|[01]?\d\d?)"
response = model.invoke("What is the IP address of Google's DNS server?")
response
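As an optional sanity check (not part of the integration itself), the constrained output can be matched against the same pattern:
import re

# The response should match the regex set above.
assert re.fullmatch(model.regex, response)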
Type Constraints
The output can be constrained to a Python type, such as an integer:
model.type_constraints = int
response = model.invoke("What is the answer to life, the universe, and everything?")
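The model still returns a string, so parse it if you need a numeric value. A small sketch:
# The output is constrained to an integer, so int() parsing should succeed.
answer = int(response)
answer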
JSON Schema
The output can be constrained to a JSON schema defined by a Pydantic model:
from pydantic import BaseModel

class Person(BaseModel):
    name: str

model.json_schema = Person
response = model.invoke("Who is the author of LangChain?")
person = Person.model_validate_json(response)
person
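The parsed result is an ordinary Pydantic object, so its fields are available as typed attributes:
person.name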
Grammar Constraint
The output can be constrained to a context-free grammar in EBNF (Lark) format:
model.grammar = """
    ?start: expression
    ?expression: term (("+" | "-") term)*
    ?term: factor (("*" | "/") factor)*
    ?factor: NUMBER | "-" factor | "(" expression ")"
    %import common.NUMBER
    %import common.WS
    %ignore WS
"""
response = model.invoke("Give me a complex arithmetic expression:")
response
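Optionally, you can check that the generated expression parses under the same grammar using the lark package (assuming lark is installed; Outlines relies on it for grammar-constrained generation):
from lark import Lark

# Build a parser from the same grammar and parse the model output.
parser = Lark(model.grammar, start="start")
tree = parser.parse(response)
print(tree.pretty())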
API reference
For detailed documentation of all Outlines features and configurations, head to the API reference: https://python.langchain.com/api_reference/community/llms/langchain_community.llms.outlines.Outlines.html
Outlines documentation: https://dottxt-ai.github.io/outlines/latest/