smolagents is an open-source Python library designed to make it extremely easy to build and run agents using just a few lines of code.

Key features of smolagents include:
✨ Simplicity: the logic for agents fits in about a thousand lines of code. We kept abstractions to their minimal shape above raw code!
🧑‍💻 First-class support for Code Agents: CodeAgent writes its actions in code (as opposed to "agents being used to write code") to invoke tools or perform computations, enabling natural composability (function nesting, loops, conditionals). To make this secure, we support executing in a sandboxed environment via E2B or Docker.
📡 Common Tool-Calling Agent Support: in addition to CodeAgent, ToolCallingAgent supports the usual JSON/text-based tool-calling for scenarios where that paradigm is preferred (see the sketch after this list).
🤗 Hub integrations: Seamlessly share and load agents and tools to/from the Hub as Gradio Spaces.
🌐 Model-agnostic: Easily integrate any large language model (LLM), whether it’s hosted on the Hub via Inference providers, accessed via APIs such as OpenAI, Anthropic, or many others via LiteLLM integration, or run locally using Transformers or Ollama. Powering an agent with your preferred LLM is straightforward and flexible.
👁️ Modality-agnostic: Beyond text, agents can handle vision, video, and audio inputs, broadening the range of possible applications. Check out this tutorial for vision.
🛠️ Tool-agnostic: you can use tools from any MCP server or from LangChain, and you can even use a Hub Space as a tool.
💻 CLI Tools: Comes with command-line utilities (smolagent, webagent) for quickly running agents without writing boilerplate code.
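As a minimal sketch of the tool-calling paradigm mentioned above, a ToolCallingAgent is constructed the same way as a CodeAgent; the search tool and the question below are only illustrative:

from smolagents import ToolCallingAgent, InferenceClientModel, DuckDuckGoSearchTool

model = InferenceClientModel()
# The agent emits JSON tool calls instead of writing code actions
agent = ToolCallingAgent(tools=[DuckDuckGoSearchTool()], model=model)
result = agent.run("Search for the tallest building in the world and report its height.")
print(result)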
Quickstart

Get started with smolagents in just a few minutes! This guide will show you how to create and run your first agent.
Installation

Install smolagents with pip:

pip install smolagents[toolkit]

Create Your First Agent
Here’s a minimal example to create and run an agent:
from smolagents import CodeAgent, InferenceClientModel

model = InferenceClientModel()
agent = CodeAgent(tools=[], model=model)
result = agent.run("Calculate the sum of numbers from 1 to 10")
print(result)
That’s it! Your agent will use Python code to solve the task and return the result.
Adding Tools

Let's make our agent more capable by adding some tools:
from smolagents import CodeAgent, InferenceClientModel, DuckDuckGoSearchTool

model = InferenceClientModel()
agent = CodeAgent(
    tools=[DuckDuckGoSearchTool()],
    model=model,
)
result = agent.run("What is the current weather in Paris?")
print(result)
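You can also turn your own Python functions into tools with the tool decorator; the function needs type hints and a docstring describing its arguments. The temperature-conversion tool below is a hypothetical example:

from smolagents import CodeAgent, InferenceClientModel, tool

@tool
def to_fahrenheit(celsius: float) -> float:
    """Convert a temperature from Celsius to Fahrenheit.

    Args:
        celsius: Temperature in degrees Celsius.
    """
    return celsius * 9 / 5 + 32

model = InferenceClientModel()
# The decorated function can be passed to the agent like any built-in tool
agent = CodeAgent(tools=[to_fahrenheit], model=model)
print(agent.run("Convert 21 degrees Celsius to Fahrenheit."))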
Using Different Models

You can use various models with your agent:
# Use a specific model hosted on the Hub via Inference providers
model = InferenceClientModel(model_id="meta-llama/Llama-2-70b-chat-hf")

# Or use an API-based model through the LiteLLM integration
from smolagents import LiteLLMModel
model = LiteLLMModel(model_id="gpt-4")

# Or run a model locally with Transformers
from smolagents import TransformersModel
model = TransformersModel(model_id="meta-llama/Llama-2-7b-chat-hf")

Next Steps