Environment Variables

Goose supports various environment variables that allow you to customize its behavior. This guide provides a comprehensive list of available environment variables grouped by their functionality.

Model Configuration

These variables control the language models and their behavior.

Basic Provider Configuration

These are the minimum required variables to get started with Goose.

| Variable | Purpose | Values | Default |
| --- | --- | --- | --- |
| GOOSE_PROVIDER | Specifies the LLM provider to use | See available providers | None (must be configured) |
| GOOSE_MODEL | Specifies which model to use from the provider | Model name (e.g., "gpt-4", "claude-3.5-sonnet") | None (must be configured) |
| GOOSE_TEMPERATURE | Sets the temperature for model responses | Float between 0.0 and 1.0 | Model-specific default |

Examples

# Basic model configuration
export GOOSE_PROVIDER="anthropic"
export GOOSE_MODEL="claude-3.5-sonnet"
export GOOSE_TEMPERATURE=0.7
Advanced Provider Configuration

These variables are needed when using custom endpoints, enterprise deployments, or specific provider implementations.

| Variable | Purpose | Values | Default |
| --- | --- | --- | --- |
| GOOSE_PROVIDER__TYPE | The specific type/implementation of the provider | See available providers | Derived from GOOSE_PROVIDER |
| GOOSE_PROVIDER__HOST | Custom API endpoint for the provider | URL (e.g., "https://api.openai.com") | Provider-specific default |
| GOOSE_PROVIDER__API_KEY | Authentication key for the provider | API key string | None |

Examples

# Advanced provider configuration
export GOOSE_PROVIDER__TYPE="anthropic"
export GOOSE_PROVIDER__HOST="https://api.anthropic.com"
export GOOSE_PROVIDER__API_KEY="your-api-key-here"
Lead/Worker Model Configuration

These variables configure a lead/worker model pattern where a powerful lead model handles initial planning and complex reasoning, then switches to a faster/cheaper worker model for execution. The switch happens automatically based on your settings.

| Variable | Purpose | Values | Default |
| --- | --- | --- | --- |
| GOOSE_LEAD_MODEL | Required to enable lead mode. Name of the lead model | Model name (e.g., "gpt-4o", "claude-3.5-sonnet") | None |
| GOOSE_LEAD_PROVIDER | Provider for the lead model | See available providers | Falls back to GOOSE_PROVIDER |
| GOOSE_LEAD_TURNS | Number of initial turns using the lead model before switching to the worker model | Integer | 3 |
| GOOSE_LEAD_FAILURE_THRESHOLD | Consecutive failures before fallback to the lead model | Integer | 2 |
| GOOSE_LEAD_FALLBACK_TURNS | Number of turns to use the lead model in fallback mode | Integer | 2 |

A turn is one complete prompt-response interaction. With the default settings:

- Turns 1-3 use the lead model for planning and complex reasoning.
- From turn 4 onward, Goose switches to the worker model for execution.
- If the worker model fails 2 consecutive turns, Goose falls back to the lead model for 2 turns, then returns to the worker model.

The lead model and worker model names are displayed at the start of the Goose CLI session. If you don't export a GOOSE_MODEL for your session, the worker model defaults to the GOOSE_MODEL in your configuration file.

Examples

# Basic lead/worker setup
export GOOSE_LEAD_MODEL="o4"

# Advanced lead/worker configuration
export GOOSE_LEAD_MODEL="claude4-opus"
export GOOSE_LEAD_PROVIDER="anthropic"
export GOOSE_LEAD_TURNS=5
export GOOSE_LEAD_FAILURE_THRESHOLD=3
export GOOSE_LEAD_FALLBACK_TURNS=2
Planning Mode Configuration

These variables control Goose's planning functionality.

| Variable | Purpose | Values | Default |
| --- | --- | --- | --- |
| GOOSE_PLANNER_PROVIDER | Specifies which provider to use for planning mode | See available providers | Falls back to GOOSE_PROVIDER |
| GOOSE_PLANNER_MODEL | Specifies which model to use for planning mode | Model name (e.g., "gpt-4", "claude-3.5-sonnet") | Falls back to GOOSE_MODEL |

Examples

# Planning mode with different model
export GOOSE_PLANNER_PROVIDER="openai"
export GOOSE_PLANNER_MODEL="gpt-4"
Session Management

These variables control how Goose manages conversation sessions and context.

| Variable | Purpose | Values | Default |
| --- | --- | --- | --- |
| GOOSE_CONTEXT_STRATEGY | Controls how Goose handles situations where the context limit is exceeded | "summarize", "truncate", "clear", "prompt" | "prompt" (interactive), "summarize" (headless) |
| GOOSE_MAX_TURNS | Maximum number of turns allowed without user input | Integer (e.g., 10, 50, 100) | 1000 |
| CONTEXT_FILE_NAMES | Specifies custom filenames for hint/context files | JSON array of strings (e.g., ["CLAUDE.md", ".goosehints"]) | [".goosehints"] |
| GOOSE_CLI_THEME | Theme for CLI response markdown | "light", "dark", "ansi" | "dark" |
| GOOSE_SCHEDULER_TYPE | Controls which scheduler Goose uses for scheduled recipes | "legacy" or "temporal" | "legacy" (Goose's built-in cron scheduler) |
| GOOSE_TEMPORAL_BIN | Optional custom path to your Temporal binary | /path/to/temporal-service | None |
| GOOSE_RANDOM_THINKING_MESSAGES | Controls whether to show amusing random messages during processing | "true", "false" | "true" |
| GOOSE_CLI_SHOW_COST | Toggles display of model cost estimates in CLI output | "true", "1" (case insensitive) to enable | false |

Examples

# Automatically summarize when context limit is reached
export GOOSE_CONTEXT_STRATEGY=summarize

# Always prompt user to choose (default for interactive mode)
export GOOSE_CONTEXT_STRATEGY=prompt

# Set a low limit for step-by-step control
export GOOSE_MAX_TURNS=5

# Set a moderate limit for controlled automation
export GOOSE_MAX_TURNS=25

# Set a reasonable limit for production
export GOOSE_MAX_TURNS=100

# Use multiple context files
export CONTEXT_FILE_NAMES='["CLAUDE.md", ".goosehints", "project_rules.txt"]'

# Set the ANSI theme for the session
export GOOSE_CLI_THEME=ansi

# Use Temporal for scheduled recipes
export GOOSE_SCHEDULER_TYPE=temporal

# Custom Temporal binary (optional)
export GOOSE_TEMPORAL_BIN=/path/to/temporal-service

# Disable random thinking messages for less distraction
export GOOSE_RANDOM_THINKING_MESSAGES=false

# Enable model cost display in CLI
export GOOSE_CLI_SHOW_COST=true
Model Context Limit Overrides

These variables allow you to override the default context window size (token limit) for your models. This is particularly useful when using LiteLLM proxies or custom models that don't match Goose's predefined model patterns.

| Variable | Purpose | Values | Default |
| --- | --- | --- | --- |
| GOOSE_CONTEXT_LIMIT | Override context limit for the main model | Integer (number of tokens) | Model-specific default or 128,000 |
| GOOSE_LEAD_CONTEXT_LIMIT | Override context limit for the lead model in lead/worker mode | Integer (number of tokens) | Falls back to GOOSE_CONTEXT_LIMIT or model default |
| GOOSE_WORKER_CONTEXT_LIMIT | Override context limit for the worker model in lead/worker mode | Integer (number of tokens) | Falls back to GOOSE_CONTEXT_LIMIT or model default |
| GOOSE_PLANNER_CONTEXT_LIMIT | Override context limit for the planner model | Integer (number of tokens) | Falls back to GOOSE_CONTEXT_LIMIT or model default |

Examples

# Set context limit for main model (useful for LiteLLM proxies)
export GOOSE_CONTEXT_LIMIT=200000

# Set different context limits for lead/worker models
export GOOSE_LEAD_CONTEXT_LIMIT=500000 # Large context for planning
export GOOSE_WORKER_CONTEXT_LIMIT=128000 # Smaller context for execution

# Set context limit for planner
export GOOSE_PLANNER_CONTEXT_LIMIT=1000000

For more details and examples, see Model Context Limit Overrides.

Tool Configuration

These variables control how Goose handles tool execution and tool management.

| Variable | Purpose | Values | Default |
| --- | --- | --- | --- |
| GOOSE_MODE | Controls how Goose handles tool execution | "auto", "approve", "chat", "smart_approve" | "smart_approve" |
| GOOSE_TOOLSHIM | Enables/disables tool call interpretation | "1", "true" (case insensitive) to enable | false |
| GOOSE_TOOLSHIM_OLLAMA_MODEL | Specifies the model for tool call interpretation | Model name (e.g., llama3.2, qwen2.5) | System default |
| GOOSE_CLI_MIN_PRIORITY | Controls verbosity of tool output | Float between 0.0 and 1.0 | 0.0 |
| GOOSE_CLI_TOOL_PARAMS_TRUNCATION_MAX_LENGTH | Maximum length for tool parameter values before truncation in CLI output (not in debug mode) | Integer | 40 |

Examples

# Enable tool interpretation
export GOOSE_TOOLSHIM=true
export GOOSE_TOOLSHIM_OLLAMA_MODEL=llama3.2
export GOOSE_MODE="auto"
export GOOSE_CLI_MIN_PRIORITY=0.2 # Show only medium and high importance output
export GOOSE_CLI_TOOL_PARAMS_TRUNCATION_MAX_LENGTH=100 # Show up to 100 characters for tool parameters in CLI output
Enhanced Code Editing

These variables configure AI-powered code editing for the Developer extension's str_replace tool. All three variables must be set and non-empty for the feature to activate.

| Variable | Purpose | Values | Default |
| --- | --- | --- | --- |
| GOOSE_EDITOR_API_KEY | API key for the code editing model | API key string | None |
| GOOSE_EDITOR_HOST | API endpoint for the code editing model | URL (e.g., "https://api.openai.com/v1") | None |
| GOOSE_EDITOR_MODEL | Model to use for code editing | Model name (e.g., "gpt-4o", "claude-3-5-sonnet") | None |

Examples

This feature works with any OpenAI-compatible API endpoint, for example:

# OpenAI configuration
export GOOSE_EDITOR_API_KEY="sk-..."
export GOOSE_EDITOR_HOST="https://api.openai.com/v1"
export GOOSE_EDITOR_MODEL="gpt-4o"

# Anthropic configuration (via OpenAI-compatible proxy)
export GOOSE_EDITOR_API_KEY="sk-ant-..."
export GOOSE_EDITOR_HOST="https://api.anthropic.com/v1"
export GOOSE_EDITOR_MODEL="claude-3-5-sonnet-20241022"

# Local model configuration
export GOOSE_EDITOR_API_KEY="your-key"
export GOOSE_EDITOR_HOST="http://localhost:8000/v1"
export GOOSE_EDITOR_MODEL="your-model"
Security Configuration

These variables control security-related features.

| Variable | Purpose | Values | Default |
| --- | --- | --- | --- |
| GOOSE_ALLOWLIST | Controls which extensions can be loaded | URL for allowed extensions list | Unset |
| GOOSE_DISABLE_KEYRING | Disables the system keyring for secret storage | Set to any value (e.g., "1", "true", "yes") to disable; the actual value doesn't matter, only whether the variable is set | Unset (keyring enabled) |

Tip: When the keyring is disabled, secrets are stored in a plain-text file in Goose's configuration directory instead of the system keyring.

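Examples

A minimal sketch of the security settings; the allowlist URL below is a placeholder, so point it at wherever your organization actually hosts its list of allowed extensions.

# Restrict which extensions can be loaded (placeholder URL)
export GOOSE_ALLOWLIST="https://example.com/goose-allowlist"

# Disable the system keyring; any value works, only presence matters
export GOOSE_DISABLE_KEYRING=1
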
Langfuse Integration

These variables configure the Langfuse integration for observability.

| Variable | Purpose | Values | Default |
| --- | --- | --- | --- |
| LANGFUSE_PUBLIC_KEY | Public key for Langfuse integration | String | None |
| LANGFUSE_SECRET_KEY | Secret key for Langfuse integration | String | None |
| LANGFUSE_URL | Custom URL for the Langfuse service | URL string | Default Langfuse URL |
| LANGFUSE_INIT_PROJECT_PUBLIC_KEY | Alternative public key for Langfuse | String | None |
| LANGFUSE_INIT_PROJECT_SECRET_KEY | Alternative secret key for Langfuse | String | None |
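
Examples

A minimal sketch of the Langfuse settings; the keys and URL below are placeholders, so substitute the credentials and endpoint from your own Langfuse project.

# Langfuse observability (placeholder credentials)
export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_SECRET_KEY="sk-lf-..."

# Optional: point at a self-hosted Langfuse instance
export LANGFUSE_URL="https://langfuse.example.com"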

Experimental Features

These variables enable experimental features that are in active development; they may change or be removed in future releases. Use them with caution in production environments.

| Variable | Purpose | Values | Default |
| --- | --- | --- | --- |
| ALPHA_FEATURES | Enables experimental alpha features like subagents | "true", "1" (case insensitive) to enable | false |

Examples

# Enable alpha features
export ALPHA_FEATURES=true

# Or enable for a single session
ALPHA_FEATURES=true goose session