oxylabs/oxylabs-mcp: Official Oxylabs MCP integration

The missing link between AI models and the real‑world web: one API that delivers clean, structured data from any site.

The Oxylabs MCP server provides a bridge between AI models and the web. It enables them to scrape any URL, render JavaScript-heavy pages, extract and format content for AI use, bypass anti-scraping measures, and access geo-restricted web data from 195+ countries.

This implementation leverages the Model Context Protocol (MCP) to create a secure, standardized way for AI assistants to interact with web content.

Why Oxylabs MCP?  🕸️ ➜ 📦 ➜ 🤖

Imagine telling your LLM "Summarise the latest Hacker News discussion about GPT‑7" – and it simply answers.
MCP (Model Context Protocol) makes that happen by doing the boring parts for you:

What Oxylabs MCP does | Why it matters to you
Bypasses anti‑bot walls with the Oxylabs global proxy network | Keeps you unblocked and anonymous
Renders JavaScript in headless Chrome | Single‑page apps, sorted
Cleans HTML → JSON | Drop straight into vector DBs or prompts
Optional structured parsers (Google, Amazon, etc.) | One‑line access to popular targets

- Scrape content from any site
- Automatically get AI-ready data
- Bypass blocks & geo-restrictions
- Flexible setup & cross-platform support
- Built-in error handling and request management

Oxylabs MCP provides two sets of tools that can be used together or independently:

Oxylabs Web Scraper API Tools
  1. universal_scraper: Uses Oxylabs Web Scraper API for general website scraping.
  2. google_search_scraper: Uses Oxylabs Web Scraper API to extract results from Google Search.
  3. amazon_search_scraper: Uses Oxylabs Web Scraper API to scrape Amazon search result pages.
  4. amazon_product_scraper: Uses Oxylabs Web Scraper API to extract data from individual Amazon product pages.
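
As a rough sketch (not an excerpt from this repository), an MCP tools/call request for universal_scraper could look like the following; the url value is reused from the notification example later in this README, and your MCP client normally builds this envelope for you.

{
  "method": "tools/call",
  "params": {
    "name": "universal_scraper",
    "arguments": {
      "url": "https://ip.oxylabs.io"
    }
  }
}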

The Oxylabs AI Studio MCP server provides various AI tools for your agents:

  1. ai_scraper: Scrape content from any URL in JSON or Markdown format with AI-powered data extraction.
  2. ai_crawler: Based on a prompt, crawls a website and collects data in Markdown or JSON format across multiple pages.
  3. ai_browser_agent: Given a task, the agent controls a browser to achieve the given objective and returns data in Markdown, JSON, HTML, or screenshot formats.
  4. ai_search: Search the web for URLs and their contents with AI-powered content extraction.
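
For illustration only, an ai_scraper call might take arguments along these lines; the output_format name below is hypothetical (this README only states that the tool returns JSON or Markdown), so check the tool schema exposed by the server for the actual argument names.

{
  "method": "tools/call",
  "params": {
    "name": "ai_scraper",
    "arguments": {
      "url": "https://ip.oxylabs.io",
      "output_format": "markdown"
    }
  }
}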

When you've set up the MCP server with Claude, you can make requests like "Summarise the latest Hacker News discussion about GPT‑7" or ask it to pull data from a specific URL; the assistant calls the appropriate scraping tool behind the scenes.

Before you begin, make sure you have Oxylabs Web Scraper API credentials (username and password) and/or an Oxylabs AI Studio API key; at least one set of credentials is required.

You can install the server automatically via the Smithery CLI or run it locally with uv/uvx, as described in the setup instructions below.
The Oxylabs MCP Universal Scraper accepts these parameters:

Parameter | Description | Values
url | The URL to scrape | Any valid URL
render | Use headless browser rendering | html or None
geo_location | Sets the proxy's geo location to retrieve data | Brasil, Canada, etc.
user_agent_type | Device type and browser | desktop, tablet, etc.
output_format | The format of the output | links, md, html
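
As a sketch, the arguments for a universal_scraper call combining these parameters might look like this (the values are examples taken from the table above):

{
  "url": "https://ip.oxylabs.io",
  "render": "html",
  "geo_location": "Canada",
  "user_agent_type": "desktop",
  "output_format": "md"
}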

Smithery
  1. Go to https://smithery.ai/server/@oxylabs/oxylabs-mcp
  2. Login with GitHub
  3. Find the Install section
  4. Follow the instructions to generate the config

Auto install with Smithery CLI

# example for Claude Desktop
npx -y @smithery/cli@latest install @oxylabs/oxylabs-mcp --client claude --key <smithery_key>
uvx
  1. Install uv
# macOS and Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# Windows
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
  2. Use the following config
{
  "mcpServers": {
    "oxylabs": {
      "command": "uvx",
      "args": ["oxylabs-mcp"],
      "env": {
        "OXYLABS_USERNAME": "OXYLABS_USERNAME",
        "OXYLABS_PASSWORD": "OXYLABS_PASSWORD",
        "OXYLABS_AI_STUDIO_API_KEY": "OXYLABS_AI_STUDIO_API_KEY"
      }
    }
  }
}
uv
  1. Install uv
# macOS and Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# Windows
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
  2. Use the following config
{
  "mcpServers": {
    "oxylabs": {
      "command": "uv",
      "args": [
        "--directory",
        "/<Absolute-path-to-folder>/oxylabs-mcp",
        "run",
        "oxylabs-mcp"
      ],
      "env": {
        "OXYLABS_USERNAME": "OXYLABS_USERNAME",
        "OXYLABS_PASSWORD": "OXYLABS_PASSWORD",
        "OXYLABS_AI_STUDIO_API_KEY": "OXYLABS_AI_STUDIO_API_KEY"
      }
    }
  }
}
Manual Setup with Claude Desktop

Navigate to Claude → Settings → Developer → Edit Config and add one of the configurations above to the claude_desktop_config.json file.

Manual Setup with Cursor AI

Navigate to Cursor → Settings → Cursor Settings → MCP. Click Add new global MCP server and add one of the configurations above.

The Oxylabs MCP server supports the following environment variables:

Name | Description | Default
OXYLABS_USERNAME | Your Oxylabs Web Scraper API username | -
OXYLABS_PASSWORD | Your Oxylabs Web Scraper API password | -
OXYLABS_AI_STUDIO_API_KEY | Your Oxylabs AI Studio API key | -
LOG_LEVEL | Log level for the logs returned to the client | INFO

Note: At least one set of credentials (Web Scraper API or AI Studio) is required to use the MCP server.

The Oxylabs MCP server supports two independent services:

  1. Oxylabs Web Scraper API, enabled when OXYLABS_USERNAME and OXYLABS_PASSWORD are set.
  2. Oxylabs AI Studio, enabled when OXYLABS_AI_STUDIO_API_KEY is set.

You can use either service independently or both together. The server will automatically detect which credentials are available and enable the corresponding tools.
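
For example, a config that supplies only the AI Studio key would enable only the AI Studio tools; this is a sketch derived from the uvx config above, not a separate documented mode:

{
  "mcpServers": {
    "oxylabs": {
      "command": "uvx",
      "args": ["oxylabs-mcp"],
      "env": {
        "OXYLABS_AI_STUDIO_API_KEY": "OXYLABS_AI_STUDIO_API_KEY"
      }
    }
  }
}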

The server provides additional information about tool calls in notifications/message events:

{
  "method": "notifications/message",
  "params": {
    "level": "info",
    "data": "Create job with params: {\"url\": \"https://ip.oxylabs.io\"}"
  }
}
{
  "method": "notifications/message",
  "params": {
    "level": "info",
    "data": "Job info: job_id=7333113830223918081 job_status=done"
  }
}
{
  "method": "notifications/message",
  "params": {
    "level": "error",
    "data": "Error: request to Oxylabs API failed"
  }
}

Distributed under the MIT License – see LICENSE for details.

Established in 2015, Oxylabs is a market-leading web intelligence collection platform, driven by the highest business, ethics, and compliance standards, enabling companies worldwide to unlock data-driven insights.

Made with ☕ by Oxylabs. Feel free to give us a ⭐ if MCP saved you a weekend.
