Context Portal MCP (ConPort)

A database-backed Model Context Protocol (MCP) server for managing structured project context, designed to be used by AI assistants and developer tools within IDEs and other interfaces.


What is the Context Portal MCP server (ConPort)?

Context Portal (ConPort) is your project's memory bank. It's a tool that helps AI assistants understand your specific software project better by storing important information like decisions, tasks, and architectural patterns in a structured way. Think of it as building a project-specific knowledge base that the AI can easily access and use to give you more accurate and helpful responses.

What it does:

ConPort provides a robust and structured way for AI assistants to store, retrieve, and manage various types of project context. It effectively builds a project-specific knowledge graph, capturing entities like decisions, progress, and architecture, along with their relationships. This structured knowledge base, enhanced by vector embeddings for semantic search, then serves as a powerful backend for Retrieval Augmented Generation (RAG), enabling AI assistants to access precise, up-to-date information for more context-aware and accurate responses.

It replaces older file-based context management systems by offering a more reliable and queryable database backend (SQLite per workspace). ConPort is designed to be a generic context backend, compatible with various IDEs and client interfaces that support MCP.
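
Because each workspace's context lives in a plain SQLite file (created as context_portal/context.db, as described later in this README), you can inspect it with nothing but Python's standard library. The sketch below is read-only and assumes nothing about ConPort's schema beyond the database path; run it from the workspace root after ConPort has created the database:

# Minimal read-only peek at the per-workspace ConPort database.
# Table names depend on the ConPort version; this only lists them.
import sqlite3

db_path = "context_portal/context.db"  # relative to your project workspace
conn = sqlite3.connect(f"file:{db_path}?mode=ro", uri=True)
tables = conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
).fetchall()
print([name for (name,) in tables])
conn.close()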

Key features include structured storage of decisions, progress, and architectural patterns; a project-specific knowledge graph that captures the relationships between them; semantic search backed by vector embeddings; and a per-workspace SQLite database that serves as a RAG backend for AI assistants.

Before you begin, ensure you have Python and the uv package manager installed; the uvx and uv pip commands used below are provided by uv.

Installation and Configuration (Recommended)

The recommended way to install and run ConPort is by using uvx to execute the package directly from PyPI. This method avoids the need to manually create and manage virtual environments.

In your MCP client settings (e.g., mcp_settings.json), use the following configuration:

{
  "mcpServers": {
    "conport": {
      "command": "uvx",
      "args": [
        "--from",
        "context-portal-mcp",
        "conport-mcp",
        "--mode",
        "stdio",
        "--workspace_id",
        "${workspaceFolder}",
        "--log-file",
        "./logs/conport.log",
        "--log-level",
        "INFO"
      ]
    }
  }
}
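
The same launch configuration can be exercised outside an IDE. Here is a minimal sketch that assumes the official MCP Python SDK (the mcp package on PyPI) as the client; the workspace path is a placeholder for whatever ${workspaceFolder} resolves to in your setup:

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

WORKSPACE = "/abs/path/to/your/project"  # placeholder for ${workspaceFolder}

# Mirror the "conport" entry from the mcp_settings.json example above.
server_params = StdioServerParameters(
    command="uvx",
    args=[
        "--from", "context-portal-mcp", "conport-mcp",
        "--mode", "stdio",
        "--workspace_id", WORKSPACE,
        "--log-file", "./logs/conport.log",
        "--log-level", "INFO",
    ],
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())

If the connection succeeds, the printed names are the ConPort tools your IDE's MCP client will see.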

Installation for Developers (from Git Repository)

These instructions guide you through setting up ConPort for development or contribution by cloning its Git repository and installing dependencies.

  1. Clone the Repository: Open your terminal or command prompt and run:

    git clone https://github.com/GreatScottyMac/context-portal.git
    cd context-portal
  2. Create and Activate a Virtual Environment: In the context-portal directory:

    uv venv

    Activate the environment:

    source .venv/bin/activate   (Linux/macOS)
    .venv\Scripts\activate      (Windows)

  3. Install Dependencies: With your virtual environment activated:

    uv pip install -r requirements.txt
  4. Verify Installation (Optional): Ensure your virtual environment is activated.

    uv run python src/context_portal_mcp/main.py --help

    This should output the command-line help for the ConPort server.
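
If you installed from source, an MCP client can launch your checkout instead of the PyPI package. This is a variant of the client sketch shown after the configuration section; only the server parameters change (run from the context-portal directory, with the workspace path as a placeholder):

from mcp import StdioServerParameters

# Launch ConPort from the source checkout via uv instead of uvx/PyPI.
dev_server_params = StdioServerParameters(
    command="uv",
    args=[
        "run", "python", "src/context_portal_mcp/main.py",
        "--mode", "stdio",
        "--workspace_id", "/abs/path/to/your/project",
        # plus --log-file / --log-level as in the configuration shown earlier
    ],
)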

Purpose of the --workspace_id Command-Line Argument:

When you launch the ConPort server, particularly in STDIO mode (--mode stdio), the --workspace_id argument serves several key purposes:

  1. Initial Server Context: It provides the server process with the absolute path to the project workspace it should initially be associated with.
  2. Critical Safety Check: In STDIO mode, this path is used to perform a vital check that prevents the server from mistakenly creating its database files (context.db, conport_vector_data/) inside its own installation directory. This protects against misconfigurations where the client might not correctly provide the workspace path.
  3. Client Launch Signal: It's the standard way for an MCP client (like an IDE extension) to signal to the server which project it is launching for.

Important Note: The --workspace_id provided at server startup is not automatically used as the workspace_id parameter for every subsequent MCP tool call. ConPort tools are designed to require the workspace_id parameter explicitly in each call (e.g., get_product_context({"workspace_id": "..."})). This design supports the possibility of a single server instance managing multiple workspaces and ensures clarity for each operation. Your client IDE/MCP client is responsible for providing the correct workspace_id with each tool call.

Key Takeaway: ConPort critically relies on an accurate --workspace_id to identify the target project. Ensure this argument correctly resolves to the absolute path of your project workspace, either through IDE variables like ${workspaceFolder} or by providing a direct absolute path.
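
To make the per-call requirement concrete, here is a minimal sketch, again assuming the MCP Python SDK as the client. conport_call is a hypothetical helper (not part of ConPort) that injects the same workspace_id into every call; get_product_context is the tool named in the note above:

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

WORKSPACE = "/abs/path/to/your/project"

# --log-file / --log-level omitted here; see the configuration section.
server_params = StdioServerParameters(
    command="uvx",
    args=[
        "--from", "context-portal-mcp", "conport-mcp",
        "--mode", "stdio",
        "--workspace_id", WORKSPACE,
    ],
)

async def conport_call(session: ClientSession, tool: str, **arguments):
    # ConPort expects workspace_id explicitly on every tool call,
    # regardless of the --workspace_id the server was started with.
    return await session.call_tool(tool, arguments={"workspace_id": WORKSPACE, **arguments})

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await conport_call(session, "get_product_context")
            print(result.content)

asyncio.run(main())

An IDE's MCP client performs the equivalent of this helper for you; the point is only that the server never infers the workspace from its startup argument.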

For pre-upgrade cleanup, including clearing Python bytecode cache, please refer to the v0.2.4_UPDATE_GUIDE.md.
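
The update guide is the authoritative reference; as one example of the bytecode cleanup it describes, the following sketch removes __pycache__ directories under the current tree:

# Remove compiled-bytecode caches (__pycache__) under the current directory.
import pathlib
import shutil

for cache_dir in pathlib.Path(".").rglob("__pycache__"):
    shutil.rmtree(cache_dir, ignore_errors=True)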

Usage with LLM Agents (Custom Instructions)

ConPort's effectiveness with LLM agents is significantly enhanced by providing specific custom instructions or system prompts to the LLM. This repository includes strategy files tailored to different agent environments (for example, roo_code_conport_strategy, referenced below).

How to Use These Strategy Files:

  1. Identify the strategy file relevant to your LLM agent's environment.
  2. Copy the entire content of that file.
  3. Paste it into your LLM's custom instructions or system prompt area. The method varies by LLM platform (IDE extension settings, web UI, API configuration).

These instructions equip the LLM with the knowledge it needs to use ConPort's tools effectively, storing, retrieving, and updating project context as it works.

Initial ConPort Usage in a Workspace

When you first start using ConPort in a new or existing project workspace, the ConPort database (context_portal/context.db) will be automatically created by the server if it doesn't exist. To help bootstrap the initial project context, especially the Product Context, consider the following:

Using a projectBrief.md File (Recommended)
  1. Create projectBrief.md: In the root directory of your project workspace, create a file named projectBrief.md.
  2. Add Content: Populate this file with a high-level overview of your project, such as its goals, key features, and overall architecture (a sample layout is shown just after this list).
  3. Automatic Prompt for Import: When an LLM agent using one of the provided ConPort custom instruction sets (e.g., roo_code_conport_strategy) initializes in the workspace, it is designed to look for projectBrief.md and, if found, offer to import its contents into ConPort's Product Context.
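
For reference, a projectBrief.md can be as short as a few sections; the layout below is purely illustrative:

    # Project Brief

    ## Goal
    One or two sentences on what the project does and for whom.

    ## Key Features
    - Feature one
    - Feature two

    ## Architecture Notes
    Major components, technologies, and important constraints.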

If projectBrief.md is not found, or if you choose not to import it, you can instead enter the initial Product Context manually.

By providing initial context, either through projectBrief.md or manual entry, you enable ConPort and the connected LLM agent to have a better foundational understanding of your project from the start.

The ConPort server exposes its tools via MCP, allowing interaction with the underlying project knowledge graph, including tools for semantic search powered by vector data storage. These tools provide the Retrieval side of Retrieval Augmented Generation (RAG) for AI agents. All tools require a workspace_id argument (string, required) to specify the target project workspace.

For a more in-depth understanding of ConPort's design, architecture, and advanced usage patterns, please refer to the additional documentation included in the repository.

Please see our CONTRIBUTING.md guide for details on how to contribute to the ConPort project.

This project is licensed under the Apache-2.0 license.

Database Migration & Update Guide

For detailed instructions on how to manage your context.db file, especially when updating ConPort across versions that include database schema changes, please refer to the dedicated v0.2.4_UPDATE_GUIDE.md. This guide provides steps for manual data migration (export/import) if needed, and troubleshooting tips.

