
Superstream Kafka Analyzer

Interactive CLI for analyzing Kafka health and configuration according to best practices and industry standards.

Made with ❤️ by the Superstream Team

No installation required! Run directly with npx:

# Interactive mode (recommended for first-time users)
npx superstream-kafka-analyzer
# Using a configuration file
npx superstream-kafka-analyzer --config config.json
# Optional: install globally instead of running through npx
npm install -g superstream-kafka-analyzer
Configuration File Examples

Available examples: the full list is in the ./config-examples/ folder:

Basic Configuration (config.example.json):

{
  "kafka": {
    "bootstrap_servers": "localhost:9092",
    "clientId": "superstream-analyzer",
    "vendor": "apache",
    "useSasl": false
  },
  "file": {
    "outputDir": "./kafka-analysis",
    "formats": ["html"],
    "includeMetadata": true,
    "includeTimestamp": true
  },
  "email": "user@example.com"
}
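The examples in this section pass `bootstrap_servers` as either a single comma-separated string (as above) or an array of `host:port` strings (as in the SASL example that follows). A minimal normalization helper could reconcile the two forms; this is an illustrative sketch, not part of the analyzer's published API:

```javascript
// Hypothetical helper: accept "bootstrap_servers" as a comma-separated
// string or an array, and return a clean array of "host:port" entries.
function normalizeBootstrapServers(value) {
  const list = Array.isArray(value) ? value : String(value).split(",");
  return list.map((entry) => entry.trim()).filter((entry) => entry.length > 0);
}
```

Either input shape then yields the broker list a Kafka client expects.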

SASL Authentication (config.example.sasl.json):

{
  "kafka": {
    "bootstrap_servers": ["kafka1.example.com:9092", "kafka2.example.com:9092", "kafka3.example.com:9092"],
    "clientId": "superstream-analyzer",
    "vendor": "apache",
    "useSasl": true,
    "sasl": {
      "mechanism": "PLAIN",
      "username": "your-username",
      "password": "your-password"
    }
  },
  "file": {
    "outputDir": "./kafka-analysis",
    "formats": ["html"],
    "includeMetadata": true,
    "includeTimestamp": true
  },
  "email": "user@example.com"
}

AWS MSK with SCRAM (config.example.aws-msk.json):

{
  "kafka": {
    "bootstrap_servers": ["b-1.your-cluster.abc123.c2.kafka.us-east-1.amazonaws.com:9092"],
    "clientId": "superstream-analyzer",
    "vendor": "aws-msk",
    "useSasl": true,
    "sasl": {
      "mechanism": "SCRAM-SHA-512",
      "username": "your-msk-username",
      "password": "your-msk-password"
    }
  },
  "file": {
    "outputDir": "./kafka-analysis",
    "formats": ["html"],
    "includeMetadata": true,
    "includeTimestamp": true
  },
  "email": "user@example.com"
}

AWS MSK with IAM (config.example.aws-msk-iam.json):

{
  "kafka": {
    "bootstrap_servers": ["b-1.your-cluster.abc123.c2.kafka.us-east-1.amazonaws.com:9198"],
    "clientId": "superstream-analyzer",
    "vendor": "aws-msk",
    "useSasl": true,
    "sasl": {
      "mechanism": "oauthbearer"
    }
  },
  "file": {
    "outputDir": "./kafka-analysis",
    "formats": ["html"],
    "includeMetadata": true,
    "includeTimestamp": true
  },
  "email": "user@example.com"
}

Confluent Cloud (config.example.confluent-cloud.json):

{
  "kafka": {
    "brokers": ["pkc-xxxxx.region.cloud:9092"],
    "clientId": "superstream-analyzer",
    "vendor": "confluent-cloud",
    "useSasl": true,
    "sasl": {
      "mechanism": "PLAIN",
      "username": "your-api-key",
      "password": "your-api-secret"
    }
  },
  "file": {
    "outputDir": "./kafka-analysis",
    "formats": ["html"],
    "includeMetadata": true,
    "includeTimestamp": true
  },
  "email": "user@example.com"
}

Note: Confluent Cloud connections use the official @confluentinc/kafka-javascript library with the SASL_SSL protocol and PLAIN mechanism, as recommended by Confluent.

Aiven Kafka (config.example.aiven-kafka.json):

{
  "kafka": {
    "brokers": ["kafka-xxxxx-aiven-kafka.aivencloud.com:12345"],
    "clientId": "superstream-analyzer",
    "vendor": "aiven",
    "useSasl": false,
    "ssl": {
      "ca": "path/to/ca.pem",
      "cert": "path/to/service.cert",
      "key": "path/to/service.key"
    }
  },
  "file": {
    "outputDir": "./kafka-analysis",
    "formats": ["json", "csv", "html", "txt"],
    "includeMetadata": true,
    "includeTimestamp": true
  },
  "email": "user@example.com"
}

Aiven Kafka with OAuth (SASL/OAUTHBEARER):

{
  "kafka": {
    "bootstrap_servers": ["your-aiven-cluster.aivencloud.com:12345"],
    "clientId": "superstream-analyzer",
    "vendor": "aiven",
    "useSasl": true,
    "sasl": {
      "mechanism": "oauthbearer",
      "clientId": "your-client-id",
      "clientSecret": "your-client-secret",
      "host": "https://my-oauth-server.com",
      "path": "/oauth/token"
    }
  },
  "file": {
    "outputDir": "./kafka-analysis",
    "formats": ["html"],
    "includeMetadata": true,
    "includeTimestamp": true
  },
  "email": "user@example.com"
}
| Option | Description | Default |
|--------|-------------|---------|
| --config <path> | Path to configuration file | - |

# Default for local development
npx superstream-kafka-analyzer
# Configure bootstrap servers as: localhost:9092
# With SASL credentials
npx superstream-kafka-analyzer
# Configure SASL mechanism and credentials when prompted
OIDC Authentication (OpenID Connect)

The analyzer supports modern OIDC authentication with any OIDC-compliant identity provider including Azure AD, Keycloak, Okta, Auth0, and others.

# With OIDC authentication
npx superstream-kafka-analyzer --config config-oidc.json

Key Features:

Quick Example:

{
  "kafka": {
    "brokers": ["kafka.example.com:9093"],
    "vendor": "oidc",
    "useSasl": true,
    "sasl": {
      "mechanism": "oauthbearer",
      "discoveryUrl": "https://auth.example.com/.well-known/openid-configuration",
      "clientId": "your-client-id",
      "clientSecret": "your-client-secret",
      "scope": "openid kafka:read",
      "grantType": "client_credentials"
    }
  }
}
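Under the hood, an OIDC `client_credentials` setup like the one above boils down to a form-encoded POST against the provider's token endpoint (discovered via `discoveryUrl`). The sketch below only builds that request body; the function name and flow are illustrative, not the analyzer's actual API:

```javascript
// Sketch of the OAuth 2.0 client_credentials exchange implied by the
// sasl config above. Builds the form-encoded token request body.
function buildTokenRequestBody(saslConfig) {
  const body = new URLSearchParams({
    grant_type: saslConfig.grantType || "client_credentials",
    client_id: saslConfig.clientId,
    client_secret: saslConfig.clientSecret,
  });
  if (saslConfig.scope) body.set("scope", saslConfig.scope);
  return body.toString();
}
```

The resulting string would be POSTed with `Content-Type: application/x-www-form-urlencoded`; the returned access token is then presented over SASL/OAUTHBEARER.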

📚 For detailed OIDC setup instructions, see:

The tool generates comprehensive reports including:

JSON: Complete structured data including all cluster and topic information.

📄 View Example JSON Report

CSV: Tabular data for easy analysis in spreadsheet applications.

HTML: Beautiful formatted report with responsive design and styling.

📄 View Example HTML Report

TXT: Simple text summary for quick review.

📄 View Example TXT Report

The tool performs comprehensive health checks on your Kafka cluster to identify potential issues and provide recommendations:

- Confluent Cloud Health Checks
- Aiven Kafka Health Checks
- Generic Kafka Health Checks

The tool performs comprehensive validation in multiple phases:

Phase 1: Input Format Validation
Phase 2: Network Connectivity Testing
Phase 3: Security Protocol Testing
Phase 4: Complete Setup Validation
kafka-analysis/
├── analysis-2024-01-15-14-30-25.json
├── analysis-2024-01-15-14-30-25.csv
├── analysis-2024-01-15-14-30-25.html
└── analysis-2024-01-15-14-30-25.txt
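The output files above follow an `analysis-YYYY-MM-DD-HH-MM-SS.<ext>` naming pattern. A helper reproducing that pattern might look like the sketch below (the real implementation lives in src/file-service.js and may differ in detail):

```javascript
// Illustrative helper: build a timestamped analysis filename in the
// "analysis-YYYY-MM-DD-HH-MM-SS.<ext>" shape shown in the output tree.
function analysisFileName(date, ext) {
  const pad = (n) => String(n).padStart(2, "0");
  const stamp = [
    date.getFullYear(),
    pad(date.getMonth() + 1), // getMonth() is zero-based
    pad(date.getDate()),
    pad(date.getHours()),
    pad(date.getMinutes()),
    pad(date.getSeconds()),
  ].join("-");
  return `analysis-${stamp}.${ext}`;
}
```

Using local time here is an assumption; the analyzer may equally use UTC.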
superstream-analyzer/
├── bin/
│   └── index.js          # CLI entry point
├── src/
│   ├── cli.js            # Main CLI logic
│   ├── kafka-client.js   # Kafka connection and analysis
│   ├── file-service.js   # File output handling
│   ├── validators.js     # Validation framework
│   └── utils.js          # Utility functions
├── config.example.json   # Basic configuration example
├── config.example.sasl.json # SASL configuration example
└── package.json
# Clone and install dependencies
git clone <repository>
cd superstream-analyzer
npm install

# Run in development mode
npm run dev

# Test with local Kafka
npm run test:local
# Test with local Kafka cluster
npx . --config config.example.json

# Test with SASL authentication
npx . --config config.example.sasl.json

The tool includes comprehensive validation that will:

πŸ“ Configuration Reference Field Type Required Description bootstrap_servers string Yes Comma-separated list of Kafka bootstrap servers clientId string Yes Client identifier for Kafka connection vendor string No Kafka vendor (aws-msk, confluent-cloud, aiven, etc.) useSasl boolean No Enable SASL authentication sasl.mechanism string No* SASL mechanism (PLAIN, SCRAM-SHA-256, SCRAM-SHA-512) sasl.username string No* SASL username sasl.password string No* SASL password

*Required if useSasl is true

File section:

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| outputDir | string | Yes | Directory for output files |
| formats | array | Yes | Array of output formats (json, csv, html, txt) |
| includeMetadata | boolean | No | Include metadata in output files |

Top level:

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| email | string | No | Email address for generating report files. If not provided, no file output will be generated |
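The Required columns above can be encoded as a small pre-flight check. This is a minimal sketch of those rules only; the analyzer's actual validators (src/validators.js) are more thorough. Exempting oauthbearer from the username/password requirement is an assumption drawn from the IAM and OIDC examples earlier, which omit both fields:

```javascript
// Minimal config check mirroring the reference tables: required fields,
// plus the "required if useSasl is true" rule for the sasl block.
function validateConfig(config) {
  const errors = [];
  const kafka = config.kafka || {};
  if (!kafka.bootstrap_servers) errors.push("kafka.bootstrap_servers is required");
  if (!kafka.clientId) errors.push("kafka.clientId is required");
  if (kafka.useSasl) {
    const sasl = kafka.sasl || {};
    if (!sasl.mechanism) {
      errors.push("kafka.sasl.mechanism is required when useSasl is true");
    } else if (sasl.mechanism.toLowerCase() !== "oauthbearer") {
      // oauthbearer configs in the examples carry no username/password
      for (const field of ["username", "password"]) {
        if (!sasl[field]) errors.push(`kafka.sasl.${field} is required when useSasl is true`);
      }
    }
  }
  const file = config.file || {};
  if (!file.outputDir) errors.push("file.outputDir is required");
  if (!Array.isArray(file.formats) || file.formats.length === 0) {
    errors.push("file.formats must be a non-empty array");
  }
  return errors; // empty array means the config passed
}
```

An empty result means the config satisfies the tables; otherwise each entry names the offending field.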

Common issues:

- Missing Vendor Field Error
- AWS MSK IAM Authentication Failed
- Connection Timeout
- Authentication Failed
- File System Errors
- Validation Errors

  1. Run with verbose logging to see detailed error information
  2. Check the validation logs for specific failure points
  3. Verify your configuration file format matches the examples
  4. Ensure your Kafka cluster is running and accessible

This project is licensed under the MIT License - see the LICENSE file for details.

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests if applicable
  5. Submit a pull request

For issues and questions:

✅ Health/Configuration Checks

SuperStream Kafka Analyzer performs a comprehensive set of health checks on your Kafka cluster to help you identify issues and optimize your setup:

Each check provides a clear status (✅ Pass, ⚠️ Warning, ❌ Failed, ℹ️ Info) and actionable recommendations.

Your security and privacy are our top priority. Everything runs locally and securely by default.

To perform all health checks, your user/service account must have the following permissions for each vendor:

Apache Kafka / Confluent Platform / Redpanda

Note:

