@customable/issue-summarizer-mcp (1.0.5)

Published 2025-10-07 17:12:44 +02:00 by jack in customable-mcp/issue-summarizer-mcp

Installation

Add the scoped registry to your .npmrc:

@customable:registry=

Install with npm:

npm install @customable/issue-summarizer-mcp@1.0.5

Or pin the version in package.json:

"@customable/issue-summarizer-mcp": "1.0.5"

About this package

📊 issue-summarizer-mcp

An MCP server for automatic issue summarization and prioritization using an AI/LLM backend.

Features

Core Features

  • 📝 summarize_issue - Summarize a single issue
  • 📊 summarize_all_open - Summarize all open issues in a repository
  • 🎯 prioritize_issues - Prioritize issues based on content analysis
  • 🏷️ suggest_labels - Suggest appropriate labels for issues
  • 📈 analyze_trends - Analyze issue trends and patterns

Advanced Features

  • 🏷️ Label-based clustering
  • 🗺️ Automatic roadmap generation
  • 📊 Trend analysis (recurring topics, common problems)
  • 🤖 AI-powered categorization
  • 📋 Dashboard data generation
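
The package's source is not shown on this page, but for orientation, registering a tool like summarize_issue with the MCP TypeScript SDK (the ^0.5.0 dependency listed below) typically looks like the following sketch. The input schema and the summarizeWithLlm helper are illustrative assumptions, not the package's actual implementation:

import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
  CallToolRequestSchema,
  ListToolsRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";

// Hypothetical helper standing in for the package's real LLM call.
async function summarizeWithLlm(text: string): Promise<string> {
  return `Summary: ${text.slice(0, 60)}...`;
}

const server = new Server(
  { name: "issue-summarizer-mcp", version: "1.0.5" },
  { capabilities: { tools: {} } }
);

// Advertise the tool; this input schema is inferred from the
// "Summarize an issue" example below, not taken from the package.
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    {
      name: "summarize_issue",
      description: "Summarize a single issue",
      inputSchema: {
        type: "object",
        properties: {
          issueText: { type: "string" },
          includeContext: { type: "boolean" },
        },
        required: ["issueText"],
      },
    },
  ],
}));

// Route tools/call requests to the matching handler.
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  if (request.params.name === "summarize_issue") {
    const { issueText } = request.params.arguments as { issueText: string };
    return { content: [{ type: "text", text: await summarizeWithLlm(issueText) }] };
  }
  throw new Error(`Unknown tool: ${request.params.name}`);
});

await server.connect(new StdioServerTransport());

Clients discover the advertised tools via tools/list and invoke them via tools/call, which is what the JSON examples below correspond to.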

Installation (from source)

npm install
npm run build

Configuration

Required environment variables:

  • LLM_API_URL - URL of the LLM API (Ollama, OpenAI, etc.)
  • LLM_MODEL - Model name (defaults to llama3.2 if unset)
  • LLM_API_KEY - API key (only if the provider requires one)

Optional:

  • FORGEJO_URL - Forgejo instance URL
  • FORGEJO_TOKEN - Forgejo API token
  • GITHUB_TOKEN - GitHub API token
  • GITLAB_TOKEN - GitLab API token
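
A minimal sketch of how these variables fit together, with the documented llama3.2 default; loadConfig and the shape of the returned object are illustrative, not the package's actual code:

// A minimal sketch, assuming only LLM_API_URL is strictly required.
function loadConfig() {
  const apiUrl = process.env.LLM_API_URL;
  if (!apiUrl) {
    throw new Error("LLM_API_URL is required");
  }
  return {
    apiUrl,
    model: process.env.LLM_MODEL ?? "llama3.2", // documented default
    apiKey: process.env.LLM_API_KEY,            // only for providers that need it
    forgejoUrl: process.env.FORGEJO_URL,
    forgejoToken: process.env.FORGEJO_TOKEN,
    githubToken: process.env.GITHUB_TOKEN,
    gitlabToken: process.env.GITLAB_TOKEN,
  };
}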

Usage

export LLM_API_URL="http://localhost:11434"
export LLM_MODEL="llama3.2"
npm start

Examples

Summarize an issue

{
  "tool": "summarize_issue",
  "arguments": {
    "issueText": "Bug: Application crashes when uploading large files...",
    "includeContext": true
  }
}

Prioritize issues

{
  "tool": "prioritize_issues",
  "arguments": {
    "issues": [
      {"id": 1, "title": "Critical bug", "body": "..."},
      {"id": 2, "title": "Feature request", "body": "..."}
    ]
  }
}

Suggest labels

{
  "tool": "suggest_labels",
  "arguments": {
    "issueTitle": "Add dark mode support",
    "issueBody": "Users have requested dark mode..."
  }
}
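
Each of these payloads corresponds to a standard MCP tools/call request. Below is a minimal client-side sketch using the MCP TypeScript SDK; the server entry point path and the env handling are assumptions, and exact SDK helper names vary across versions:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";
import { CallToolResultSchema } from "@modelcontextprotocol/sdk/types.js";

// Spawn the server over stdio; the entry point path is an assumption.
const transport = new StdioClientTransport({
  command: "node",
  args: ["node_modules/@customable/issue-summarizer-mcp/build/index.js"],
  env: {
    ...(process.env as Record<string, string>), // keep PATH etc.
    LLM_API_URL: "http://localhost:11434",
    LLM_MODEL: "llama3.2",
  },
});

const client = new Client(
  { name: "example-client", version: "0.0.1" },
  { capabilities: {} }
);
await client.connect(transport);

// Send the "Summarize an issue" payload from the example above.
const result = await client.request(
  {
    method: "tools/call",
    params: {
      name: "summarize_issue",
      arguments: {
        issueText: "Bug: Application crashes when uploading large files...",
        includeContext: true,
      },
    },
  },
  CallToolResultSchema
);
console.log(result.content);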

LLM Integration

This MCP server can work with:

  • Ollama (local): set LLM_API_URL=http://localhost:11434
  • OpenAI: set LLM_API_URL=https://api.openai.com/v1 and provide an API key
  • Any custom LLM API that follows the OpenAI-compatible format (see the sketch after this list)
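
For orientation, a request against an OpenAI-compatible endpoint looks roughly like the sketch below. This is not the package's actual client code; the endpoint joining assumes apiUrl ends at a /v1-style base (Ollama also exposes one at http://localhost:11434/v1), while the package may normalize the base URL itself:

// Minimal sketch of an OpenAI-compatible chat completion request.
async function complete(
  prompt: string,
  cfg: { apiUrl: string; model: string; apiKey?: string }
): Promise<string> {
  const res = await fetch(`${cfg.apiUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Bearer auth only when a key is configured (e.g. OpenAI).
      ...(cfg.apiKey ? { Authorization: `Bearer ${cfg.apiKey}` } : {}),
    },
    body: JSON.stringify({
      model: cfg.model,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`LLM API error: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}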

License

MIT

Dependencies

  • @modelcontextprotocol/sdk ^0.5.0

Development dependencies

  • @types/node ^18.0.0
  • typescript ^5.3.0

Keywords

mcp, issues, ai, summarization, prioritization

Details

  • Registry: npm
  • Published: 2025-10-07 17:12:44 +02:00
  • Publisher: Customable Team
  • License: MIT
  • Package size: 13 KiB

Versions (2)

  • 2.0.0 - 2025-10-10
  • 1.0.5 - 2025-10-07