@customable/issue-summarizer-mcp (2.0.0)

Published 2025-10-10 12:50:17 +02:00 by jack in customable-mcp/issue-summarizer-mcp

Installation

Add the scoped registry to your .npmrc:

@customable:registry=

Then install with npm:

npm install @customable/issue-summarizer-mcp@2.0.0

Or pin it in package.json:

"@customable/issue-summarizer-mcp": "2.0.0"

About this package

📊 issue-summarizer-mcp


MCP Server for automatic issue summarization and prioritization with AI - featuring LLM integration, trend analysis, and comprehensive testing.

What's New in v2.0.0

  • Complete Refactor - Modular architecture with 72% code reduction
  • 🧪 19 Unit Tests - Full test coverage with Vitest
  • 🎯 100% Type Safety - Eliminated all any types
  • 📚 Complete JSDoc - Comprehensive API documentation
  • 📦 MCP SDK 1.20.0 - Latest SDK with improved capabilities
  • 🗺️ Roadmap Generation - New AI-powered roadmap tool

Features

Core Features

  • 📝 summarize_issue - Summarize a single issue using AI to extract key insights, category, and suggested priority
  • 📊 summarize_all_open - Summarize all open issues in a repository with AI-generated summaries for each
  • 🎯 prioritize_issues - Prioritize issues based on AI content analysis with scores and reasoning
  • 🏷️ suggest_labels - Suggest appropriate labels for an issue based on AI analysis of content
  • 📈 analyze_trends - Analyze trends and patterns across multiple issues using AI to identify common topics and recurring problems
  • 🗺️ generate_roadmap - Generate a project roadmap based on issues with immediate, short-term, and long-term priorities

Advanced Features

  • 🤖 AI-powered categorization and priority assignment
  • 📊 Trend analysis (recurring topics, common problems)
  • 🔍 Multi-issue batch processing
  • 🎯 Score-based prioritization (0-100)
  • 📋 Structured JSON output for all operations
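
As an illustration of the structured output, a summarize_issue call might return an object shaped like the one below. The field names are an assumption for illustration; they are not documented on this page.

{
  "summary": "Application crashes when uploading files larger than the configured limit.",
  "category": "bug",
  "suggestedPriority": "high",
  "score": 85,
  "reasoning": "Crash in a core workflow with broad user impact."
}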

Installation

From a source checkout:

npm install
npm run build

Configuration

Required environment variables:

  • LLM_API_URL - URL of the LLM API (Ollama, OpenAI, etc.)
  • LLM_MODEL - Model name (default: llama3.2)
  • LLM_API_KEY - API key (required only by some providers)

Optional:

  • FORGEJO_URL - Forgejo instance URL
  • FORGEJO_TOKEN - Forgejo API token
  • GITHUB_TOKEN - GitHub API token
  • GITLAB_TOKEN - GitLab API token

Usage

export LLM_API_URL="http://localhost:11434"
export LLM_MODEL="llama3.2"
npm start
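
To run the server from an MCP client, register it in the client's configuration. A minimal sketch in the Claude Desktop-style mcpServers format, assuming the build output is dist/index.js (the path and server name here are assumptions):

{
  "mcpServers": {
    "issue-summarizer": {
      "command": "node",
      "args": ["/path/to/issue-summarizer-mcp/dist/index.js"],
      "env": {
        "LLM_API_URL": "http://localhost:11434",
        "LLM_MODEL": "llama3.2"
      }
    }
  }
}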

Examples

Summarize an issue

{
  "tool": "summarize_issue",
  "arguments": {
    "issueText": "Bug: Application crashes when uploading large files...",
    "includeContext": true
  }
}

Prioritize issues

{
  "tool": "prioritize_issues",
  "arguments": {
    "issues": [
      {"id": 1, "title": "Critical bug", "body": "..."},
      {"id": 2, "title": "Feature request", "body": "..."}
    ]
  }
}

Suggest labels

{
  "tool": "suggest_labels",
  "arguments": {
    "issueTitle": "Add dark mode support",
    "issueBody": "Users have requested dark mode..."
  }
}
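
The remaining tools follow the same call shape. The argument names below are illustrative assumptions, since this page does not document them.

Summarize all open issues

{
  "tool": "summarize_all_open",
  "arguments": {
    "owner": "customable-mcp",
    "repo": "issue-summarizer-mcp"
  }
}

Analyze trends

{
  "tool": "analyze_trends",
  "arguments": {
    "issues": [
      {"id": 1, "title": "Upload fails for large files", "body": "..."},
      {"id": 2, "title": "Upload times out on slow networks", "body": "..."}
    ]
  }
}

Generate a roadmap

{
  "tool": "generate_roadmap",
  "arguments": {
    "issues": [
      {"id": 1, "title": "Critical bug", "body": "..."},
      {"id": 2, "title": "Feature request", "body": "..."}
    ]
  }
}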

LLM Integration

This MCP server works with:

  • Ollama (local): set LLM_API_URL=http://localhost:11434
  • OpenAI: set LLM_API_URL=https://api.openai.com/v1 and provide an API key
  • Custom: any LLM API that follows the OpenAI-compatible format
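
"OpenAI-compatible" here means the chat completions request shape, typically POSTed to a /chat/completions path under the base URL. A request body looks like:

{
  "model": "llama3.2",
  "messages": [
    {"role": "system", "content": "You summarize software issues."},
    {"role": "user", "content": "Summarize: Bug: Application crashes when uploading large files."}
  ]
}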

License

MIT

Dependencies

Runtime dependencies

Package Version
zod ^3.23.8

Development dependencies

Package Version
@modelcontextprotocol/sdk ^1.20.0
@types/node ^22.0.0
@vitest/coverage-v8 ^3.2.4
typescript ^5.5.0
vitest ^3.2.4

Keywords

mcp, issues, ai, summarization, prioritization, llm, ollama, openai, mcp-server
Details

Registry: npm
Published: 2025-10-10 12:50:17 +02:00
Author: Customable Team
License: MIT
Tag: latest
Size: 20 KiB
Assets: 1

Versions (2)

2.0.0 - 2025-10-10
1.0.5 - 2025-10-07