@customable/issue-summarizer-mcp (2.0.0)
Installation
Point the @customable scope at your registry in .npmrc (registry URL as configured for your instance), then install:
@customable:registry=<registry URL>
npm install @customable/issue-summarizer-mcp@2.0.0
Or pin the dependency in package.json:
"@customable/issue-summarizer-mcp": "2.0.0"
About this package
📊 issue-summarizer-mcp
MCP Server for automatic issue summarization and prioritization with AI - featuring LLM integration, trend analysis, and comprehensive testing.
What's New in v2.0.0
✨ Complete Refactor - Modular architecture with 72% code reduction
🧪 19 Unit Tests - Full test coverage with Vitest
🎯 100% Type Safety - Eliminated all any types
📚 Complete JSDoc - Comprehensive API documentation
📦 MCP SDK 1.20.0 - Latest SDK with improved capabilities
🗺️ Roadmap Generation - New AI-powered roadmap tool
Features
Core Features
- 📝 summarize_issue - Summarize a single issue using AI to extract key insights, category, and suggested priority
- 📊 summarize_all_open - Summarize all open issues in a repository with AI-generated summaries for each
- 🎯 prioritize_issues - Prioritize issues based on AI content analysis with scores and reasoning
- 🏷️ suggest_labels - Suggest appropriate labels for an issue based on AI analysis of content
- 📈 analyze_trends - Analyze trends and patterns across multiple issues using AI to identify common topics and recurring problems
- 🗺️ generate_roadmap - Generate a project roadmap based on issues with immediate, short-term, and long-term priorities
Advanced Features
- 🤖 AI-powered categorization and priority assignment
- 📊 Trend analysis (recurring topics, common problems)
- 🔍 Multi-issue batch processing
- 🎯 Score-based prioritization (0-100)
- 📋 Structured JSON output for all operations
Installation
npm install
npm run build
Configuration
Required environment variables:
- LLM_API_URL - URL to LLM API (Ollama, OpenAI, etc.)
- LLM_MODEL - Model name (default: llama3.2)
- LLM_API_KEY - API key (if required)
Optional:
- FORGEJO_URL - Forgejo instance URL
- FORGEJO_TOKEN - Forgejo API token
- GITHUB_TOKEN - GitHub API token
- GITLAB_TOKEN - GitLab API token
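
A sketch of how the server could be wired into an MCP client configuration (a Claude Desktop-style mcpServers entry; the "node" command and the dist/index.js entry point are assumptions, not documented by this package):

{
  "mcpServers": {
    "issue-summarizer": {
      "command": "node",
      "args": ["node_modules/@customable/issue-summarizer-mcp/dist/index.js"],
      "env": {
        "LLM_API_URL": "http://localhost:11434",
        "LLM_MODEL": "llama3.2"
      }
    }
  }
}

Check the package's actual bin/main entry point before copying this verbatim.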
Usage
export LLM_API_URL="http://localhost:11434"
export LLM_MODEL="llama3.2"
npm start
Examples
Summarize an issue
{
  "tool": "summarize_issue",
  "arguments": {
    "issueText": "Bug: Application crashes when uploading large files...",
    "includeContext": true
  }
}
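
Every tool returns structured JSON; a hypothetical response for this call might look as follows (the field names are illustrative, not the package's documented output schema):

{
  "summary": "Application crashes when uploading large files",
  "category": "bug",
  "suggestedPriority": "high",
  "keyInsights": ["Crash appears tied to file size", "Affects the upload path"]
}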
Prioritize issues
{
  "tool": "prioritize_issues",
  "arguments": {
    "issues": [
      {"id": 1, "title": "Critical bug", "body": "..."},
      {"id": 2, "title": "Feature request", "body": "..."}
    ]
  }
}
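
Prioritization attaches a 0-100 score and reasoning to each issue; a hypothetical response (again, field names are illustrative):

[
  {"id": 1, "score": 92, "reasoning": "Crash-level defect affecting core functionality"},
  {"id": 2, "score": 38, "reasoning": "Enhancement with no urgency signals"}
]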
Suggest labels
{
  "tool": "suggest_labels",
  "arguments": {
    "issueTitle": "Add dark mode support",
    "issueBody": "Users have requested dark mode..."
  }
}
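Analyze trends and generate a roadmap
The remaining two tools presumably follow the same call convention; the argument names below are assumptions, not taken from the package's schema:

{
  "tool": "analyze_trends",
  "arguments": {
    "issues": [
      {"id": 1, "title": "Login fails on Safari", "body": "..."},
      {"id": 2, "title": "Login fails on Firefox", "body": "..."}
    ]
  }
}

{
  "tool": "generate_roadmap",
  "arguments": {
    "issues": [
      {"id": 1, "title": "Critical bug", "body": "..."},
      {"id": 2, "title": "Feature request", "body": "..."}
    ]
  }
}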
LLM Integration
This MCP can work with:
- Ollama (local): Set LLM_API_URL=http://localhost:11434
- OpenAI: Set LLM_API_URL=https://api.openai.com/v1 and provide an API key
- Custom LLM APIs that follow the OpenAI-compatible format
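
For example, switching from a local Ollama instance to a hosted OpenAI-compatible API is just a matter of environment variables (the hosted model name below is illustrative):

# Local Ollama (no key needed)
export LLM_API_URL="http://localhost:11434"
export LLM_MODEL="llama3.2"

# OpenAI-compatible hosted API
export LLM_API_URL="https://api.openai.com/v1"
export LLM_MODEL="gpt-4o-mini"
export LLM_API_KEY="sk-..."

npm start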
License
MIT
Dependencies
| Package | Version |
|---|---|
| zod | ^3.23.8 |
Development dependencies
| Package | Version |
|---|---|
| @modelcontextprotocol/sdk | ^1.20.0 |
| @types/node | ^22.0.0 |
| @vitest/coverage-v8 | ^3.2.4 |
| typescript | ^5.5.0 |
| vitest | ^3.2.4 |
Keywords
mcp
issues
ai
summarization
prioritization
llm
ollama
openai
mcp-server
Details
Published: 2025-10-10 12:50:17 +02:00
Registry type: npm
Publisher: Customable Team
License: MIT
Tag: latest
Size: 20 KiB
Assets: 1
Versions: 2