This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
This is a Slack MCP (Model Context Protocol) client written in Go that serves as a bridge between Slack and multiple MCP servers. It allows LLMs to interact with various tools through a unified Slack interface.
- Slack Bot Client: Handles Slack integration via Socket Mode
- MCP Client Manager: Manages connections to multiple MCP servers (HTTP/SSE, stdio)
- LLM Provider Registry: Supports OpenAI, Anthropic, Ollama via LangChain gateway
- Tool Discovery System: Dynamically registers tools from all connected MCP servers
- RAG System: Retrieval-Augmented Generation with JSON and OpenAI vector stores
- Agent Mode: LangChain-powered conversational agents with tool orchestration
- Monitoring: Prometheus metrics for tool usage and LLM token tracking
- `cmd/main.go`: Application entry point and initialization
- `internal/config/`: Configuration management with environment variable overrides
- `internal/slack/`: Slack client implementation and message formatting
- `internal/mcp/`: MCP client implementations (SSE, HTTP, stdio)
- `internal/llm/`: LLM provider factories and LangChain integration
- `internal/rag/`: RAG providers and tool implementations
- `internal/handlers/`: Tool handlers and LLM-MCP bridge
- `internal/monitoring/`: Prometheus metrics
```sh
# Build the application
make build

# Run the application
make run

# Run tests
make test

# Run linting and formatting
make lint

# Check all (format, lint, vet, test)
make check

# Clean build artifacts
make clean
```

```sh
# Run all tests with verbose output
go test -v ./...

# Run tests with race detection
go test -race ./...

# Run tests with coverage
go test -coverprofile=coverage.out ./...
```

```sh
# Build Docker image
make docker-build

# Run with Docker Compose
docker-compose up -d
```

- `SLACK_BOT_TOKEN`: Bot token for Slack API
- `SLACK_APP_TOKEN`: App-level token for Socket Mode
- `OPENAI_API_KEY`: OpenAI API key (default provider)
- `LLM_PROVIDER`: Provider selection (openai, anthropic, ollama)
- `mcp-servers.json`: MCP server definitions and tool configurations
- `.env`: Environment variables (optional)
- Config supports both the legacy format and the new `mcpServers` format
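A minimal sketch of what an `mcpServers` entry might look like, combining the two connection styles the clients support (command/args for stdio, URL for HTTP/SSE). The server names and exact field names here are illustrative assumptions, not copied from this repository:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    },
    "remote-tools": {
      "url": "http://localhost:3000/sse"
    }
  }
}
```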
RAG can be enabled via the LLM provider config with `rag_enabled: true`. It supports JSON-based storage and OpenAI vector stores.
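As a hedged sketch, the provider config toggle might look like the following; only `rag_enabled` is documented above, and the surrounding key names are assumptions for illustration:

```json
{
  "llm_provider": {
    "name": "openai",
    "rag_enabled": true
  }
}
```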
- MCP servers are configured in `mcp-servers.json` with command/args or a URL
- Clients support stdio, HTTP, and SSE transport modes
- Tool discovery happens at startup with allow/block lists
- Failed servers are logged but don't crash the application
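The "failed servers are logged but don't crash" behavior can be sketched as a startup loop that skips unhealthy servers. The `MCPClient` interface and `connectAll` helper below are hypothetical stand-ins, not the actual types in `internal/mcp/`:

```go
package main

import (
	"fmt"
	"log"
)

// MCPClient is a minimal stand-in for the real client interface;
// the actual types in internal/mcp/ differ.
type MCPClient interface {
	Connect() error
	Name() string
}

// connectAll tries to connect every configured server. A failed
// server is logged and skipped, so one bad server cannot take
// down the whole application (graceful degradation).
func connectAll(clients []MCPClient) []MCPClient {
	var healthy []MCPClient
	for _, c := range clients {
		if err := c.Connect(); err != nil {
			log.Printf("mcp: server %q failed to connect, skipping: %v", c.Name(), err)
			continue
		}
		healthy = append(healthy, c)
	}
	return healthy
}

// fakeClient simulates a server for demonstration purposes.
type fakeClient struct {
	name string
	fail bool
}

func (f fakeClient) Connect() error {
	if f.fail {
		return fmt.Errorf("dial failed")
	}
	return nil
}
func (f fakeClient) Name() string { return f.name }

func main() {
	healthy := connectAll([]MCPClient{
		fakeClient{name: "filesystem"},
		fakeClient{name: "broken", fail: true},
	})
	fmt.Println(len(healthy)) // the broken server is dropped, not fatal
}
```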
- Factory pattern in `internal/llm/` for provider creation
- LangChain gateway provides a unified interface
- Environment variables override config file settings
- Supports native tools vs system prompt-based tools
- Domain-specific errors in `internal/common/errors/`
- Graceful degradation when MCP servers fail
- Comprehensive logging with structured fields
- Circuit breaker pattern for failed connections
- Unit tests for core business logic
- Integration tests for MCP client functionality
- Mock interfaces for external dependencies
- Test coverage tracking in CI/CD
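Mocking an external dependency behind a small interface keeps unit tests free of network calls. The `Poster` interface and `reply` handler below are hypothetical names for illustration, not this repository's actual types:

```go
package main

import "fmt"

// Poster abstracts the one Slack API call the handler needs.
type Poster interface {
	PostMessage(channel, text string) error
}

// reply is the unit under test: it formats and posts a tool result.
func reply(p Poster, channel, result string) error {
	return p.PostMessage(channel, "tool result: "+result)
}

// mockPoster records calls instead of hitting the Slack API.
type mockPoster struct {
	channels []string
	texts    []string
}

func (m *mockPoster) PostMessage(channel, text string) error {
	m.channels = append(m.channels, channel)
	m.texts = append(m.texts, text)
	return nil
}

func main() {
	m := &mockPoster{}
	_ = reply(m, "C123", "ok")
	fmt.Println(m.texts[0]) // the mock captured the formatted message
}
```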
- Single-prompt interactions with tool descriptions in system prompt
- Direct JSON tool call parsing and execution
- Predictable token usage and conversation flow
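Direct JSON tool-call parsing can be sketched as unmarshalling the model's output into a small struct. The field names (`tool`, `args`) are illustrative assumptions about the prompt contract, not the repository's exact schema:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// toolCall mirrors the JSON shape the LLM is asked to emit in
// standard mode; the exact field names here are illustrative.
type toolCall struct {
	Tool string                 `json:"tool"`
	Args map[string]interface{} `json:"args"`
}

// parseToolCall extracts a tool invocation from raw model output.
func parseToolCall(raw string) (toolCall, error) {
	var tc toolCall
	if err := json.Unmarshal([]byte(raw), &tc); err != nil {
		return toolCall{}, fmt.Errorf("model output is not a tool call: %w", err)
	}
	return tc, nil
}

func main() {
	tc, err := parseToolCall(`{"tool": "search", "args": {"query": "golang mcp"}}`)
	if err != nil {
		panic(err)
	}
	fmt.Println(tc.Tool, tc.Args["query"]) // search golang mcp
}
```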
- Multi-turn conversational interactions via LangChain agents
- Context-aware tool usage decisions
- Better user context integration and reasoning capabilities
- Enable with `use_agent: true` in config
- Tool invocation counters with error rates
- LLM token usage histograms by model and type
- Metrics endpoint at `:8080/metrics` (configurable)
- Structured logging with configurable levels
- Component-specific loggers for MCP servers
- Debug mode for detailed MCP communication
- Add the server config to `mcp-servers.json`
- Test the connection with the `--debug` flag
- Verify tool discovery in logs
- Add to allow/block lists if needed
- Create a factory in `internal/llm/`
- Implement the LangChain integration
- Add environment variable handling in `config.go`
- Update provider constants
- Enable `--mcpdebug` for MCP client logs
- Check server initialization timeouts
- Verify NPM packages for JavaScript servers
- Test stdio vs HTTP transport modes