A modular Node.js backend server that provides unified API endpoints for Weather, News, Stock Market, and AI services. Designed as a backend connector for AI models, enabling structured tool execution through natural language query parsing.
MCP Server (Model Context Protocol Server) acts as an intelligent middleware layer between external data services and AI systems. It integrates multiple third-party APIs and provides:
- Standardized REST endpoints
- Redis-based caching
- Natural language to structured tool-call parsing via OpenAI
- Clean, modular backend architecture
The system is built to handle real-world API interactions efficiently with caching strategies based on data volatility.
- Node.js (ES Modules)
- Express.js
- axios – HTTP client for external API calls
- dotenv – Environment variable management
- ioredis – Redis client for caching
- openai – OpenAI API integration
- nodemon – Development auto-restart
The project follows a modular backend structure:
mcp-server/
├── routes/ → API route definitions
├── controllers/ → Request handling logic
├── services/ → External API integrations & business logic
├── utils/ → Shared utilities
├── server.js → Application entry point
├── package.json → Dependencies and scripts
└── .env → Environment variables
- Separation of concerns
- Service-layer abstraction for external APIs
- Centralized error handling
- Volatility-based caching strategy
- AI-powered request parsing
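As an illustration of the centralized error-handling principle, here is a minimal sketch of an Express-style error middleware. The handler name, the `err.status` convention, and the response shape are assumptions for illustration, not the project's actual code:

```javascript
// Minimal sketch of a centralized Express-style error handler.
// In the real project this would live in utils/ or a middleware file
// and be registered last with app.use(errorHandler).
// The err.status convention and response shape are assumptions.
function errorHandler(err, req, res, next) {
  const status = err.status || 500;
  res.status(status).json({
    error: err.message || 'Internal Server Error',
  });
}
```

Registering one handler like this keeps controllers free of repeated try/catch response logic.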
- Weather – OpenWeatherMap integration with Redis caching (TTL: 2 minutes)
- News – NewsAPI integration returning the top 5 articles by topic, with cached responses
- Stocks – API Ninjas stock data with Redis caching (TTL: 1 minute)
- AI – OpenAI integration that parses natural language into structured tool calls and acts as an intelligent query router
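A hedged sketch of what the parsing layer might expose in the OpenAI function-calling format. The tool names and parameter schemas below are illustrative assumptions, not the server's actual definitions:

```javascript
// Hypothetical tool definitions in the OpenAI function-calling format.
// Names and parameter shapes are illustrative assumptions.
const tools = [
  {
    type: 'function',
    function: {
      name: 'get_weather',
      description: 'Get current weather for a city',
      parameters: {
        type: 'object',
        properties: { city: { type: 'string' } },
        required: ['city'],
      },
    },
  },
  {
    type: 'function',
    function: {
      name: 'get_stocks',
      description: 'Get the latest price for a stock symbol',
      parameters: {
        type: 'object',
        properties: { symbol: { type: 'string' } },
        required: ['symbol'],
      },
    },
  },
];
```

Definitions like these would be passed as the `tools` option to the OpenAI chat completions call, and the model's `tool_calls` response routed to the matching service.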
- `GET /api/weather/:city` – Returns weather data for the specified city.
- `GET /api/news/:topic` – Returns the top 5 news articles for a topic.
- `GET /api/stocks/:symbol` – Returns stock price data for the given symbol.
- `POST /api/ai/parse` – Parses natural language input into structured tool calls.
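To show how the AI route connects back to the REST endpoints above, here is an illustrative mapping from a parsed tool call to an endpoint path. The `{ tool, args }` shape is an assumption for this sketch, not the server's actual contract:

```javascript
// Illustrative mapping from a parsed tool call to a REST endpoint path.
// The { tool, args } shape is an assumption for this sketch.
function toolCallToPath(call) {
  switch (call.tool) {
    case 'weather':
      return `/api/weather/${encodeURIComponent(call.args.city)}`;
    case 'news':
      return `/api/news/${encodeURIComponent(call.args.topic)}`;
    case 'stocks':
      return `/api/stocks/${encodeURIComponent(call.args.symbol)}`;
    default:
      throw new Error(`Unknown tool: ${call.tool}`);
  }
}

// Example: { tool: 'weather', args: { city: 'London' } } maps to
// '/api/weather/London'.
```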
The server uses Redis with differentiated TTL values based on data volatility:
| Service | TTL | Reason |
|---|---|---|
| Weather | 2 minutes | Moderate volatility |
| Stocks | 1 minute | High volatility |
| News | Short-lived | Frequently updated content |
This reduces external API load and improves response times.
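The idea can be sketched in a few lines. A `Map` stands in for Redis here, the service call is simplified to a synchronous fetcher, and the 30-second news TTL is an assumption (the table above only describes it as short-lived):

```javascript
// Sketch of volatility-based caching. A Map stands in for Redis;
// the real server would use ioredis with an expiry on each key.
const TTL_MS = {
  weather: 2 * 60 * 1000, // moderate volatility
  stocks: 60 * 1000,      // high volatility
  news: 30 * 1000,        // assumption: "short-lived" is not pinned down
};

const cache = new Map();

function getOrFetch(service, key, fetcher) {
  const cacheKey = `${service}:${key}`;
  const hit = cache.get(cacheKey);
  if (hit && hit.expiresAt > Date.now()) return hit.value;
  const value = fetcher(); // real code would await an axios call here
  cache.set(cacheKey, { value, expiresAt: Date.now() + TTL_MS[service] });
  return value;
}
```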
Create a .env file with the following variables:
PORT=3000
WEATHER_API_KEY=your_key
NEWS_API_KEY=your_key
STOCK_API_KEY=your_key
OPENAI_API_KEY=your_key
OPENAI_MODEL=gpt-4o-mini
REDIS_URL=redis://localhost:6379

Defaults:
- `OPENAI_MODEL` → `gpt-4o-mini`
- `REDIS_URL` → `redis://localhost:6379`
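The defaults can be applied with a small helper when the optional variables are unset. This function is an illustration, not the project's actual config code; dotenv simply populates `process.env`, and the fallback happens wherever the values are read:

```javascript
// Illustrative helper applying the documented defaults to an env object.
// In the real server, dotenv loads .env into process.env and the
// fallbacks are applied where the values are consumed.
function withDefaults(env) {
  return {
    ...env,
    OPENAI_MODEL: env.OPENAI_MODEL ?? 'gpt-4o-mini',
    REDIS_URL: env.REDIS_URL ?? 'redis://localhost:6379',
  };
}
```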
git clone https://github.com/Daniel-wesley-06/mcp-server.git
cd mcp-server
npm install

Start the server:

npm start

Or run in development mode (auto-restart via nodemon):

npm run dev

The server runs on the port defined in the PORT environment variable.
- Implementing volatility-based caching strategies
- Integrating LLMs as routing layers for backend systems
- Structuring backend services for extensibility
- Managing secure external API integrations
- Designing an AI-ready backend connector pattern
- Health check endpoint
- Request logging middleware
- Basic test coverage
- Docker support
- CI/CD integration