Building Scalable Microservices with Node.js and Docker: A Complete Guide
Introduction
Microservices architecture has become the go-to solution for building scalable, maintainable applications. As a full-stack developer, I've seen firsthand how breaking down monolithic applications into smaller, focused services can dramatically improve development velocity and system reliability. In this guide, we'll explore how to build a microservices architecture using Node.js and Docker, complete with service discovery, API gateways, and inter-service communication.
Understanding Microservices Architecture
Microservices architecture is a design approach where applications are built as a collection of loosely coupled services. Each service is:
- Independently deployable
- Focused on a specific business capability
- Owned by a small team
- Technology agnostic
- Exposed via well-defined APIs
This contrasts with monolithic architecture where all functionality is deployed as a single unit.
Setting Up the Foundation
Let's start by creating a simple microservices ecosystem with three services: User Service, Product Service, and Order Service.
User Service Implementation
// user-service/src/app.js
const express = require('express');
const mongoose = require('mongoose');
const cors = require('cors');
const helmet = require('helmet');

const app = express();
const PORT = process.env.PORT || 3001;

// Middleware
app.use(helmet());
app.use(cors());
app.use(express.json());

// User Schema
const userSchema = new mongoose.Schema({
  name: { type: String, required: true },
  email: { type: String, required: true, unique: true },
  createdAt: { type: Date, default: Date.now }
});

const User = mongoose.model('User', userSchema);

// Routes
app.get('/health', (req, res) => {
  res.json({ status: 'healthy', service: 'user-service' });
});

app.post('/users', async (req, res) => {
  try {
    const user = new User(req.body);
    await user.save();
    res.status(201).json(user);
  } catch (error) {
    res.status(400).json({ error: error.message });
  }
});

app.get('/users/:id', async (req, res) => {
  try {
    const user = await User.findById(req.params.id);
    if (!user) return res.status(404).json({ error: 'User not found' });
    res.json(user);
  } catch (error) {
    res.status(400).json({ error: error.message });
  }
});

// Database connection
mongoose.connect(process.env.MONGODB_URI || 'mongodb://localhost:27017/userdb')
  .then(() => console.log('Connected to User Database'))
  .catch(err => console.error('Database connection error:', err));

app.listen(PORT, () => {
  console.log(`User service running on port ${PORT}`);
});

Dockerizing the Services
# user-service/Dockerfile
FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY src/ ./src/
EXPOSE 3001
USER node
CMD ["node", "src/app.js"]

Implementing Service Communication
Microservices need to communicate with each other. Let's implement HTTP-based communication with proper error handling:
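The client below delegates circuit breaking to opossum. If the pattern is new to you, this minimal dependency-free sketch shows the core idea before we wire in the library (`SimpleBreaker` is illustrative, not part of any package):

```javascript
// Minimal circuit breaker: after `maxFailures` consecutive failures the
// breaker "opens" and rejects calls immediately, until `resetMs` elapses.
class SimpleBreaker {
  constructor(fn, { maxFailures = 3, resetMs = 30000 } = {}) {
    this.fn = fn;
    this.maxFailures = maxFailures;
    this.resetMs = resetMs;
    this.failures = 0;
    this.openedAt = null; // timestamp when the breaker opened
  }

  get state() {
    if (this.openedAt === null) return 'closed';
    return Date.now() - this.openedAt >= this.resetMs ? 'half-open' : 'open';
  }

  async fire(...args) {
    if (this.state === 'open') {
      throw new Error('Breaker is open; failing fast');
    }
    try {
      const result = await this.fn(...args);
      this.failures = 0;  // a success closes the breaker again
      this.openedAt = null;
      return result;
    } catch (err) {
      this.failures += 1;
      if (this.failures >= this.maxFailures) {
        this.openedAt = Date.now();
      }
      throw err;
    }
  }
}
```

Failing fast while open is the whole point: a struggling downstream service gets breathing room instead of a pile-up of timed-out requests. opossum implements the same state machine with percentage-based thresholds and events.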
// shared/serviceClient.js
const axios = require('axios');
const CircuitBreaker = require('opossum');

class ServiceClient {
  constructor(baseURL, timeout = 5000) {
    this.client = axios.create({
      baseURL,
      timeout,
      headers: {
        'Content-Type': 'application/json'
      }
    });

    // Circuit breaker configuration
    const options = {
      timeout: 3000,
      errorThresholdPercentage: 50,
      resetTimeout: 30000
    };
    this.breaker = new CircuitBreaker(this.makeRequest.bind(this), options);
  }

  async makeRequest(config) {
    try {
      const response = await this.client(config);
      return response.data;
    } catch (error) {
      throw new Error(`Service call failed: ${error.message}`);
    }
  }

  async get(endpoint) {
    return this.breaker.fire({ method: 'GET', url: endpoint });
  }

  async post(endpoint, data) {
    return this.breaker.fire({ method: 'POST', url: endpoint, data });
  }
}

module.exports = ServiceClient;

Creating an API Gateway
An API gateway serves as the single entry point for all client requests:
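The proxy configurations below use `pathRewrite` to strip the gateway's public prefix before forwarding. Conceptually it is just a regex replacement over the incoming path, which this small sketch makes explicit (the `rewritePath` helper is illustrative, not http-proxy-middleware's actual implementation):

```javascript
// What pathRewrite does, in miniature: each key is a regex matched
// against the request path, each value the replacement, applied before
// the request is forwarded upstream.
function rewritePath(path, rules) {
  for (const [pattern, replacement] of Object.entries(rules)) {
    const re = new RegExp(pattern);
    if (re.test(path)) return path.replace(re, replacement);
  }
  return path;
}

console.log(rewritePath('/api/users/42', { '^/api/users': '/users' })); // '/users/42'
```

So a client calls `GET /api/users/42` on the gateway, and the User Service receives `GET /users/42`, matching the routes it actually defines.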
// api-gateway/src/gateway.js
const express = require('express');
const { createProxyMiddleware } = require('http-proxy-middleware');
const rateLimit = require('express-rate-limit');
const jwt = require('jsonwebtoken');

const app = express();
const PORT = process.env.PORT || 3000;

// Rate limiting
const limiter = rateLimit({
  windowMs: 15 * 60 * 1000, // 15 minutes
  max: 100 // limit each IP to 100 requests per windowMs
});
app.use(limiter);

// Note: no JSON body parser at the gateway. Parsing request bodies
// before proxying consumes the request stream and can hang proxied
// POST requests; let each downstream service parse its own bodies.

// Authentication middleware
const authenticateToken = (req, res, next) => {
  const authHeader = req.headers['authorization'];
  const token = authHeader && authHeader.split(' ')[1];
  if (!token) {
    return res.sendStatus(401);
  }
  jwt.verify(token, process.env.JWT_SECRET, (err, user) => {
    if (err) return res.sendStatus(403);
    req.user = user;
    next();
  });
};

// Service proxy configurations
const userServiceProxy = createProxyMiddleware({
  target: process.env.USER_SERVICE_URL || 'http://user-service:3001',
  changeOrigin: true,
  pathRewrite: {
    '^/api/users': '/users'
  }
});

const productServiceProxy = createProxyMiddleware({
  target: process.env.PRODUCT_SERVICE_URL || 'http://product-service:3002',
  changeOrigin: true,
  pathRewrite: {
    '^/api/products': '/products'
  }
});

// Routes
app.use('/api/users', authenticateToken, userServiceProxy);
app.use('/api/products', productServiceProxy);

// Health check
app.get('/health', (req, res) => {
  res.json({ status: 'healthy', service: 'api-gateway' });
});

app.listen(PORT, () => {
  console.log(`API Gateway running on port ${PORT}`);
});

Docker Compose Configuration
Let's orchestrate all services using Docker Compose:
# docker-compose.yml
version: '3.8'
services:
  api-gateway:
    build: ./api-gateway
    ports:
      - "3000:3000"
    environment:
      - USER_SERVICE_URL=http://user-service:3001
      - PRODUCT_SERVICE_URL=http://product-service:3002
      - JWT_SECRET=your-secret-key
    depends_on:
      - user-service
      - product-service
  user-service:
    build: ./user-service
    environment:
      - MONGODB_URI=mongodb://mongo:27017/userdb
      - PORT=3001
    depends_on:
      - mongo
  product-service:
    build: ./product-service
    environment:
      - MONGODB_URI=mongodb://mongo:27017/productdb
      - PORT=3002
    depends_on:
      - mongo
  mongo:
    image: mongo:6
    volumes:
      - mongo_data:/data/db
    ports:
      - "27017:27017"
  redis:
    image: redis:alpine
    ports:
      - "6379:6379"
volumes:
  mongo_data:

Best Practices and Considerations
Monitoring and Observability
Implement comprehensive logging and monitoring:
- Use structured logging (JSON format)
- Implement distributed tracing
- Set up health checks for each service
- Monitor service dependencies and performance metrics
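A dependency-free structured logger is enough to get started; libraries like pino or winston layer levels, transports, and redaction on top of the same idea. This helper is a sketch, not any particular library's API:

```javascript
// Emit one JSON object per line so log aggregators can filter on
// fields like service, level, and timestamp instead of parsing text.
function createLogger(service) {
  const log = (level, message, fields = {}) => {
    const entry = JSON.stringify({
      timestamp: new Date().toISOString(),
      service,
      level,
      message,
      ...fields
    });
    console.log(entry);
    return entry; // returned so callers/tests can inspect the line
  };
  return {
    info: (msg, fields) => log('info', msg, fields),
    error: (msg, fields) => log('error', msg, fields)
  };
}

const logger = createLogger('user-service');
logger.info('user created', { userId: 'abc123' });
```

Because every service tags its lines with a `service` field, a single query in your log backend can follow a request across the gateway and downstream services, especially once you also propagate a correlation ID.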
Data Management
Each microservice should own its data:
- Avoid shared databases between services
- Use event sourcing for complex business processes
- Implement eventual consistency patterns
- Consider CQRS for read/write optimization
Security
- Implement service-to-service authentication
- Use HTTPS for all communications
- Validate and sanitize all inputs
- Implement proper error handling without exposing internals
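For "validate and sanitize all inputs", one lightweight approach is an Express-style middleware that whitelists fields and checks types before the handler runs. This is a hypothetical sketch; in practice libraries like joi, zod, or express-validator do this more thoroughly:

```javascript
// Whitelist-based body validator: fields not in the schema are dropped,
// missing required fields or wrong types produce a 400 response.
function validateBody(schema) {
  return (req, res, next) => {
    const clean = {};
    const errors = [];
    for (const [field, rule] of Object.entries(schema)) {
      const value = req.body ? req.body[field] : undefined;
      if (value === undefined) {
        if (rule.required) errors.push(`${field} is required`);
        continue;
      }
      if (typeof value !== rule.type) {
        errors.push(`${field} must be a ${rule.type}`);
        continue;
      }
      clean[field] = value;
    }
    if (errors.length) return res.status(400).json({ errors });
    req.body = clean; // unknown fields are discarded before the handler
    next();
  };
}
```

Mounted as `app.post('/users', validateBody({ name: { type: 'string', required: true }, email: { type: 'string', required: true } }), handler)`, it keeps stray fields such as `isAdmin: true` from ever reaching `new User(req.body)`.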
Conclusion
Building microservices with Node.js and Docker provides a robust foundation for scalable applications. The key is to start simple, focus on clear service boundaries, and gradually add complexity as needed. Remember that microservices introduce distributed system challenges, so ensure you have proper monitoring, testing, and deployment strategies in place.
The architecture we've built here provides a solid starting point that can be extended with additional patterns like event-driven communication, service mesh, and advanced deployment strategies as your system grows.