Getting Started with Cloudflare Workers: Edge Computing Made Simple
Edge computing is revolutionizing how we build and deploy web applications, bringing computation closer to users for faster response times and better performance. Cloudflare Workers represents one of the most accessible and powerful platforms for edge computing, allowing developers to run JavaScript code at Cloudflare's global network of data centers.
In this comprehensive guide, we'll explore Cloudflare Workers from the ground up, covering everything from basic concepts to advanced implementation patterns that can transform your application's performance.
Understanding Cloudflare Workers
Cloudflare Workers is a serverless platform that runs JavaScript code at the edge, across Cloudflare's network of over 275 data centers worldwide. Unlike traditional serverless platforms that run in specific regions, Workers execute your code at the location closest to your users, dramatically reducing latency.
Key Benefits of Edge Computing
Ultra-Low Latency: By running code at the edge, Workers can respond to requests in milliseconds rather than the hundreds of milliseconds typical of traditional server architectures.
Global Distribution: Your code automatically runs worldwide without complex deployment configurations or regional management.
Cost Efficiency: Pay only for what you use, with no idle server costs or complex scaling configurations.
Seamless Integration: Workers integrate naturally with existing Cloudflare services like CDN, DNS, and security features.
Your First Cloudflare Worker
Let's start with a simple "Hello World" example to understand the basic structure:
addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})

async function handleRequest(request) {
  return new Response('Hello from the edge!', {
    headers: { 'content-type': 'text/plain' },
  })
}
This basic Worker intercepts all HTTP requests and returns a simple text response. The addEventListener function registers a handler for fetch events, which are triggered whenever a request reaches your Worker.
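The examples in this guide use the original Service Worker syntax shown above, which still works. Newer Workers projects typically use the ES modules syntax instead, where the handler receives environment bindings and an execution context explicitly rather than through globals. As a point of reference, a minimal sketch of the same "Hello World" in modules syntax (assuming no bindings are needed) looks like this:

export default {
  async fetch(request, env, ctx) {
    // env holds bindings (KV namespaces, secrets); ctx.waitUntil() schedules background work
    return new Response('Hello from the edge!', {
      headers: { 'content-type': 'text/plain' },
    })
  }
}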
Building a Practical API Gateway
Let's create a more sophisticated example that demonstrates real-world usage. This Worker acts as an API gateway, routing requests and adding authentication:
addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})

async function handleRequest(request) {
  const url = new URL(request.url)

  // Handle CORS preflight requests
  if (request.method === 'OPTIONS') {
    return handleCORS()
  }

  // Route API requests
  if (url.pathname.startsWith('/api/')) {
    return handleAPIRequest(request, url)
  }

  // Serve static content or proxy to origin
  return fetch(request)
}

async function handleAPIRequest(request, url) {
  // Extract API key from headers
  const apiKey = request.headers.get('X-API-Key')

  if (!apiKey || !await validateAPIKey(apiKey)) {
    return new Response('Unauthorized', {
      status: 401,
      headers: getCORSHeaders()
    })
  }

  // Route to different endpoints
  switch (url.pathname) {
    case '/api/data':
      return handleDataRequest(request)
    case '/api/status':
      return handleStatusRequest()
    default:
      return new Response('Not Found', {
        status: 404,
        headers: getCORSHeaders()
      })
  }
}
async function validateAPIKey(apiKey) {
  // In production, validate against KV storage or an external service
  const validKeys = ['demo-key-123', 'prod-key-456']
  return validKeys.includes(apiKey)
}

async function handleDataRequest(request) {
  // Simulate data processing using Cloudflare's request metadata
  const data = {
    timestamp: new Date().toISOString(),
    edge_location: request.cf.colo,
    user_country: request.cf.country
  }

  return new Response(JSON.stringify(data), {
    headers: {
      'content-type': 'application/json',
      ...getCORSHeaders()
    }
  })
}

async function handleStatusRequest() {
  return new Response(JSON.stringify({
    status: 'healthy',
    version: '1.0.0',
    edge: true
  }), {
    headers: {
      'content-type': 'application/json',
      ...getCORSHeaders()
    }
  })
}

function handleCORS() {
  return new Response(null, {
    status: 204,
    headers: getCORSHeaders()
  })
}

function getCORSHeaders() {
  return {
    'Access-Control-Allow-Origin': '*',
    'Access-Control-Allow-Methods': 'GET, POST, PUT, DELETE, OPTIONS',
    'Access-Control-Allow-Headers': 'Content-Type, X-API-Key'
  }
}
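The hard-coded key list in validateAPIKey is only for demonstration. In a real deployment you would typically store keys in a KV namespace bound to the Worker. The following is a minimal sketch of that approach, assuming a hypothetical binding named API_KEYS_KV configured in wrangler.toml (not part of the example above):

async function validateAPIKey(apiKey) {
  // Look the key up in KV; any stored value means the key is valid.
  // API_KEYS_KV is a hypothetical KV binding used for illustration only.
  const record = await API_KEYS_KV.get(`key:${apiKey}`)
  return record !== null
}

Once deployed, the gateway can be exercised with a simple request, for example: curl -H 'X-API-Key: demo-key-123' https://<your-worker>.<your-subdomain>.workers.dev/api/status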
Advanced Pattern: Edge-Side Data Processing
One of the most powerful use cases for Cloudflare Workers is processing data at the edge before it reaches your origin servers. Here's an example that demonstrates request transformation and response caching:
addEventListener('fetch', event => {
  // Pass the event down so handlers can use event.waitUntil() for background work
  event.respondWith(handleRequest(event.request, event))
})

async function handleRequest(request, event) {
  const url = new URL(request.url)

  // Handle stock data requests with edge caching
  if (url.pathname.startsWith('/stock/')) {
    return handleStockRequest(request, url, event)
  }

  return fetch(request)
}
async function handleStockRequest(request, url, event) {
  const symbol = url.pathname.split('/')[2]

  if (!symbol) {
    return new Response('Symbol required', { status: 400 })
  }

  // The Cache API requires a full URL (or Request) as the cache key, not an
  // arbitrary string, so build one from the incoming URL plus a per-minute bucket
  const cacheUrl = new URL(request.url)
  cacheUrl.searchParams.set('bucket', Math.floor(Date.now() / 60000))
  const cacheKey = new Request(cacheUrl.toString())

  // Check cache first
  const cache = caches.default
  let response = await cache.match(cacheKey)

  if (!response) {
    // Fetch from origin API
    const apiResponse = await fetchStockData(symbol)

    if (apiResponse.ok) {
      // Transform and enhance the data at the edge
      const stockData = await apiResponse.json()
      const enhancedData = await enhanceStockData(stockData, request)

      response = new Response(JSON.stringify(enhancedData), {
        headers: {
          'content-type': 'application/json',
          'cache-control': 'public, max-age=60'
        }
      })

      // Cache the response in the background, after the client gets its reply
      event.waitUntil(cache.put(cacheKey, response.clone()))
    } else {
      response = new Response('Stock data unavailable', { status: 503 })
    }
  }

  return response
}
async function fetchStockData(symbol) {
  // This would integrate with a service like EOD Stock API.
  // STOCK_API_KEY is expected to be bound to the Worker as a secret.
  const apiUrl = `https://eod-stock-api.org/api/real-time/${symbol}`

  return fetch(apiUrl, {
    headers: {
      'Authorization': 'Bearer ' + STOCK_API_KEY
    }
  })
}
async function enhanceStockData(stockData, request) {
  // Add edge-computed fields
  return {
    ...stockData,
    edge_processed: true,
    user_timezone: request.cf.timezone,
    processing_location: request.cf.colo,
    enhanced_at: new Date().toISOString(),
    // Add computed technical indicators
    price_change_percent: calculatePriceChange(stockData),
    volatility_indicator: calculateVolatility(stockData)
  }
}

function calculatePriceChange(data) {
  if (data.previous_close && data.current_price) {
    return ((data.current_price - data.previous_close) / data.previous_close * 100).toFixed(2)
  }
  return null
}

function calculateVolatility(data) {
  // Simplified volatility calculation based on the day's trading range
  if (data.high && data.low && data.current_price) {
    const range = data.high - data.low
    const midpoint = (data.high + data.low) / 2
    return (range / midpoint * 100).toFixed(2)
  }
  return null
}
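The STOCK_API_KEY global referenced in fetchStockData is never defined in the code itself: in the Service Worker syntax, secrets appear as global variables once they are bound to the Worker. Assuming you have not created the secret yet, it can be added with Wrangler:

# Store the API key as an encrypted secret on the Worker
wrangler secret put STOCK_API_KEY

In the ES modules syntax the same secret would instead be read from env.STOCK_API_KEY inside the fetch handler.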
Working with Cloudflare KV Storage
For persistent data storage at the edge, Cloudflare Workers integrate seamlessly with KV (Key-Value) storage:
addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})

async function handleRequest(request) {
  const url = new URL(request.url)

  if (url.pathname === '/api/config') {
    return handleConfigRequest(request)
  }

  if (url.pathname.startsWith('/api/analytics/')) {
    return handleAnalyticsRequest(request, url)
  }

  return new Response('Not Found', { status: 404 })
}

async function handleConfigRequest(request) {
  if (request.method === 'GET') {
    // Retrieve configuration from KV
    const config = await CONFIG_KV.get('app-config', 'json')
    return new Response(JSON.stringify(config || {}), {
      headers: { 'content-type': 'application/json' }
    })
  }

  if (request.method === 'POST') {
    // Update configuration
    const newConfig = await request.json()
    await CONFIG_KV.put('app-config', JSON.stringify(newConfig))
    return new Response('Configuration updated', { status: 200 })
  }

  return new Response('Method not allowed', { status: 405 })
}
async function handleAnalyticsRequest(request, url) {
  // Path shape: /api/analytics/<event-name>
  const eventName = url.pathname.split('/')[3]

  if (request.method === 'POST') {
    // Record analytics event
    const eventData = await request.json()
    const timestamp = Date.now()
    const key = `analytics:${eventName}:${timestamp}`

    await ANALYTICS_KV.put(key, JSON.stringify({
      ...eventData,
      timestamp,
      user_agent: request.headers.get('user-agent'),
      country: request.cf.country,
      edge_location: request.cf.colo
    }))

    return new Response('Event recorded', { status: 201 })
  }

  return new Response('Method not allowed', { status: 405 })
}
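The CONFIG_KV and ANALYTICS_KV globals only exist because the corresponding KV namespaces are bound to the Worker. A minimal wrangler.toml sketch of those bindings (assuming Wrangler v2 or later; the project name, entry file, date, and namespace IDs are placeholders you would replace with your own) might look like this:

name = "edge-kv-demo"
main = "index.js"
compatibility_date = "2023-01-01"

kv_namespaces = [
  { binding = "CONFIG_KV", id = "<config-namespace-id>" },
  { binding = "ANALYTICS_KV", id = "<analytics-namespace-id>" }
]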
Deployment and Best Practices
Local Development Setup
Use Wrangler CLI for local development and deployment:
# Install Wrangler
npm install -g wrangler

# Create a new project
# (wrangler generate is deprecated in newer Wrangler versions;
#  npm create cloudflare@latest is the current scaffolding command)
wrangler generate my-worker

# Local development
wrangler dev

# Deploy to production
# (wrangler publish was renamed to wrangler deploy in Wrangler v3)
wrangler publish
Performance Optimization Tips
Minimize Cold Starts: Workers run in lightweight V8 isolates, so cold starts are already minimal; keep your code lightweight and avoid heavy initialization at global scope to preserve that advantage.
Leverage Caching: Use the Cache API aggressively to reduce origin requests and improve response times.
Optimize Bundle Size: Use tree shaking and avoid unnecessary dependencies to keep your Worker small.
Handle Errors Gracefully: Always provide fallback responses to ensure reliability; a minimal pattern is sketched below.
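One common way to apply that last tip is to wrap the main handler in a try/catch and fall back to the origin, then to a static error response. The sketch below is generic and not taken from the gateway example; handleRequestSafely is a name chosen here for illustration:

async function handleRequestSafely(request) {
  try {
    return await handleRequest(request)
  } catch (err) {
    // Optionally fall back to the origin before giving up entirely
    try {
      return await fetch(request)
    } catch (originErr) {
      return new Response('Service temporarily unavailable', { status: 503 })
    }
  }
}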
Security Considerations
// Input validation example
function validateInput(data) {
  if (typeof data !== 'object' || data === null) {
    throw new Error('Invalid input format')
  }

  // Strip <script> tags from string inputs.
  // Note: this regex is illustrative only; real sanitization should rely on
  // context-aware escaping or a vetted library rather than a blocklist.
  Object.keys(data).forEach(key => {
    if (typeof data[key] === 'string') {
      data[key] = data[key].replace(/<script\b[^<]*(?:(?!<\/script>)<[^<]*)*<\/script>/gi, '')
    }
  })

  return data
}
// Rate limiting with KV
async function checkRateLimit(clientIP) {
  const key = `rate_limit:${clientIP}`
  const current = await RATE_LIMIT_KV.get(key)

  if (current && parseInt(current) > 100) {
    return false // Rate limit exceeded
  }

  // KV values must be strings (or streams/ArrayBuffers), so stringify the counter.
  // Note: this read-then-write is not atomic and each put refreshes the TTL,
  // which is fine for coarse limits but not for strict quotas.
  await RATE_LIMIT_KV.put(key, String((parseInt(current) || 0) + 1), { expirationTtl: 3600 })
  return true
}
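The checkRateLimit helper needs the caller's IP address, which Cloudflare provides on every request via the CF-Connecting-IP header. A small sketch of how the two helpers above could be wired into a handler (secureHandler is a name chosen here for illustration):

async function secureHandler(request) {
  // Cloudflare sets CF-Connecting-IP to the client's IP on every request
  const clientIP = request.headers.get('CF-Connecting-IP') || 'unknown'

  if (!await checkRateLimit(clientIP)) {
    return new Response('Too Many Requests', { status: 429 })
  }

  let body
  try {
    body = validateInput(await request.json())
  } catch (err) {
    return new Response('Invalid request body', { status: 400 })
  }

  return new Response(JSON.stringify({ received: body }), {
    headers: { 'content-type': 'application/json' }
  })
}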
Real-World Applications
Cloudflare Workers excel in several key scenarios:
API Gateways: Centralize authentication, rate limiting, and request routing at the edge.
Data Transformation: Process and enhance API responses before they reach clients.
A/B Testing: Implement feature flags and experiments without touching your main application (see the sketch after this list).
Security Filtering: Block malicious requests and implement custom security rules.
Performance Optimization: Cache dynamic content and optimize resource delivery.
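As an illustration of the A/B testing point above, a Worker can assign each visitor to a variant with a cookie and route them accordingly. This is a generic sketch, not tied to any earlier example; the cookie name, variant paths, and split ratio are placeholders:

async function handleABTest(request) {
  const COOKIE_NAME = 'ab-variant'
  const cookies = request.headers.get('Cookie') || ''

  // Reuse an existing assignment if the visitor already has one
  let variant = cookies.includes(`${COOKIE_NAME}=b`) ? 'b'
    : cookies.includes(`${COOKIE_NAME}=a`) ? 'a'
    : null

  const isNewAssignment = variant === null
  if (isNewAssignment) {
    // Assign 50/50 on first visit
    variant = Math.random() < 0.5 ? 'a' : 'b'
  }

  // Rewrite the path to the chosen variant before hitting the origin
  const url = new URL(request.url)
  url.pathname = `/${variant}${url.pathname}`
  const response = await fetch(new Request(url.toString(), request))

  if (isNewAssignment) {
    // Persist the assignment so the visitor keeps seeing the same variant
    const newResponse = new Response(response.body, response)
    newResponse.headers.append('Set-Cookie', `${COOKIE_NAME}=${variant}; Path=/; Max-Age=86400`)
    return newResponse
  }

  return response
}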
At Custom Logic, we leverage Cloudflare Workers extensively in our client solutions, particularly for API optimization and edge-side data processing. Our EOD Stock API implementation demonstrates how Workers can transform financial data delivery, providing real-time enhancements and regional optimizations that significantly improve user experience.
Conclusion
Cloudflare Workers represent a paradigm shift in how we think about web application architecture. By moving computation to the edge, we can achieve unprecedented performance improvements while maintaining the simplicity and cost-effectiveness of serverless computing.
The examples in this guide demonstrate just a fraction of what's possible with Workers. As you explore this technology, consider how edge computing can solve your specific performance and scalability challenges.
Whether you're building API gateways, implementing real-time data processing, or optimizing content delivery, Cloudflare Workers provide the tools and global infrastructure to make your applications faster and more reliable.
Ready to implement edge computing in your next project? Contact Custom Logic to learn how we can help you leverage Cloudflare Workers and other cutting-edge technologies to build high-performance, globally distributed applications.