Edge Computing Explained for Developers (2026)
Edge computing moves your code closer to your users. Instead of one server in Virginia, your code runs in 300+ locations worldwide. Here's what it means for web developers in 2026.
What is Edge Computing?
Traditional: User in Tokyo → request travels to server in US-East → response travels back. Round trip: ~200ms.
Edge: User in Tokyo → request hits edge server in Tokyo → response from Tokyo. Round trip: ~20ms.
Edge computing runs your code in data centers close to users, reducing latency by 80-90%.
Edge vs Serverless vs Traditional
| | Traditional | Serverless | Edge |
|---|---|---|---|
| Location | 1-3 regions | 1-3 regions | 300+ locations |
| Cold start | N/A (always running) | 200-500ms | 0-5ms |
| Runtime | Full Node.js | Full Node.js | Limited (V8 isolates) |
| Max execution | Unlimited | 15 min (AWS) | 30s-5 min |
| Cost | Fixed | Per-invocation | Per-invocation |
| Latency | 50-200ms | 50-500ms | 5-30ms |
Edge Platforms
| Platform | Runtime | Locations | Free Tier |
|---|---|---|---|
| Cloudflare Workers | V8 isolates | 300+ | 100K req/day |
| Vercel Edge Functions | V8 isolates | 100+ | Part of Vercel plan |
| Deno Deploy | Deno runtime | 35+ | 100K req/day |
| Fastly Compute | Wasm | 90+ | Limited |
| AWS CloudFront Functions | JavaScript | 400+ | 2M invocations/mo |
Hello World on the Edge
Cloudflare Workers
```typescript
export default {
  async fetch(request: Request): Promise<Response> {
    return new Response('Hello from the edge!', {
      headers: { 'content-type': 'text/plain' },
    })
  },
}
```
Vercel Edge Functions
```typescript
// app/api/hello/route.ts
export const runtime = 'edge'

export async function GET() {
  return new Response('Hello from the edge!')
}
```
Deno Deploy
```typescript
Deno.serve(() => new Response('Hello from the edge!'))
```
What You CAN Do on the Edge
1. API Responses
```typescript
// `db` is a placeholder for an edge-compatible (HTTP-based) database client.
declare const db: { query(sql: string, params: unknown[]): Promise<unknown> }

export default {
  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url)
    if (url.pathname === '/api/user') {
      const userId = url.searchParams.get('id')
      const user = await db.query('SELECT * FROM users WHERE id = ?', [userId])
      return Response.json(user)
    }
    return new Response('Not found', { status: 404 })
  },
}
```
2. Authentication and Authorization
```typescript
// Check the JWT at the edge — reject before the request ever hits the origin.
// verifyJWT is a placeholder (e.g. jwtVerify from the `jose` package).
declare function verifyJWT(token: string): Promise<boolean>

export default {
  async fetch(request: Request): Promise<Response> {
    const token = request.headers.get('Authorization')?.replace('Bearer ', '')
    if (!token || !(await verifyJWT(token))) {
      return new Response('Unauthorized', { status: 401 })
    }
    return fetch(request) // Forward to origin
  },
}
```
3. A/B Testing
// Split traffic at the edge
export default {
async fetch(request: Request) {
const bucket = Math.random() < 0.5 ? 'a' : 'b'
const url = new URL(request.url)
url.pathname = `/${bucket}${url.pathname}`
return fetch(url.toString(), {
headers: { ...request.headers, 'x-experiment': bucket },
})
},
}
4. Geolocation-Based Routing
```typescript
// CF-IPCountry is set by Cloudflare; other platforms expose geo differently.
export default {
  async fetch(request: Request): Promise<Response> {
    const country = request.headers.get('CF-IPCountry')
    if (country === 'DE') {
      return Response.redirect('https://de.example.com', 302)
    }
    return fetch(request)
  },
}
```
5. Rate Limiting
```typescript
import { Ratelimit } from '@upstash/ratelimit'
import { Redis } from '@upstash/redis'

// fromEnv() expects UPSTASH_REDIS_REST_URL and UPSTASH_REDIS_REST_TOKEN.
const ratelimit = new Ratelimit({
  redis: Redis.fromEnv(),
  limiter: Ratelimit.slidingWindow(100, '60 s'),
})

export default {
  async fetch(request: Request): Promise<Response> {
    const ip = request.headers.get('CF-Connecting-IP') ?? 'unknown'
    const { success } = await ratelimit.limit(ip)
    if (!success) {
      return new Response('Rate limited', { status: 429 })
    }
    return fetch(request)
  },
}
```
What You CAN'T Do on the Edge (Limitations)
Limited Runtime
Edge functions run V8 isolates, not full Node.js. Missing:
- `fs` (no file system)
- Most Node.js built-in modules
- Native npm packages (compiled C/C++ addons)
- Long-running processes
Execution Time Limits
- Cloudflare Workers: 30 seconds (paid), 10ms CPU time (free)
- Vercel Edge: 25 seconds to start streaming a response
- Not suitable for heavy computation
Database Connections
Traditional TCP database connections don't work. Use:
- HTTP-based databases (Neon, Turso, PlanetScale)
- Edge-native databases (Cloudflare D1, Turso)
- Key-value stores (Cloudflare KV, Upstash Redis)
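Because edge runtimes can't open TCP sockets, these databases are reached over HTTP instead. A minimal sketch of the pattern, assuming a hypothetical JSON-over-HTTP query endpoint (real drivers from Neon, Turso, and PlanetScale wrap exactly this kind of call for you):

```typescript
// Sketch: query an HTTP-speaking database from the edge. The endpoint URL and
// request shape here are hypothetical; real drivers hide these details.
export async function queryUsers(dbUrl: string, id: number): Promise<unknown> {
  const res = await fetch(dbUrl, {
    method: 'POST',
    headers: { 'content-type': 'application/json' },
    // Parameterized query sent as JSON — no TCP connection required.
    body: JSON.stringify({ sql: 'SELECT * FROM users WHERE id = ?', params: [id] }),
  })
  if (!res.ok) throw new Error(`query failed: ${res.status}`)
  return res.json()
}
```

Since it's plain `fetch`, this works in every edge runtime listed above.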
Package Compatibility
Some npm packages don't work on the edge — typically anything that depends on Node.js built-ins or native addons. Check compatibility before committing to the edge runtime.
When to Use Edge
✅ Use Edge When
- Latency matters (every ms counts)
- Simple request/response processing
- Authentication/authorization checks
- Geolocation-based logic
- A/B testing and feature flags
- Rate limiting
- Serving personalized content
- API responses from edge-compatible databases
❌ Don't Use Edge When
- Heavy computation (image processing, ML inference)
- Need full Node.js runtime
- Complex database transactions
- Long-running processes
- Need native npm packages
The Middleware Pattern
The most common edge pattern in Next.js:
```typescript
// middleware.ts (runs on every request, at the edge)
import { NextResponse } from 'next/server'
import type { NextRequest } from 'next/server'

export function middleware(request: NextRequest) {
  // Auth check
  const token = request.cookies.get('session')
  if (!token && request.nextUrl.pathname.startsWith('/dashboard')) {
    return NextResponse.redirect(new URL('/sign-in', request.url))
  }
  // Add headers (request.geo was removed in Next.js 15; on Vercel,
  // the country arrives as the x-vercel-ip-country request header)
  const response = NextResponse.next()
  response.headers.set('x-country', request.headers.get('x-vercel-ip-country') ?? 'unknown')
  return response
}
```
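Middleware runs on every request by default, including static assets. If you only need it on a few routes, Next.js lets you scope it with a `matcher` export (the paths below are illustrative):

```typescript
// middleware.ts — limit middleware to the routes that need it
export const config = {
  matcher: ['/dashboard/:path*', '/api/:path*'],
}
```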
Edge + Traditional (Hybrid)
Best pattern: edge for fast decisions, traditional for heavy processing.
User Request → Edge (auth, routing, caching) → Origin Server (business logic, DB writes)
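That split can be sketched as an edge handler that answers cached GETs immediately and forwards everything else to the origin. The `EdgeCache` interface below is a simplified stand-in for the Workers `caches.default` API, injected so the logic stays testable:

```typescript
// Simplified stand-in for an edge cache (e.g. Workers caches.default).
interface EdgeCache {
  match(key: string): Promise<Response | undefined>
  put(key: string, res: Response): Promise<void>
}

export async function handle(request: Request, cache: EdgeCache): Promise<Response> {
  const key = request.url
  // Fast decision at the edge: serve cached GETs without touching the origin.
  if (request.method === 'GET') {
    const cached = await cache.match(key)
    if (cached) return cached
  }
  // Heavy lifting at the origin: business logic, DB writes.
  const response = await fetch(request)
  if (request.method === 'GET' && response.ok) {
    // Clone before caching so the original body can still be read once.
    await cache.put(key, response.clone())
  }
  return response
}
```

Writes (POST, PUT, DELETE) always pass through, so the origin stays the source of truth.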
FAQ
Is edge computing always faster?
For reads and simple logic, yes. For writes and complex operations, the edge adds a hop to your origin database, which can be slower.
Do I need to rewrite my app for the edge?
No. Use edge selectively — middleware, specific API routes, and A/B testing. Keep heavy logic in traditional serverless or server functions.
What about cold starts?
Edge functions have near-zero cold starts (0-5ms) compared to serverless (200-500ms). This is edge's biggest advantage.
Cloudflare Workers vs Vercel Edge Functions?
Workers: more features (KV, D1, R2, Queues, Durable Objects). Vercel Edge: better Next.js integration, simpler. Use Workers for standalone edge apps, Vercel Edge for Next.js.
Bottom Line
Edge computing in 2026: use it for authentication, routing, rate limiting, and fast API responses. Don't use it for heavy computation or complex database operations. The hybrid pattern (edge for fast decisions, origin for heavy lifting) gives you the best of both worlds. Start with Next.js middleware — it's the easiest entry point.