Supabase Edge vs Workers vs Vercel Functions 2026
Serverless functions have fragmented into three distinct camps in 2026, each optimized for different priorities. Supabase Edge Functions are co-located with your database. Cloudflare Workers sit at 300+ edge locations with V8 isolates and zero cold starts. Vercel Functions live inside your Next.js deployment with the tightest framework integration on the market.
Pick the wrong one and you're either paying 10x more than you need to, fighting cold start latency, or wrestling with runtime incompatibilities. This comparison covers what matters: cold starts, pricing at real scale, runtime limits, and when each is the right tool.
Architecture Overview
Supabase Edge Functions
Supabase runs edge functions on Deno Deploy. This is a significant architectural choice: your functions run in the Deno runtime, not Node.js, which means no node_modules and a slightly different API surface. The payoff is deep integration with your Supabase stack — your functions run with automatic Supabase auth context, can query Postgres directly with low latency, and share the same environment variables as your database.
The co-location matters. When your function needs to JOIN tables, run a transaction, or read from a Postgres trigger, being on the same infrastructure eliminates the round-trip that other edge platforms incur.
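That round-trip arithmetic is worth making explicit. A back-of-envelope sketch, where the 2ms and 60ms RTT figures are illustrative assumptions rather than measurements:

```typescript
// Back-of-envelope model: each sequential query pays the function→DB round trip.
// rttMs and queryMs are illustrative assumptions, not benchmarks.
function dbTimeMs(queries: number, rttMs: number, queryMs: number): number {
  return queries * (rttMs + queryMs);
}

// 3 sequential queries at 5ms of query time each:
console.log(dbTimeMs(3, 2, 5));  // co-located (~2ms RTT): 21ms
console.log(dbTimeMs(3, 60, 5)); // cross-region (~60ms RTT): 195ms
```

The gap widens linearly with every dependent query, which is why a function doing multi-step Postgres work benefits far more from co-location than a function making a single call.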
Cloudflare Workers
Cloudflare Workers run in V8 isolates — not containers, not VMs. This is what enables their headline feature: zero cold starts. Traditional serverless functions (Lambda, early Vercel Functions) can take 100ms-2s to cold-start a container. V8 isolates start in under 5ms, every time.
Workers run at 300+ edge locations globally, which means your code executes within milliseconds of users in Tokyo, São Paulo, and Johannesburg — not just US-East.
The runtime is JavaScript/TypeScript with a Web Standards API (fetch, Request, Response, crypto). It is not Node.js: the runtime was designed around Web APIs from the start, though the nodejs_compat flag adds significant coverage for common packages.
Free tier: 100,000 requests/day, no credit card required. Paid ($5/month): 10 million requests/month included, then $0.30/million.
Vercel Functions
Vercel Functions run on Cloudflare's infrastructure at the edge, or on AWS Lambda in specific regions for serverless (non-edge) mode. The key differentiator is the framework integration: in a Next.js app, you write export async function GET(request: Request) in a file under app/api/ and Vercel handles everything else.
Cold starts exist in the serverless (Lambda) runtime — typically 200ms-800ms for Node.js. The Edge Runtime eliminates cold starts but restricts you to Web Standards APIs (same restriction as raw Cloudflare Workers).
Pricing is consumption-based at $0.40 per million executions on Pro, with a generous free tier on Hobby plans. The cost structure is competitive with AWS but more expensive than Cloudflare Workers at volume.
Comparison Table
| Feature | Supabase Edge | Cloudflare Workers | Vercel Functions |
|---|---|---|---|
| Runtime | Deno | V8 isolate (JS/TS) | Node.js or Edge (Web APIs) |
| Cold starts | ~100ms | 0ms (V8 isolate) | 0ms (edge) / 200-800ms (Node) |
| Edge locations | Deno Deploy (~35) | 300+ | ~70 (Edge), regional (Node) |
| Free tier | 500k invocations/mo | 100k req/day | 100k req/mo (Hobby) |
| Paid pricing | $25/month (Pro plan) | $5/month + $0.30/M req | $0.40/M executions (Pro) |
| Max execution time | 400ms CPU (default) | 30s CPU (paid), no wall-clock cap | 25s (Edge), 300s (Node) |
| Secrets management | Supabase Vault / env vars | Workers Secrets | Vercel env vars |
| Database access | Native Postgres (low latency) | Via HTTP/D1/Hyperdrive | HTTP (Edge) or TCP drivers (Node) |
| Node.js compatibility | No (Deno) | Partial (nodejs_compat) | Full (Node runtime) |
| Next.js integration | Manual | Manual | Native |
Cold Start Reality Check
Cold start numbers in documentation rarely match production. Here's what teams report in 2026:
Cloudflare Workers: Sub-5ms consistently. The V8 isolate model means there's no container to spin up — your code runs directly in an already-warm V8 context. This is the only platform where cold starts are genuinely not a concern.
Vercel Edge Functions: 0-10ms cold starts on the Edge Runtime. On par with Cloudflare Workers because Vercel's edge uses the same underlying infrastructure.
Vercel Serverless Functions (Node.js): 150-600ms for typical Node.js functions. Improves with pruned dependencies and smaller bundles. Teams report p99 around 800ms for larger function bundles.
Supabase Edge Functions: 50-200ms depending on Deno Deploy region proximity. Not zero, not terrible. The benefit isn't cold start performance — it's the database round-trip you don't have.
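If you want your own numbers rather than anecdotes, a rough probe is easy to write. A sketch (the URL and sample count are placeholders, and true cold-start isolation also requires idle gaps between requests so the platform actually evicts the warm instance):

```typescript
// Rough latency probe: repeatedly time an endpoint and report percentiles.
// This measures total request latency; attribute it to cold starts only if
// requests are spaced far enough apart for instances to go cold in between.
async function probe(url: string, samples: number): Promise<number[]> {
  const times: number[] = [];
  for (let i = 0; i < samples; i++) {
    const start = performance.now();
    await fetch(url);
    times.push(performance.now() - start);
  }
  return times.sort((a, b) => a - b);
}

// Nearest-rank percentile over an ascending-sorted array.
function percentile(sorted: number[], p: number): number {
  const idx = Math.min(sorted.length - 1, Math.floor((p / 100) * sorted.length));
  return sorted[idx];
}

// Usage sketch:
// const times = await probe("https://example.com/api/health", 20);
// console.log("p50:", percentile(times, 50), "p99:", percentile(times, 99));
```

Comparing p50 against p99 is the quickest way to see whether a platform's tail is dominated by cold starts or by something else (database latency, network distance).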
Pricing at Scale
Let's price out a real workload: an API handling 50 million requests/month.
Cloudflare Workers (Paid plan):
- Base: $5/month
- 50M - 10M included = 40M overage × $0.30/M = $12
- Total: ~$17/month
Vercel Functions (Pro):
- Included: 1M/month, then $0.40/M
- 50M - 1M = 49M × $0.40/M = $19.60
- Total: ~$20/month (plus Pro plan at $20/month for the team)
Supabase Edge Functions:
- Pro plan: $25/month includes 2M invocations
- Additional: $2 per additional million
- 50M - 2M = 48M × $2/M = $96
- Total: ~$121/month
For pure volume, Cloudflare Workers wins decisively. Supabase's pricing is designed for functions that do heavy database work per invocation (where the per-call cost is justified by avoiding separate DB server costs), not high-frequency lightweight calls.
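The three bills above all follow the same shape (base fee plus overage), so they are easy to model. A sketch using the plan numbers quoted in this section — treat every parameter as an assumption copied from this article that can drift from the providers' live pricing pages:

```typescript
// Monthly-cost model for the three plans, using the numbers quoted above.
// All plan parameters are assumptions from this article, not live pricing.
type Plan = { base: number; includedM: number; perExtraM: number };

const PLANS: Record<string, Plan> = {
  workers: { base: 5, includedM: 10, perExtraM: 0.3 },   // Workers paid plan
  vercelPro: { base: 20, includedM: 1, perExtraM: 0.4 }, // includes the $20 Pro seat
  supabasePro: { base: 25, includedM: 2, perExtraM: 2 }, // Supabase Pro
};

function monthlyCost(plan: Plan, requests: number): number {
  const millions = requests / 1_000_000;
  const overageM = Math.max(0, millions - plan.includedM);
  return plan.base + overageM * plan.perExtraM;
}

// At 50M requests/month:
// workers     → $17
// vercelPro   → $39.60 (the ~$20 of usage plus the $20 seat)
// supabasePro → $121
```

Plugging in your own traffic forecast makes the crossover points obvious: Supabase's per-million rate dominates its bill well before 10M requests/month, while the Workers base fee is noise at any volume.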
Code Examples: Same Function in All Three
Let's implement a simple API endpoint that validates a user token and returns their subscription tier.
Supabase Edge Function
// supabase/functions/subscription-tier/index.ts
import { createClient } from 'https://esm.sh/@supabase/supabase-js@2';
Deno.serve(async (req: Request) => {
const authHeader = req.headers.get('Authorization');
if (!authHeader) {
return new Response(JSON.stringify({ error: 'Unauthorized' }), {
status: 401,
headers: { 'Content-Type': 'application/json' },
});
}
const supabase = createClient(
Deno.env.get('SUPABASE_URL')!,
Deno.env.get('SUPABASE_ANON_KEY')!,
{ global: { headers: { Authorization: authHeader } } }
);
const { data: { user }, error } = await supabase.auth.getUser();
if (error || !user) {
return new Response(JSON.stringify({ error: 'Invalid token' }), {
status: 401,
headers: { 'Content-Type': 'application/json' },
});
}
// Direct Postgres query — no network hop
const { data: subscription } = await supabase
.from('subscriptions')
.select('tier, expires_at')
.eq('user_id', user.id)
.single();
return new Response(JSON.stringify({ tier: subscription?.tier ?? 'free' }), {
headers: { 'Content-Type': 'application/json' },
});
});
The Supabase version is the shortest. Auth context flows through automatically, and the Postgres query runs on co-located infrastructure.
Cloudflare Worker
// src/worker.ts
export default {
async fetch(request: Request, env: Env): Promise<Response> {
const authHeader = request.headers.get('Authorization');
if (!authHeader?.startsWith('Bearer ')) {
return Response.json({ error: 'Unauthorized' }, { status: 401 });
}
const token = authHeader.slice(7);
// Verify JWT against Supabase (or your auth provider)
const userResponse = await fetch(`${env.SUPABASE_URL}/auth/v1/user`, {
headers: {
Authorization: `Bearer ${token}`,
apikey: env.SUPABASE_ANON_KEY,
},
});
if (!userResponse.ok) {
return Response.json({ error: 'Invalid token' }, { status: 401 });
}
const user = await userResponse.json();
// Query your database via Hyperdrive or D1
const result = await env.DB.prepare(
'SELECT tier FROM subscriptions WHERE user_id = ? LIMIT 1'
).bind(user.id).first();
return Response.json({ tier: result?.tier ?? 'free' });
},
};
interface Env {
SUPABASE_URL: string;
SUPABASE_ANON_KEY: string;
DB: D1Database;
}
Cloudflare Workers require manual auth verification and an external database call (via D1 or Hyperdrive). More setup, but the zero cold starts and global edge distribution are unmatched.
Vercel Function (Next.js App Router)
// app/api/subscription-tier/route.ts
import { createClient } from '@supabase/supabase-js';
import { NextRequest, NextResponse } from 'next/server';
export const runtime = 'edge'; // Remove for Node.js runtime
export async function GET(request: NextRequest) {
const authHeader = request.headers.get('authorization');
if (!authHeader?.startsWith('Bearer ')) {
return NextResponse.json({ error: 'Unauthorized' }, { status: 401 });
}
const supabase = createClient(
process.env.SUPABASE_URL!,
process.env.SUPABASE_ANON_KEY!,
{ global: { headers: { Authorization: authHeader } } }
);
const { data: { user }, error } = await supabase.auth.getUser();
if (error || !user) {
return NextResponse.json({ error: 'Invalid token' }, { status: 401 });
}
const { data: subscription } = await supabase
.from('subscriptions')
.select('tier')
.eq('user_id', user.id)
.single();
return NextResponse.json({ tier: subscription?.tier ?? 'free' });
}
The Vercel version is the most familiar for Next.js developers. The export const runtime = 'edge' line switches between the cold-start-free Edge Runtime and full Node.js compatibility.
Secrets Management
Supabase Edge Functions use supabase secrets set KEY=VALUE. Secrets are encrypted and exposed to your function via Deno.env.get(). The Supabase dashboard also provides a UI for managing them.
Cloudflare Workers use wrangler secret put KEY. At runtime, secrets are available on the env object passed to your handler, alongside any plain-text vars declared in wrangler.toml (secrets themselves are never stored in the config file). Cloudflare also offers secrets management in the Workers dashboard.
Vercel Functions use Vercel's environment variable system — set in the dashboard or via CLI, available as process.env.KEY. The system distinguishes between Production, Preview, and Development environments, which is genuinely useful for managing multiple deployment stages.
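Whichever platform you pick, the failure mode is the same: a missing secret discovered mid-request. A small platform-agnostic guard, sketched here as an illustration (pass process.env on Vercel/Node, Deno.env.toObject() on Supabase, or the env binding object on Workers):

```typescript
// Fail fast on missing secrets at startup instead of mid-request.
// Illustrative helper; the name and shape are this article's invention.
function requireSecrets<K extends string>(
  source: Record<string, string | undefined>,
  keys: K[]
): Record<K, string> {
  const out = {} as Record<K, string>;
  const missing: string[] = [];
  for (const key of keys) {
    const value = source[key];
    if (value === undefined || value === "") missing.push(key);
    else out[key] = value;
  }
  if (missing.length > 0) {
    throw new Error(`Missing secrets: ${missing.join(", ")}`);
  }
  return out;
}

// Usage sketch (Node/Vercel):
// const { STRIPE_SECRET_KEY } = requireSecrets(process.env, ["STRIPE_SECRET_KEY"]);
```

Calling this once at module load turns a confusing 500 on the first request into an immediate, named deployment error.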
Runtime Constraints in Detail
Understanding the runtime limits isn't just a technical detail — it determines what you can and cannot build on each platform.
Supabase Edge Functions
Supabase Edge Functions run in the Deno runtime, which deliberately diverges from Node.js. The implications for developers:
The good news: Deno has full Web Standards API support (fetch, WebSocket, crypto, URL), TypeScript support out of the box, and a permissions model that's more secure than Node.js. For functions that call external APIs, process webhooks, or run business logic, the Deno environment is capable and pleasant.
The friction: npm packages aren't consumed the Node way. You import from URLs (esm.sh provides npm-to-ESM conversion), use npm: specifiers, or pull from Deno's own registry. The Supabase client library has Deno-compatible imports. But if your function needs a complex npm dependency tree, say a PDF processing library or an image manipulation library with native bindings, you'll hit walls.
The default CPU time limit is 400ms (CPU time, not wall clock), though this can be increased. For CPU-intensive operations, this is tight. For I/O-bound functions (the majority of web API work), it's fine.
Memory limit: 512MB per function invocation. Sufficient for most use cases.
Cloudflare Workers
Workers run in V8 isolates with deliberately tight resource constraints by design:
- CPU time: 10ms on free tier, 30 seconds on paid. Note that this is CPU time, not wall-clock time. A Worker can spend minutes waiting for a fetch response, but can only actively compute for 30 seconds.
- Memory: 128MB per isolate
- Script size: 10MB compressed on the paid plan, including dependencies
- Subrequests: 50 outbound fetch calls per invocation on the free tier (1,000 on paid)
These constraints push Workers toward lightweight, I/O-focused functions. For heavy computation, preprocessing, or large dependency bundles, you'll need to architect carefully. The nodejs_compat compatibility flag lets you use many Node.js APIs, but the script size limit means you cannot simply bundle a large Node.js application.
The paid plan's 30-second CPU time limit is much more workable. Most production Workers use a tiny fraction of this — 30 seconds of CPU would represent a very computation-heavy function.
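The subrequest cap in particular shapes fan-out designs: when one request needs more outbound calls than the cap allows, the work has to be split across invocations, for example one Cloudflare Queues message per batch. A hedged helper for computing those batches (the constant mirrors the free-tier limit discussed above):

```typescript
// Split a fan-out workload into batches that each fit under the
// per-invocation subrequest cap; each batch is then handled by its own
// invocation (e.g., one queue message per batch). Illustrative sketch.
const MAX_SUBREQUESTS = 50; // free-tier cap from the list above

function batches<T>(items: T[], size: number = MAX_SUBREQUESTS): T[][] {
  if (size <= 0) throw new Error("batch size must be positive");
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}
```

In practice you would reserve a subrequest or two per invocation for the database write or queue acknowledgment, so a real batch size sits slightly under the cap.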
Vercel Functions
Vercel offers the most flexibility with two distinct runtimes:
The Edge Runtime has the same V8 isolate model as Cloudflare Workers, with the same Web Standards API surface. Maximum execution time is 25 seconds, memory is 128MB. The zero cold-start benefit applies here.
The Node.js Runtime (serverless, not edge) runs in AWS Lambda containers. You get full Node.js compatibility: any npm package, any native module (that Lambda supports), up to 300 seconds execution time, and up to 3GB memory. The cost is cold starts — 150-800ms when the Lambda container needs to spin up.
For most Next.js API routes, the Node.js runtime is the default and appropriate choice. Switch to the Edge Runtime only when you specifically need zero cold starts and can accept the Web API restrictions.
Deployment Workflow Comparison
The developer workflow for each platform differs substantially, and for teams working quickly, the deployment iteration loop matters.
Supabase Edge Functions
# Install Supabase CLI
npm install -g supabase
# Serve locally (with hot reload)
supabase functions serve subscription-tier --env-file .env.local
# Deploy to Supabase
supabase functions deploy subscription-tier
# Set secrets
supabase secrets set STRIPE_WEBHOOK_SECRET=whsec_...
Local development uses Deno under the hood. The supabase functions serve command runs functions locally with access to your local Supabase instance. The iteration loop is fast.
The limitation: you can't run the full Supabase Edge Function stack in a standard CI/CD pipeline without the Supabase CLI and a linked project. Teams using GitOps need to integrate the Supabase CLI into their pipeline.
Cloudflare Workers
# Install Wrangler CLI
npm install -g wrangler
# Develop locally (no internet required)
wrangler dev
# Deploy
wrangler deploy
# Set secrets
wrangler secret put STRIPE_SECRET_KEY
Wrangler's local development mode is excellent: it simulates the V8 isolate environment, supports hot reload, and can connect to remote Cloudflare services (KV, D1, R2) or local simulations. Most Cloudflare Workers can be developed and tested entirely offline.
Cloudflare's GitHub Actions integration makes CI/CD straightforward: push to main, and a workflow calls wrangler deploy automatically.
Vercel Functions
Vercel's workflow is the most integrated for Next.js:
# Local development (functions run automatically)
vercel dev
# Deploy
vercel --prod
# Environment variables managed in dashboard or CLI
vercel env add STRIPE_SECRET_KEY
The killer feature: vercel dev runs your entire Next.js application — frontend, API routes, and edge functions — exactly as they'll run in production, including environment variable injection and middleware execution. This local parity with production is Vercel's strongest developer experience advantage.
Preview deployments are automatic: every git branch gets a unique URL with its own function deployments. This makes testing isolated changes trivial.
Real-World Architecture Patterns
Pattern 1: Supabase + Supabase Edge Functions (Tight Integration)
The most common Supabase pattern: use Edge Functions exclusively for operations that need to combine auth context with database access. Webhooks from payment processors (Stripe), third-party API calls that need database writes, and scheduled cleanup jobs.
For everything else — static API routes, simple data fetching — use Supabase's built-in REST API and RLS (Row Level Security) policies. You often don't need a function at all.
Pattern 2: Cloudflare Workers + D1 (Global Edge Database)
Cloudflare D1 is a SQLite-based database designed to pair with Workers. For read-heavy APIs with global users, the Workers + D1 combination is very hard to beat on latency: zero cold starts, globally distributed execution, and (with read replication enabled) SQLite reads served close to the request.
The constraint: D1 is SQLite, not Postgres. For complex relational data or heavy write workloads, it has limits. But for API responses that can tolerate eventual consistency or read-through caching, this pattern is extremely fast and cost-effective.
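For the read-through-caching variant of this pattern, a per-isolate TTL cache is often enough. A sketch, with an injectable clock purely to make it testable (illustrative only: Workers isolates are recycled at any time, so treat this as a best-effort cache, never a source of truth):

```typescript
// Minimal per-isolate TTL cache for hot reads. Best-effort only: the
// isolate (and this Map) can be evicted between requests at any time.
function ttlCache<V>(ttlMs: number, now: () => number = Date.now) {
  const store = new Map<string, { value: V; expires: number }>();
  return {
    get(key: string): V | undefined {
      const entry = store.get(key);
      if (!entry || entry.expires <= now()) {
        store.delete(key); // drop expired entries lazily
        return undefined;
      }
      return entry.value;
    },
    set(key: string, value: V): void {
      store.set(key, { value, expires: now() + ttlMs });
    },
  };
}
```

A Worker would consult this cache before the D1 query and populate it after, absorbing repeated reads of the same key within the TTL window at zero subrequest cost.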
Pattern 3: Vercel Functions + Separate Database (Next.js Default)
The majority of Next.js applications on Vercel use serverless functions with an external database (Supabase, Neon, PlanetScale, or similar). The functions run on Lambda with full Node.js support, the database is elsewhere, and Vercel handles deployment and scaling.
This is the highest-friction path — cold starts, external database latency — but it gives you maximum flexibility in your database choice and the full Node.js ecosystem.
When to Choose Each
Choose Supabase Edge Functions when your functions are tightly coupled to your Supabase database. If 80% of what your function does is query or mutate Postgres, the co-location advantage and simplified auth context pay off. Avoid it for high-frequency lightweight operations — the per-invocation pricing stings at volume.
Choose Cloudflare Workers when latency and global distribution are paramount. Zero cold starts, 300+ edge locations, and the lowest cost per million requests make Workers the best pure-performance choice. The Node.js compatibility friction is real but manageable with the nodejs_compat flag.
Choose Vercel Functions when you're building a Next.js application and want zero friction. The app/api/ convention, shared environment variables with your frontend, and preview deployments that include your functions make the developer experience unbeatable for teams already on Vercel. Just be deliberate about choosing the Edge Runtime vs. Node.js runtime — the trade-offs are significant.
See also: Best Serverless Function Platforms 2026, Cloudflare Workers vs Vercel Edge vs Lambda Edge, Supabase vs Firebase Developer Comparison