How to Upload Files to Cloudflare R2 from Node.js
APIScout Team

Tags: cloudflare r2 · object storage · file upload · tutorial · api integration
Cloudflare R2 is S3-compatible object storage with zero egress fees. You use the same AWS SDK you already know, but pay nothing for data transfer out. This guide covers uploads, downloads, presigned URLs, and the migration path from S3.
What You'll Build
- File upload and download (images, PDFs, any file type)
- Presigned URLs for direct browser uploads
- Public bucket with custom domain
- File listing and deletion
- Multipart uploads for large files
Prerequisites: Node.js 18+, Cloudflare account (R2 free tier: 10GB storage, 10M reads/month).
1. Setup
Create R2 Bucket
- Go to Cloudflare Dashboard → R2 Object Storage
- Click "Create bucket"
- Name it (e.g., `my-app-uploads`)
- Choose a location hint (Auto or a specific region)
Generate API Token
- R2 → Manage R2 API Tokens → Create API Token
- Permissions: Object Read & Write
- Specify bucket (or all buckets)
- Copy the Access Key ID, Secret Access Key, and Account ID
Install SDK
```bash
npm install @aws-sdk/client-s3 @aws-sdk/s3-request-presigner
```
Initialize Client
```ts
// lib/r2.ts
import { S3Client } from '@aws-sdk/client-s3';

export const r2 = new S3Client({
  region: 'auto',
  endpoint: `https://${process.env.CLOUDFLARE_ACCOUNT_ID}.r2.cloudflarestorage.com`,
  credentials: {
    accessKeyId: process.env.R2_ACCESS_KEY_ID!,
    secretAccessKey: process.env.R2_SECRET_ACCESS_KEY!,
  },
});

export const BUCKET_NAME = process.env.R2_BUCKET_NAME!;
```
Environment Variables
```bash
# .env.local
CLOUDFLARE_ACCOUNT_ID=your_account_id
R2_ACCESS_KEY_ID=your_access_key
R2_SECRET_ACCESS_KEY=your_secret_key
R2_BUCKET_NAME=my-app-uploads
R2_PUBLIC_URL=https://files.yourdomain.com  # If using a custom domain
```
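The client above uses `!` assertions, which silently turn a missing variable into `undefined` at request time. A small fail-fast guard catches misconfiguration at startup instead (`requireEnv` is this article's own hypothetical helper, not part of the SDK):

```typescript
// Validate required environment variables once at startup and
// throw immediately if any are missing, rather than failing later
// with a confusing credentials error.
export function requireEnv(
  env: Record<string, string | undefined>,
  keys: string[]
): Record<string, string> {
  const missing = keys.filter((k) => !env[k]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(', ')}`);
  }
  return Object.fromEntries(keys.map((k) => [k, env[k] as string]));
}

// Usage at startup:
// const config = requireEnv(process.env, [
//   'CLOUDFLARE_ACCOUNT_ID',
//   'R2_ACCESS_KEY_ID',
//   'R2_SECRET_ACCESS_KEY',
//   'R2_BUCKET_NAME',
// ]);
```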
2. Upload Files
Server-Side Upload
```ts
// lib/upload.ts
import { PutObjectCommand } from '@aws-sdk/client-s3';
import { r2, BUCKET_NAME } from './r2';
import { randomUUID } from 'crypto';

export async function uploadFile(
  file: Buffer,
  contentType: string,
  folder: string = 'uploads'
) {
  const key = `${folder}/${randomUUID()}-${Date.now()}`;

  await r2.send(new PutObjectCommand({
    Bucket: BUCKET_NAME,
    Key: key,
    Body: file,
    ContentType: contentType,
  }));

  return {
    key,
    url: `${process.env.R2_PUBLIC_URL}/${key}`,
  };
}
```
Upload API Route (Next.js)
```ts
// app/api/upload/route.ts
import { NextResponse } from 'next/server';
import { uploadFile } from '@/lib/upload';

export async function POST(req: Request) {
  const formData = await req.formData();
  const file = formData.get('file') as File;

  if (!file) {
    return NextResponse.json({ error: 'No file provided' }, { status: 400 });
  }

  // Validate file type
  const allowedTypes = ['image/jpeg', 'image/png', 'image/webp', 'application/pdf'];
  if (!allowedTypes.includes(file.type)) {
    return NextResponse.json({ error: 'File type not allowed' }, { status: 400 });
  }

  // Validate file size (10MB max)
  if (file.size > 10 * 1024 * 1024) {
    return NextResponse.json({ error: 'File too large' }, { status: 400 });
  }

  const buffer = Buffer.from(await file.arrayBuffer());
  const result = await uploadFile(buffer, file.type, 'images');

  return NextResponse.json(result);
}
```
Upload from Client
```tsx
// components/FileUpload.tsx
'use client';
import { useState } from 'react';

export function FileUpload() {
  const [uploading, setUploading] = useState(false);
  const [url, setUrl] = useState<string | null>(null);

  const handleUpload = async (e: React.ChangeEvent<HTMLInputElement>) => {
    const file = e.target.files?.[0];
    if (!file) return;

    setUploading(true);
    const formData = new FormData();
    formData.append('file', file);

    const res = await fetch('/api/upload', {
      method: 'POST',
      body: formData,
    });
    const data = await res.json();
    // Only set the URL on success, so error responses don't render a broken link
    if (res.ok) setUrl(data.url);
    setUploading(false);
  };

  return (
    <div>
      <input type="file" onChange={handleUpload} disabled={uploading} />
      {uploading && <p>Uploading...</p>}
      {url && <p>Uploaded: <a href={url}>{url}</a></p>}
    </div>
  );
}
```
3. Presigned URLs (Direct Browser Upload)
Skip your server — let the browser upload directly to R2:
```ts
// app/api/upload-url/route.ts
import { NextResponse } from 'next/server';
import { PutObjectCommand } from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';
import { r2, BUCKET_NAME } from '@/lib/r2';
import { randomUUID } from 'crypto';

export async function POST(req: Request) {
  const { contentType, filename } = await req.json();
  const key = `uploads/${randomUUID()}-${filename}`;

  const signedUrl = await getSignedUrl(
    r2,
    new PutObjectCommand({
      Bucket: BUCKET_NAME,
      Key: key,
      ContentType: contentType,
    }),
    { expiresIn: 3600 } // 1 hour
  );

  return NextResponse.json({ uploadUrl: signedUrl, key });
}
```
Client-Side Direct Upload
```ts
async function uploadDirect(file: File) {
  // 1. Get a presigned URL from your server
  const res = await fetch('/api/upload-url', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      contentType: file.type,
      filename: file.name,
    }),
  });
  const { uploadUrl, key } = await res.json();

  // 2. Upload directly to R2 (no server processing)
  await fetch(uploadUrl, {
    method: 'PUT',
    body: file,
    headers: { 'Content-Type': file.type },
  });

  return key;
}
```
4. Download Files
Get Object
```ts
import { GetObjectCommand } from '@aws-sdk/client-s3';
import { r2, BUCKET_NAME } from './r2';

export async function downloadFile(key: string) {
  const response = await r2.send(new GetObjectCommand({
    Bucket: BUCKET_NAME,
    Key: key,
  }));

  return {
    body: response.Body,
    contentType: response.ContentType,
    contentLength: response.ContentLength,
  };
}
```
Presigned Download URL
```ts
import { GetObjectCommand } from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';
import { r2, BUCKET_NAME } from './r2';

export async function getDownloadUrl(key: string) {
  return getSignedUrl(
    r2,
    new GetObjectCommand({ Bucket: BUCKET_NAME, Key: key }),
    { expiresIn: 3600 }
  );
}
```
5. List and Delete Files
```ts
import { ListObjectsV2Command, DeleteObjectCommand } from '@aws-sdk/client-s3';
import { r2, BUCKET_NAME } from './r2';

// List files in a folder
export async function listFiles(prefix: string = '') {
  const response = await r2.send(new ListObjectsV2Command({
    Bucket: BUCKET_NAME,
    Prefix: prefix,
    MaxKeys: 100,
  }));

  return response.Contents?.map(obj => ({
    key: obj.Key!,
    size: obj.Size!,
    lastModified: obj.LastModified!,
  })) ?? [];
}

// Delete a file
export async function deleteFile(key: string) {
  await r2.send(new DeleteObjectCommand({
    Bucket: BUCKET_NAME,
    Key: key,
  }));
}
```
6. Public Bucket with Custom Domain
Enable Public Access
- R2 → Your bucket → Settings → Public Access
- Enable "Allow Access" → adds an `r2.dev` subdomain
- Or connect a custom domain (recommended)
Custom Domain Setup
- Add a CNAME record: `files.yourdomain.com` → your R2 bucket's public URL
- Cloudflare handles SSL automatically

Files are now accessible at `https://files.yourdomain.com/uploads/image.jpg`.
Pricing Comparison
| Feature | Cloudflare R2 | AWS S3 |
|---|---|---|
| Storage | $0.015/GB/month | $0.023/GB/month |
| Reads (GET) | $0.36/million | $0.40/million |
| Writes (PUT) | $4.50/million | $5.00/million |
| Egress | $0 (free) | $0.09/GB |
| Free tier | 10GB + 10M reads | 5GB + 20K reads |
Example: 100GB storage + 1TB egress/month:
- R2: $1.50 (storage only)
- S3: $2.30 + $92.16 (egress) = $94.46
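The arithmetic above can be reproduced from the per-GB rates in the table (a sketch for comparison only, not a billing calculator; it ignores request charges and free tiers):

```typescript
// Monthly cost = storage + egress, using the per-GB rates from the table.
function monthlyCost(
  storageGB: number,
  egressGB: number,
  rates: { storagePerGB: number; egressPerGB: number }
): number {
  return storageGB * rates.storagePerGB + egressGB * rates.egressPerGB;
}

const r2Rates = { storagePerGB: 0.015, egressPerGB: 0 };
const s3Rates = { storagePerGB: 0.023, egressPerGB: 0.09 };

// 100GB stored + 1TB (1024GB) egress per month:
console.log(monthlyCost(100, 1024, r2Rates).toFixed(2)); // "1.50"
console.log(monthlyCost(100, 1024, s3Rates).toFixed(2)); // "94.46"
```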
Common Mistakes
| Mistake | Impact | Fix |
|---|---|---|
| Exposing R2 credentials to client | Account compromise | Use presigned URLs for direct uploads |
| No file type validation | Malicious file uploads | Validate MIME type server-side |
| No file size limits | Storage abuse | Enforce max size (presigned URL + server) |
| Using r2.dev domain in production | Rate limited, no caching | Use custom domain |
| Not setting Content-Type on upload | Files download instead of display | Always set ContentType |
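On the MIME-validation point: `file.type` is supplied by the client and easy to spoof. A stricter server-side check inspects the file's leading bytes (magic numbers); a minimal sketch covering the four types allowed in the upload route above (`sniffMime` is this article's own helper, not a library function):

```typescript
// Detect a file's real type from its magic bytes instead of
// trusting the client-supplied MIME string.
function sniffMime(buf: Buffer): string | null {
  // JPEG: FF D8 FF
  if (buf.length >= 3 && buf[0] === 0xff && buf[1] === 0xd8 && buf[2] === 0xff) {
    return 'image/jpeg';
  }
  // PNG: 89 50 4E 47 0D 0A 1A 0A
  const pngSig = Buffer.from([0x89, 0x50, 0x4e, 0x47, 0x0d, 0x0a, 0x1a, 0x0a]);
  if (buf.length >= 8 && buf.subarray(0, 8).equals(pngSig)) {
    return 'image/png';
  }
  // WebP: "RIFF" .... "WEBP"
  if (buf.length >= 12 && buf.toString('ascii', 0, 4) === 'RIFF' &&
      buf.toString('ascii', 8, 12) === 'WEBP') {
    return 'image/webp';
  }
  // PDF: "%PDF"
  if (buf.length >= 4 && buf.toString('ascii', 0, 4) === '%PDF') {
    return 'application/pdf';
  }
  return null; // unknown or disallowed type
}
```

In the upload route, reject the request when `sniffMime(buffer)` is null or disagrees with `file.type`.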
Choosing object storage? Compare Cloudflare R2 vs AWS S3 vs Backblaze B2 on APIScout — pricing, egress fees, and developer experience.