
Cloudflare R2

Object Storage · S3-compatible · Zero Egress · CDN

S3-compatible object storage with zero egress fees. Ideal for storing large files (images, videos, backups) without the costly data transfer bills typical of AWS S3. Accessible via the S3 API.

License

Proprietary

Language

Any

Trust Score
74 (Good)

Why Cloudflare R2?

Storing large files where AWS S3 egress costs are prohibitive

Serving user-uploaded assets through a CDN-like global network

Any S3-compatible workflow you want to migrate away from AWS

Signal Breakdown

What drives the Trust Score

Weekly npm downloads
N/A (uses @aws-sdk/client-s3)
GitHub commits (90d)
400+
GitHub stars
9.2k (docs)
Stack Overflow questions
4k
Community health
Very Active
Weighted Trust Score: 74 / 100


Tradeoffs & Caveats

Know before you commit

If you need advanced S3 features such as Cross-Region Replication, R2 is simpler and does not offer them all

If you are already on AWS and tightly coupled to other AWS services, S3's integration there is more seamless

If you need real-time event notifications, R2's event system is more limited than S3's

Pricing

Free tier & paid plans

Free tier

10GB storage/mo · 1M Class A ops/mo

Paid

$0.015/GB-mo storage · $0.36/M Class A ops

Zero egress fees — major advantage over S3
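
With zero egress, monthly cost reduces to storage plus operations, which makes back-of-envelope estimates easy. A minimal TypeScript sketch using only the rates and free-tier thresholds listed above (Class B operations and any future rate changes are ignored; the function name is illustrative):

```typescript
// Rough monthly R2 cost estimate from the published rates above.
// Only storage and Class A operations are counted; egress is always $0.
function estimateMonthlyCostUSD(storageGB: number, classAOpsMillions: number): number {
  const FREE_STORAGE_GB = 10;   // free tier: 10 GB-month of storage
  const FREE_CLASS_A_M = 1;     // free tier: 1M Class A ops
  const storageCost = Math.max(0, storageGB - FREE_STORAGE_GB) * 0.015;
  const opsCost = Math.max(0, classAOpsMillions - FREE_CLASS_A_M) * 0.36;
  return storageCost + opsCost; // no egress term, by design
}

// 110 GB stored + 3M Class A ops: (100 * $0.015) + (2 * $0.36) = $2.22/mo
```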

Alternative Tools

Other options worth considering

Cloudinary · 86 (Strong)

Cloud-based image and video management platform with on-the-fly transformations, CDN delivery, and AI-powered media features. Zero-config optimization for web performance.

Often Used Together

Complementary tools that pair well with Cloudflare R2

Next.js · Frontend & UI · 98 (Excellent)

Cloudflare Workers · Serverless Edge · 54 (Limited)

Supabase · Database & Cache · 95 (Excellent)

Uploadthing · File & Media · 30 (Limited)

Cloudinary · File & Media · 86 (Strong)

Learning Resources

Docs, videos, tutorials, and courses

Get Started

Repository and installation options

View on GitHub

github.com/cloudflare/cloudflare-docs

npm (S3 client): npm install @aws-sdk/client-s3 @aws-sdk/s3-request-presigner

Quick Start

Copy and adapt to get going fast

import { S3Client, PutObjectCommand, GetObjectCommand } from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';

const R2 = new S3Client({
  region: 'auto',
  endpoint: `https://${process.env.CF_ACCOUNT_ID}.r2.cloudflarestorage.com`,
  credentials: {
    accessKeyId: process.env.R2_ACCESS_KEY_ID!,
    secretAccessKey: process.env.R2_SECRET_ACCESS_KEY!,
  },
});

// Generate presigned upload URL
const url = await getSignedUrl(
  R2,
  new PutObjectCommand({ Bucket: 'assets', Key: 'user-upload.jpg' }),
  { expiresIn: 3600 }
);
console.log('Presigned URL:', url);

Code Examples

Common usage patterns

Presigned upload URL for browser

Generate a presigned URL for direct browser-to-R2 uploads

// app/api/upload-url/route.ts
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';

const R2 = new S3Client({
  region: 'auto',
  endpoint: `https://${process.env.CF_ACCOUNT_ID}.r2.cloudflarestorage.com`,
  credentials: {
    accessKeyId: process.env.R2_ACCESS_KEY_ID!,
    secretAccessKey: process.env.R2_SECRET_ACCESS_KEY!,
  },
});

export async function POST(req: Request) {
  const { filename, contentType } = await req.json();
  const key = `uploads/${Date.now()}-${filename}`;
  const url = await getSignedUrl(
    R2,
    new PutObjectCommand({ Bucket: 'my-bucket', Key: key, ContentType: contentType }),
    { expiresIn: 300 }
  );
  return Response.json({ url, key });
}

List and delete objects

List all files in a bucket and delete old ones

import { S3Client, ListObjectsV2Command, DeleteObjectsCommand } from '@aws-sdk/client-s3';

const R2 = new S3Client({ /* ...config */ });

const { Contents } = await R2.send(new ListObjectsV2Command({ Bucket: 'my-bucket', Prefix: 'temp/' }));

const cutoff = Date.now() - 24 * 60 * 60 * 1000; // 24 hours ago
const toDelete = (Contents ?? []).filter((obj) => obj.LastModified!.getTime() < cutoff);

if (toDelete.length > 0) {
  await R2.send(new DeleteObjectsCommand({
    Bucket: 'my-bucket',
    Delete: { Objects: toDelete.map(({ Key }) => ({ Key: Key! })) },
  }));
  console.log(`Deleted ${toDelete.length} old files`);
}
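
One caveat the example above glosses over: ListObjectsV2 returns at most 1,000 keys per response, so larger buckets require paginating with ContinuationToken. A minimal sketch with the page fetcher injected; in real use, `listPage` would wrap `R2.send(new ListObjectsV2Command({ Bucket, Prefix, ContinuationToken: token }))`:

```typescript
// Shape of one ListObjectsV2 response page (only the fields used here).
type ListPage = { Contents?: { Key?: string }[]; NextContinuationToken?: string };

// Collects every key by feeding NextContinuationToken back into the next call.
async function listAllKeys(listPage: (token?: string) => Promise<ListPage>): Promise<string[]> {
  const keys: string[] = [];
  let token: string | undefined;
  do {
    const page = await listPage(token);
    for (const obj of page.Contents ?? []) {
      if (obj.Key) keys.push(obj.Key);
    }
    token = page.NextContinuationToken; // undefined once the last page is reached
  } while (token !== undefined);
  return keys;
}
```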

Community Notes

Real experiences from developers who've used this tool