Cloudflare Containers Public Beta 2025: Run Docker at the Edge in 320+ Cities with Auto-scaling
Cloudflare Containers runs containers globally through Workers + Durable Objects: low latency, per-user sandboxes, media processing, AI inference, and pay-per-running-time billing.
Cloudflare Containers vs Competition
| Platform | Global Edge | Workers Integration | Cold Start | Pricing |
|---|---|---|---|---|
| Cloudflare | 320 cities | Native | 200ms | Running only |
| Fly.io | 35 regions | ❌ | 500ms | Always-on |
| Render | 8 regions | ❌ | 1s | Idle charges |
| Railway | Multi-cloud | ❌ | 800ms | Fixed scale |
Core Architecture
Request → Workers → Durable Objects → Containers
↓
Auto-scale + Health checks
Key Features
🌍 Global deployment (320+ cities)
⚙️ Workers orchestration
💤 Sleep after idle (pay-per-use)
🔒 Per-container sandboxing
🌐 HTTP/WebSocket support
📈 Auto-scaling (CPU/memory)
Quick Deploy Example
# wrangler.toml
[[containers]]
name = "ffmpeg-converter"
image = "myregistry/ffmpeg:latest"
port = 8080
sleep_after = "1m"
min_instances = 1

// worker.js
export default {
  async fetch(request, env) {
    const container = env.FFMPEG.get(env.FFMPEG.idFromName("user123"));
    return container.fetch(request);
  }
};
npx wrangler deploy
Real-world Use Cases
1. Code Sandbox (Per User)
User1 → Container1 (Node.js)
User2 → Container2 (Python)
User3 → Container3 (Go)
2. FFmpeg Media Processing
Upload MP4 → Edge container → GIF output
Latency: 150ms (vs 2s centralized)
3. AI Model Inference
User request → Boot Llama.cpp → Inference → Sleep
Cost: $0.0001 per request
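The per-user sandbox pattern (use case 1) relies on Durable Object IDs derived with `idFromName()` being deterministic: the same user always lands on the same container. A minimal sketch, assuming a hypothetical `SANDBOX` container binding:

```javascript
// Per-user sandbox routing sketch. The SANDBOX binding name and the
// naming scheme are assumptions for illustration, not platform defaults.
function sandboxName(userId, runtime) {
  // One container per user, tagged with its runtime (node, python, go, ...)
  return `sandbox-${runtime}-${userId}`;
}

// Inside a Worker fetch handler it would be used roughly like:
//   const id = env.SANDBOX.idFromName(sandboxName("user123", "node"));
//   return env.SANDBOX.get(id).fetch(request);

console.log(sandboxName("user123", "node")); // sandbox-node-user123
```

Because the name is stable, repeated requests from the same user reuse the same (possibly sleeping) container instead of booting a fresh one.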
Pricing Model (Pay-per-running)
Running: $0.000015/GB-second
Sleep: $0 (free)
Boot time: 200ms (billed)
Minimum instances: configurable
Example: 1,000 requests/month × 30 s each at 1 GB ≈ $0.45/month
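The example above can be reproduced with a back-of-envelope calculation. The $0.000015/GB-second rate is the one quoted above; 1 GB of memory per request is an assumption for the example:

```javascript
// Back-of-envelope model of pay-per-running-time billing.
const RATE_PER_GB_SECOND = 0.000015;

function runningCost(requests, secondsPerRequest, memoryGb) {
  // Sleeping containers cost $0, so only running seconds are billed.
  return requests * secondsPerRequest * memoryGb * RATE_PER_GB_SECOND;
}

// 1,000 requests × 30 s × 1 GB = 30,000 GB-seconds
console.log(runningCost(1000, 30, 1).toFixed(2)); // "0.45"
```

Note that boot time (~200 ms per cold start) is also billed, so very short requests pay a proportionally larger boot overhead.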
Production Config
# wrangler.toml
[[containers]]
name = "app"
image = "docker.io/myapp:latest"
port = 3000
min_instances = 2
max_instances = 50
sleep_after = "5m"
cpu_threshold = 0.75
memory_threshold = 0.8
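The scaling semantics implied by this config can be sketched as follows. The exact scale-in rule is an assumption (the docs only state that scaling triggers on CPU/memory); only the threshold-exceeded scale-out and the min/max clamping come from the config above:

```javascript
// Sketch of the autoscaling decision implied by the config above.
const cfg = { minInstances: 2, maxInstances: 50, cpuThreshold: 0.75, memoryThreshold: 0.8 };

function desiredInstances(current, cpuUtil, memUtil) {
  let next = current;
  if (cpuUtil > cfg.cpuThreshold || memUtil > cfg.memoryThreshold) {
    next = current + 1; // either resource over threshold: scale out
  } else if (cpuUtil < cfg.cpuThreshold / 2 && memUtil < cfg.memoryThreshold / 2) {
    next = current - 1; // well under both thresholds: scale in (assumed rule)
  }
  // Always stay within the configured instance bounds
  return Math.min(cfg.maxInstances, Math.max(cfg.minInstances, next));
}

console.log(desiredInstances(2, 0.9, 0.5)); // 3
```

With `min_instances = 2`, two containers stay warm even when idle, trading some cost for zero cold starts on the first requests.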
Advanced Patterns
Service Mesh via Workers
// Load balance 10 containers
// Load-balance across 10 container instances
const containers = Array.from({ length: 10 }, (_, i) =>
  env.APP.get(env.APP.idFromName(`inst-${i}`))
);
const container = containers[Math.floor(Math.random() * containers.length)];
return container.fetch(request);
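A random pick spreads load but breaks session affinity. A deterministic hash keeps each client key on the same instance; this is a sketch of the idea, not a platform built-in:

```javascript
// Deterministic instance selection: hash a client key (user ID, IP, ...)
// to one of N container instances, so repeat requests hit the same one.
function pickInstance(key, instanceCount) {
  let h = 0;
  for (const ch of key) {
    h = (h * 31 + ch.charCodeAt(0)) >>> 0; // simple 32-bit rolling hash
  }
  return h % instanceCount;
}

// Usage inside the Worker fetch handler:
//   const container = containers[pickInstance(clientKey, containers.length)];
//   return container.fetch(request);
```

The trade-off: affinity improves cache hit rates inside each container, but a hot key can overload a single instance.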
Health Checking
export class ContainerProxy {
  async fetch(request) {
    try {
      // Durable Object stubs require an absolute URL
      const res = await this.container.fetch("https://container/health");
      if (!res.ok) throw new Error("Unhealthy");
      return this.container.fetch(request);
    } catch {
      // Replace the unhealthy container, then retry once
      await this.restart();
      return this.container.fetch(request);
    }
  }
}
Limitations
❌ No TCP/UDP (HTTP/WebSocket only)
❌ Cloudflare network only
⚠️ Cold start 200ms (acceptable)
✅ Perfect for stateless workloads
Migration from Other Platforms
Docker → Cloudflare: 5 mins setup
K8s → Containers: No YAML hell
Lambda → Containers: Full runtime
Vercel → Edge Containers: Global