# FastWorker vs Celery
Honest, detailed comparison of FastWorker and Celery — setup, features, scalability, failure modes, and when to choose each for Python task queueing.
Celery is the default answer for “Python task queue” and has been for over a decade. It’s mature, flexible, battle-tested, and has a huge ecosystem. It’s also a non-trivial amount of infrastructure. FastWorker takes the other side of that tradeoff: much smaller feature surface, dramatically simpler operations.
This is the honest comparison.
## The one-line summary
Celery is the right answer when you need durability, complex workflows, or extreme scale. FastWorker is the right answer when you want a distributed task queue that ships in 2–3 Python processes and zero external services.
## Feature matrix
| Capability | FastWorker | Celery |
|---|---|---|
| External broker required | None | Redis / RabbitMQ / SQS |
| Services in a minimal deploy | 2–3 Python processes | 4–6+ |
| Setup time | < 5 minutes | 30+ minutes |
| Built-in dashboard | Yes (auto-starts at :8080) | Flower (separate) |
| FastAPI integration | Native async client | Sync / thread wrapper |
| Priority queues | 4 levels, built-in | Configurable via routing |
| Automatic worker discovery | Yes | No (static config) |
| Task chains / groups / chords | No | Yes (Canvas) |
| Scheduled / periodic tasks | No (use cron / K8s CronJob) | Celery beat |
| Task persistence | In-memory (1h TTL) | Durable (broker + result backend) |
| Exactly-once delivery | No (at-most-once) | Configurable |
| Retries with backoff | Manual in task body | Decorator |
| Multi-language workers | No (Python only) | No (Python only) |
| OpenTelemetry integration | Built-in | Via instrumentation package |
| Result storage | In-memory LRU | Broker backend (persistent) |
| Throughput ceiling | ~10K tasks/min | 100K+ tasks/min |
| Recommended worker count | 1–50 | Thousands |
## Deployment: side by side
### Celery + Redis — docker-compose.yml

```yaml
services:
  web:
    build: .
    environment:
      - CELERY_BROKER_URL=redis://redis:6379/0
      - CELERY_RESULT_BACKEND=redis://redis:6379/0
    depends_on: [redis]
  redis:
    image: redis:7-alpine
    volumes: [redis_data:/data]
    command: redis-server --appendonly yes
  worker:
    build: .
    command: celery -A tasks worker --loglevel=info
    depends_on: [redis]
  flower:
    build: .
    command: celery -A tasks flower
    ports: ["5555:5555"]
volumes:
  redis_data:
```
Four services, one volume, two Docker images, one external dependency.
### FastWorker — docker-compose.yml

```yaml
services:
  web:
    build: .
    environment:
      - FASTWORKER_DISCOVERY_ADDRESS=tcp://control-plane:5550
    depends_on: [control-plane]
  control-plane:
    build: .
    command: fastworker control-plane --task-modules tasks
    ports: ["8080:8080"]
  subworker:
    build: .
    command: fastworker subworker --task-modules tasks
    environment:
      - FASTWORKER_CONTROL_PLANE_ADDRESS=tcp://control-plane:5555
    depends_on: [control-plane]
    deploy:
      replicas: 2
```
Three services, zero volumes, one Docker image, zero external dependencies. The dashboard is already running.
## Code: side by side

### Defining a task

```python
# Celery
from celery import Celery

app = Celery("myapp", broker="redis://redis:6379/0")

@app.task
def send_email(user_id: int) -> bool:
    return True
```

```python
# FastWorker
from fastworker import task

@task
def send_email(user_id: int) -> bool:
    return True
```
### Submitting a task

```python
# Celery (from FastAPI — sync call)
send_email.delay(user_id)
```

```python
# FastWorker (from FastAPI — async call)
await client.delay("send_email", user_id)
```
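In an async FastAPI endpoint, Celery's blocking `.delay()` is usually pushed off the event loop so it can't stall other requests. A minimal stdlib-only sketch of that workaround pattern, with a stand-in blocking function since no broker is running here (`blocking_delay` and `enqueue_email` are illustrative names, not part of either library):

```python
import asyncio
import time

def blocking_delay(user_id: int) -> str:
    # Stand-in for celery_task.delay(user_id), which does blocking broker I/O
    time.sleep(0.01)
    return f"queued:{user_id}"

async def enqueue_email(user_id: int) -> str:
    # Off-load the blocking call to a thread so the event loop keeps serving
    return await asyncio.to_thread(blocking_delay, user_id)

print(asyncio.run(enqueue_email(42)))  # queued:42
```

This works, but it spends a thread per in-flight submission; an async-native client avoids the detour entirely.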
### Retries with exponential backoff

```python
# Celery
import requests

@app.task(bind=True, autoretry_for=(Exception,),
          retry_backoff=True, max_retries=3)
def call_api(self, url):
    return requests.get(url).json()
```

```python
# FastWorker
import time

import requests

from fastworker import task

@task
def call_api(url: str):
    for attempt in range(3):
        try:
            return requests.get(url, timeout=5).json()
        except Exception:
            if attempt == 2:
                raise  # out of retries; surface the last error
            time.sleep(2 ** attempt)
```
Celery’s decorator is more ergonomic. FastWorker’s retry is explicit but not framework magic.
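If you have many FastWorker tasks that retry, the manual loop can be factored into a small reusable decorator. A stdlib-only sketch (`retry` and its parameters are illustrative, not part of FastWorker's API):

```python
import functools
import time

def retry(attempts: int = 3, base_delay: float = 1.0):
    """Retry a function with exponential backoff (base, 2*base, 4*base, ...)."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(attempts):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == attempts - 1:
                        raise  # out of retries; surface the last error
                    time.sleep(base_delay * 2 ** attempt)
        return wrapper
    return decorator

calls = []

@retry(attempts=3, base_delay=0.01)
def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(flaky())  # ok
```

Apply `@retry(...)` beneath `@task` so the retry wrapper runs inside the worker.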
## Where Celery wins
- Workflows. Celery Canvas (chains, groups, chords) is a genuine feature you can’t reproduce in FastWorker. If you build complex DAGs of tasks, stay on Celery or use Temporal / Prefect.
- Persistence. Celery with a proper broker survives process crashes with tasks still queued. FastWorker loses queued tasks if the control plane dies.
- Scheduled tasks. Celery beat is a first-class scheduler. FastWorker expects you to use cron or Kubernetes CronJobs.
- Ecosystem. A decade of plugins, integrations, and Stack Overflow answers.
- Scale. Celery is routinely deployed at 100K+ tasks/minute. FastWorker is designed for 1K–10K tasks/minute.
## Where FastWorker wins
- No broker. One less service to deploy, monitor, secure, back up, and patch.
- Built-in dashboard. Flower is fine, but it’s another service. FastWorker’s dashboard starts with the control plane.
- Async-native client. FastAPI’s whole appeal is async-first, and FastWorker keeps it that way. Celery’s async story is bolted on.
- Setup time. `pip install`, start one process, done. There is no YAML.
- Local development. No Docker required. No Redis required.
- Operational surface. One codebase (Python) to patch and update instead of Python + Redis + Erlang (if you use RabbitMQ).
## Which to choose
Choose Celery if:
- You need task chains, groups, or complex workflow orchestration
- You need durable persistence of queued tasks across crashes
- You need >10K tasks/minute sustained throughput
- You rely on Celery beat for scheduling
- Your team already runs Redis/RabbitMQ and knows it well
- You need exactly-once or at-least-once delivery semantics
Choose FastWorker if:
- Your stack is Python only
- You’re in the 1K–10K tasks/min range
- You want a dashboard that ships with the queue
- You want to cut operational surface area
- You don’t need chains, beat, or durable queues
- You’re using FastAPI and want an async-native client
Both are good tools. Pick the one that matches your constraints, not the one with more features on the box.
## Frequently asked questions
### Is FastWorker a drop-in replacement for Celery?
No. It covers the common case (delay a function, get a result), but doesn't implement chains, groups, chords, or Celery beat. For 90% of simple Celery usage you can migrate. For workflow-heavy Celery, stay on Celery.
### Is Celery going away?
No, and you shouldn't use FastWorker if you expect it to. Celery is a mature, broadly used project and will be around for years. FastWorker is a smaller-surface alternative for teams that want operational simplicity.
### Can I run both at the same time?
Yes. They don't conflict — they're separate Python libraries. The common migration path is to run both side-by-side and move tasks one at a time.