
Migrating from Celery to FastWorker

A practical, step-by-step migration guide from Celery + Redis to FastWorker — covering task definitions, client code, priorities, retries, and deployment.

Dipankar Sarkar
celery · migration · fastworker · python

Celery is a great task queue. It’s also a lot of infrastructure for teams that don’t need DAG workflows or 100K tasks per minute. This guide walks through migrating a typical Celery + Redis deployment to FastWorker without breaking anything in production.

Before you start

Be honest about whether you should migrate. Stay on Celery if you rely on any of these:

  • Chains, groups, chords, canvas workflows — FastWorker doesn’t have an equivalent.
  • Celery beat for complex scheduling — use cron or Kubernetes CronJobs with FastWorker.
  • Durable task persistence — FastWorker keeps tasks in memory. A crash loses queued work.
  • Exactly-once delivery — FastWorker provides at-most-once semantics.
  • Multi-language workers — FastWorker is Python-only.

If none of those apply, keep reading.

The before picture

A minimal Celery setup looks like this:

# celery_app.py
from celery import Celery

app = Celery(
    "myapp",
    broker="redis://redis:6379/0",
    backend="redis://redis:6379/1",
)

@app.task
def send_email(user_id: int, template: str) -> bool:
    ...
    return True

@app.task
def resize_image(path: str, width: int) -> str:
    ...
    return out_path

# api.py (FastAPI)
from fastapi import FastAPI
from celery_app import send_email, resize_image

app = FastAPI()

@app.post("/signup")
async def signup(user_id: int):
    send_email.delay(user_id, "welcome")
    return {"ok": True}

Deployment: your API, Celery workers (celery -A celery_app worker), a Redis container, optionally Flower. Four or five services.
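For reference, that stack sketched as a compose file (service names and images here are illustrative, not from any particular project):

```yaml
# Illustrative docker-compose for the Celery setup above.
services:
  api:
    build: .
    command: uvicorn api:app --host 0.0.0.0
  worker:
    build: .
    command: celery -A celery_app worker
  redis:
    image: redis:7
  flower:
    build: .
    command: celery -A celery_app flower --broker=redis://redis:6379/0
```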

The migration

Step 1 — Install FastWorker alongside Celery

pip install fastworker
# keep celery for now

Neither depends on the other. You can run both.

Step 2 — Rewrite your task module

Celery’s @app.task becomes FastWorker’s @task. Your function body doesn’t change.

# tasks.py
from fastworker import task

@task
def send_email(user_id: int, template: str) -> bool:
    ...
    return True

@task
def resize_image(path: str, width: int) -> str:
    ...
    return out_path

This file has no broker URL, no config object, no app factory. A task is a function.
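To see why "a task is a function" works, here is a rough sketch of what a decorator-based registry like FastWorker's @task could look like under the hood. These are hypothetical internals, not FastWorker's actual code: the decorator records the function by name and returns it unchanged, so importing the module is registration and the task stays directly callable in tests.

```python
from typing import Callable

# Module-level registry: importing a task module fills this in.
REGISTRY: dict[str, Callable] = {}

def task(fn: Callable) -> Callable:
    REGISTRY[fn.__name__] = fn  # register under the function's name
    return fn                   # the task remains a plain function

@task
def send_email(user_id: int, template: str) -> bool:
    return True
```

Because the decorator returns the original function, unit tests can call send_email(1, "welcome") directly with no worker running.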

Step 3 — Start the control plane next to your app

fastworker control-plane --task-modules tasks

One process. Includes the web dashboard at http://127.0.0.1:8080.

Step 4 — Update the client code

Celery’s send_email.delay(...) becomes FastWorker’s await client.delay("send_email", ...). Celery’s client is synchronous by default; FastWorker’s is async, which is what you want inside an async FastAPI handler.

# api.py
from fastapi import FastAPI
from fastworker import Client

app = FastAPI()
client = Client()

@app.on_event("startup")
async def _start():
    await client.start()

@app.on_event("shutdown")
async def _stop():
    client.stop()

@app.post("/signup")
async def signup(user_id: int):
    await client.delay("send_email", user_id, "welcome")
    return {"ok": True}
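FastAPI has deprecated @app.on_event in favor of a lifespan handler, and the same start/stop logic fits that shape too. A minimal sketch of the pattern — the Client class here is a stand-in stub so the example is self-contained; in real code you would use fastworker.Client (and whatever its actual stop signature is):

```python
import asyncio
from contextlib import asynccontextmanager

class Client:
    # Stub standing in for fastworker.Client, just to show the lifecycle.
    def __init__(self):
        self.running = False
    async def start(self):
        self.running = True
    async def stop(self):
        self.running = False

client = Client()

@asynccontextmanager
async def lifespan(app):
    await client.start()     # runs once at application startup
    try:
        yield                # application serves requests here
    finally:
        await client.stop()  # runs once at shutdown

# app = FastAPI(lifespan=lifespan)
```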

Step 5 — Translate priorities

Celery uses broker-level routing and numeric priorities. FastWorker uses four enum levels:

# Celery
send_email.apply_async(args=[user_id], priority=10)

# FastWorker
from fastworker.tasks.models import TaskPriority
await client.delay("send_email", user_id, priority=TaskPriority.CRITICAL)

Map your priorities:

Celery priority | FastWorker level
----------------|-----------------
9–10            | CRITICAL
6–8             | HIGH
3–5             | NORMAL
0–2             | LOW
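If you have many call sites, a small helper keeps the mapping in one place. TaskPriority below is a local stand-in enum so the sketch is self-contained; the real one is fastworker.tasks.models.TaskPriority:

```python
from enum import Enum

class TaskPriority(Enum):
    # Stand-in for fastworker.tasks.models.TaskPriority.
    LOW = "low"
    NORMAL = "normal"
    HIGH = "high"
    CRITICAL = "critical"

def map_priority(celery_priority: int) -> TaskPriority:
    """Translate a Celery numeric priority (0-10) to a FastWorker level."""
    if celery_priority >= 9:
        return TaskPriority.CRITICAL
    if celery_priority >= 6:
        return TaskPriority.HIGH
    if celery_priority >= 3:
        return TaskPriority.NORMAL
    return TaskPriority.LOW
```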

Step 6 — Replace Celery’s retry decorator

Celery has @app.task(bind=True, max_retries=3) and self.retry. FastWorker keeps retry logic in the task body:

import time

import requests

from fastworker import task

@task
def call_flaky_api(url: str) -> dict:
    last_err = None
    for attempt in range(4):
        try:
            return requests.get(url, timeout=5).json()
        except Exception as e:
            last_err = e
            if attempt < 3:
                time.sleep(2 ** attempt)  # back off: 1s, 2s, 4s
    raise last_err

It’s less magical, but it’s explicit, and it doesn’t depend on the framework knowing about retries.
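If that loop appears in many tasks, you can factor it into a decorator of your own. This is plain Python, independent of FastWorker — wrap any task body with it:

```python
import functools
import time

def retry(attempts: int = 4, base_delay: float = 1.0):
    """Retry a function with exponential backoff, re-raising the last error."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            last_err = None
            for attempt in range(attempts):
                try:
                    return fn(*args, **kwargs)
                except Exception as e:
                    last_err = e
                    if attempt < attempts - 1:
                        time.sleep(base_delay * 2 ** attempt)
            raise last_err
        return wrapper
    return decorator
```

Stack it under @task so the retry logic stays visible in the task module rather than hidden in framework configuration.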

Step 7 — Run both in parallel

Deploy FastWorker alongside Celery. Migrate one endpoint at a time. The control plane processes tasks even without subworkers, so a minimal deployment is one extra process.
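One way to route traffic during the overlap is a small dispatch shim: each task name goes to either the old Celery path or the new FastWorker path based on a migration set. The two submit callables are injected here, so the sketch is framework-free; in real code they would wrap task.delay(...) and client.delay(...):

```python
from typing import Callable

# Task names already moved to FastWorker; grow this set one task at a time.
MIGRATED = {"send_email"}

def make_enqueue(celery_submit: Callable, fastworker_submit: Callable):
    """Build an enqueue function that routes by migration status.

    celery_submit is sync (like Celery's .delay); fastworker_submit is async.
    """
    async def enqueue(task_name: str, *args, **kwargs):
        if task_name in MIGRATED:
            return await fastworker_submit(task_name, *args, **kwargs)
        return celery_submit(task_name, *args, **kwargs)
    return enqueue
```

Flipping a task over is then a one-line change to MIGRATED, which is easy to revert if something misbehaves.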

Step 8 — Retire Celery

When every x.delay(...) call has become await client.delay("x", ...) and the system has run cleanly through a full release cycle, delete:

  • celery_app.py
  • The Celery workers from your compose/Kubernetes manifests
  • The Redis broker (if it was only for Celery)
  • Flower
  • celery from requirements.txt

What changes in your deployment

Component                  | Before                   | After
---------------------------|--------------------------|------------------------------
Web app                    | FastAPI                  | FastAPI (unchanged)
Task workers               | Celery worker processes  | FastWorker subworkers
Broker                     | Redis                    | None
Result backend             | Redis / DB               | Control plane (in-memory LRU)
Dashboard                  | Flower (separate)        | Built-in at port 8080
Services in docker-compose | 4–5                      | 2–3

What changes in your mental model

  • Tasks are registered by importing a module, not by a Celery app instance.
  • The “broker” is a Python process you run.
  • The “result backend” is an in-memory LRU cache with a 1-hour default TTL. If you need longer retention, store results in your own database inside the task.
  • Priorities are enums, not numbers.
  • Retries are a loop you write, not a decorator.
  • There are no chains, groups, or chords. Compose tasks by submitting new ones from inside tasks.
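To make the result-backend point concrete, here is a minimal sketch of a TTL-bounded LRU cache like the one described above. These internals are hypothetical — only the in-memory LRU behavior and the 1-hour default TTL come from the description:

```python
import time
from collections import OrderedDict

class ResultCache:
    """In-memory LRU cache where entries also expire after a TTL."""

    def __init__(self, max_size: int = 1000, ttl: float = 3600.0):
        self._data: OrderedDict = OrderedDict()  # task_id -> (timestamp, result)
        self._max_size = max_size
        self._ttl = ttl

    def put(self, task_id: str, result) -> None:
        self._data[task_id] = (time.monotonic(), result)
        self._data.move_to_end(task_id)
        while len(self._data) > self._max_size:
            self._data.popitem(last=False)  # evict least recently used

    def get(self, task_id: str):
        item = self._data.get(task_id)
        if item is None:
            return None
        ts, result = item
        if time.monotonic() - ts > self._ttl:
            del self._data[task_id]  # expired
            return None
        self._data.move_to_end(task_id)  # mark as recently used
        return result
```

The practical consequence is the one stated above: anything you need past the TTL must be written to durable storage by the task itself.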

Common gotchas

  • Task arguments must be serializable. Same as Celery. Avoid passing huge blobs; pass a path or ID and read from storage inside the task.
  • The control plane’s result cache expires. Default is 1 hour. For long-lived results, persist to your database inside the task.
  • No beat schedule. If you were using Celery beat, move scheduled jobs to cron, Kubernetes CronJobs, or an in-process scheduler like APScheduler.
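For the in-process option, a periodic submitter can be a few lines of asyncio. The sketch below uses only the standard library; enqueue is any async callable, e.g. a hypothetical wrapper around client.delay:

```python
import asyncio
from typing import Optional

async def every(seconds: float, enqueue, *args,
                iterations: Optional[int] = None):
    """Call await enqueue(*args) every `seconds` seconds.

    Runs forever when iterations is None; a bounded count is handy in tests.
    """
    count = 0
    while iterations is None or count < iterations:
        await enqueue(*args)
        count += 1
        if iterations is None or count < iterations:
            await asyncio.sleep(seconds)

# In the app you would run it as a background task, e.g.:
# asyncio.create_task(every(3600, client.delay, "send_digest"))
```

Unlike Celery beat, this scheduler dies with the process, so for jobs that must survive restarts, cron or a Kubernetes CronJob remains the safer choice.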

Frequently asked questions

Should I migrate everything at once?

No. Run FastWorker and Celery side by side. Migrate one task module at a time, starting with simple fire-and-forget tasks. Keep Celery for anything using chains, chords, groups, or beat schedules until you've designed equivalents.

What about scheduled tasks (Celery beat)?

FastWorker doesn't currently have a built-in scheduler. Use cron, Kubernetes CronJobs, or APScheduler to invoke client.delay() on a schedule. For most services this is simpler than Celery beat anyway.

Do I still need Redis after migrating?

Not for task queueing. You may keep Redis for other uses (session cache, rate limiting), but FastWorker itself needs no external services.