Backend Frameworks

Celery

Python · Open Source · Self-hosted

The most widely used distributed task queue for Python. Celery handles background jobs, scheduled tasks, and async processing using Redis or RabbitMQ as a broker. A staple of Django and Flask applications.

License

BSD

Language

Python

Trust Score: 87 (Strong)

Why Celery?

You need to run background tasks in Python (sending emails, processing files)

You need scheduled / cron-style tasks in a Python app

You're already using Django or Flask and need async job processing

Signal Breakdown

What drives the Trust Score

PyPI downloads: 7.2M / mo
Commits (90d): 28
GitHub stars: 24k ★
Stack Overflow: 31k questions
Community: Medium
Weighted Trust Score: 87 / 100

Download Trend (last 12 months)

Tradeoffs & Caveats

Know before you commit

You're not using Python — BullMQ (Node.js) or Sidekiq (Ruby) are better fits

You want simple task queuing — RQ (Redis Queue) is much simpler

Commit frequency has slowed — evaluate Dramatiq or ARQ as alternatives

Pricing

Free tier & paid plans

Free tier

100% free, open-source (BSD)

Paid

None; Celery itself is entirely free and open source

The message broker (Redis or RabbitMQ) is a separate operational cost

Often Used Together

Complementary tools that pair well with Celery


Redis

Database & Cache

93 · Excellent

FastAPI

Backend Frameworks

97 · Excellent

Docker

DevOps & Infra

93 · Excellent

Apache Airflow

Data Engineering

93 · Excellent

Apache Kafka

Data Engineering

92 · Excellent


Get Started

Repository and installation options

View on GitHub

github.com/celery/celery

pip install celery

Quick Start

Copy and adapt to get going fast

from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0', backend='redis://localhost:6379/1')

@app.task
def send_welcome_email(user_id: int):
    # User and email_service stand in for your app's ORM model and mailer
    user = User.objects.get(id=user_id)
    email_service.send_welcome(user.email)

@app.task
def process_upload(file_path: str):
    # run_etl_pipeline stands in for your own processing function
    return run_etl_pipeline(file_path)

# Trigger from your view:
send_welcome_email.delay(user.id)

# Run worker:
# celery -A tasks worker --loglevel=info

Code Examples

Common usage patterns

Periodic tasks (beat scheduler)

Schedule recurring tasks like cron jobs

from celery import Celery
from celery.schedules import crontab

app = Celery('tasks', broker='redis://localhost:6379/0')

app.conf.beat_schedule = {
    'daily-digest': {
        'task': 'tasks.send_daily_digest',
        'schedule': crontab(hour=9, minute=0),
    },
    'cleanup-every-hour': {
        'task': 'tasks.cleanup_temp_files',
        'schedule': 3600.0,
    },
}

# Run the beat scheduler:
# celery -A tasks beat --loglevel=info

Task chaining and groups

Compose tasks into workflows

from celery import chain, group

# Chain: run tasks sequentially
pipeline = chain(
    extract_data.s(source_id),
    transform_data.s(),
    load_to_warehouse.s(),
)
result = pipeline.apply_async()

# Group: run tasks in parallel
jobs = group(process_file.s(f) for f in file_list)
results = jobs.apply_async()

Retry on failure

Automatically retry a task with exponential backoff

from celery import shared_task
import requests

@shared_task(bind=True, max_retries=3, default_retry_delay=60)
def fetch_external_data(self, url: str):
    try:
        response = requests.get(url, timeout=10)
        response.raise_for_status()
        return response.json()
    except requests.RequestException as exc:
        raise self.retry(exc=exc, countdown=2 ** self.request.retries * 60)

Community Notes

Real experiences from developers who've used this tool