Apr 13, 2026 · 3 min read

Celery — Distributed Task Queue for Python

Celery is the most popular distributed task queue for Python. It processes millions of tasks per day at companies worldwide, handling background jobs, scheduled tasks, and real-time processing with support for multiple message brokers and result backends.

Script Depot · Community
Quick Use

Use it first, then decide how deep to go

Copy the commands below to install Celery, define two example tasks, start a worker, and call the tasks asynchronously from a Python shell.

# Install Celery with Redis broker
pip install "celery[redis]"

# Create tasks.py
cat > tasks.py << 'PYEOF'
from celery import Celery

app = Celery("tasks", broker="redis://localhost:6379/0", backend="redis://localhost:6379/1")

@app.task
def add(x, y):
    return x + y

@app.task
def send_email(to, subject, body):
    # simulate email sending
    import time; time.sleep(2)
    return f"Email sent to {to}"
PYEOF

# Start a worker (this blocks the terminal; run it in its own shell)
celery -A tasks worker --loglevel=info

# Call tasks from Python
# >>> from tasks import add, send_email
# >>> result = add.delay(4, 6)
# >>> result.get()  # 10
# >>> send_email.delay("user@example.com", "Hello", "World")

Introduction

Celery is the standard solution for running background tasks in Python applications. When your web app needs to send emails, process images, generate reports, or run ML inference without blocking the HTTP response, Celery handles it. Tasks are sent to a message broker (Redis, RabbitMQ) and executed by worker processes.

With tens of thousands of GitHub stars, Celery is used in production by companies such as Instagram, Mozilla, and AdRoll, and by thousands of Django and Flask applications across the Python ecosystem.

What Celery Does

Celery distributes work across multiple worker processes or machines. You define tasks as Python functions with the @app.task decorator, call them asynchronously with .delay() or .apply_async(), and workers pick them up from the message broker. Results can be stored in Redis, databases, or other backends.

Architecture Overview

[Web Application]
Django, Flask, FastAPI
        |
   task.delay(args)
        |
   [Message Broker]
+-------+-------+
|               |
[Redis]      [RabbitMQ]
Simple,      Feature-rich,
fast         reliable
        |
   [Celery Workers]
   Multiple processes/machines
   Concurrency: prefork, eventlet, gevent
        |
   [Task Execution]
   Retries, rate limits,
   priority, routing
        |
   [Result Backend]
   Redis, PostgreSQL,
   Django ORM, MongoDB

Self-Hosting & Configuration

# celery_app.py — production configuration
from celery import Celery
from celery.schedules import crontab

app = Celery("myproject")

app.config_from_object({
    "broker_url": "redis://localhost:6379/0",
    "result_backend": "redis://localhost:6379/1",
    "task_serializer": "json",
    "result_serializer": "json",
    "accept_content": ["json"],
    "task_track_started": True,
    "task_time_limit": 300,  # 5 min hard limit
    "task_soft_time_limit": 240,  # 4 min soft limit
    "worker_prefetch_multiplier": 1,
    "task_acks_late": True,  # ack after execution
})

# Periodic tasks (Celery Beat)
app.conf.beat_schedule = {
    "cleanup-every-hour": {
        "task": "tasks.cleanup_old_records",
        "schedule": crontab(minute=0),
    },
    "daily-report": {
        "task": "tasks.generate_daily_report",
        "schedule": crontab(hour=6, minute=0),
    },
}

@app.task(bind=True, max_retries=3, default_retry_delay=60)
def process_order(self, order_id):
    # get_order, charge_payment, send_confirmation, and PaymentError
    # are application-specific stand-ins
    try:
        order = get_order(order_id)
        charge_payment(order)
        send_confirmation(order)
        return {"status": "completed", "order_id": order_id}
    except PaymentError as exc:
        raise self.retry(exc=exc)

# Start worker with concurrency
celery -A celery_app worker --concurrency=8 --loglevel=info

# Start beat scheduler
celery -A celery_app beat --loglevel=info

# Monitor with Flower
pip install flower
celery -A celery_app flower --port=5555

Key Features

  • Async Tasks — run functions in background workers via .delay()
  • Scheduled Tasks — cron-like periodic task execution (Celery Beat)
  • Retries — automatic retry with exponential backoff
  • Rate Limiting — control task execution rate per worker or globally
  • Task Routing — send specific tasks to specific worker queues
  • Monitoring — Flower web UI for real-time worker monitoring
  • Canvas — chain, group, chord for complex workflow patterns
  • Multiple Brokers — Redis, RabbitMQ, Amazon SQS

Comparison with Similar Tools

Feature           Celery                RQ (Redis Queue)   Dramatiq          Huey           Temporal
Complexity        Moderate              Very Low           Low               Very Low       High
Brokers           Redis, RabbitMQ, SQS  Redis only         Redis, RabbitMQ   Redis, SQLite  Custom
Periodic Tasks    Celery Beat           rq-scheduler       APScheduler       Built-in       Built-in
Monitoring        Flower                rq-dashboard       Built-in          Minimal        Built-in UI
Canvas/Workflows  Yes                   No                 Middleware        Minimal        Core feature
Scale             Very Large            Medium             Large             Small          Very Large
Best For          Production Python     Simple jobs        Modern Python     Simple apps    Microservices

FAQ

Q: Redis or RabbitMQ as broker? A: Redis for simplicity and when you already use Redis. RabbitMQ for reliability guarantees, complex routing, and when task loss is unacceptable. Redis is the more common choice.

Q: How do I handle task failures? A: Use max_retries and default_retry_delay for automatic retries. Use task_acks_late=True so tasks return to the queue if a worker crashes. Monitor failures with Flower or Sentry.

Q: Can Celery handle millions of tasks? A: Yes. Instagram processes billions of Celery tasks. Scale horizontally by adding more workers. Use task routing to dedicate workers to different queues.

Q: Celery vs Temporal — which should I use? A: Celery for straightforward background tasks in Python apps. Temporal for complex, long-running workflows that span multiple services with state management and compensation logic.
