Scripts · Apr 13, 2026 · 3 min read

Celery — Distributed Task Queue for Python

Celery is the most popular distributed task queue for Python. It processes millions of tasks per day at companies worldwide, handling background jobs, scheduled tasks, and real-time processing with support for multiple message brokers and result backends.

TL;DR
Celery processes background jobs, scheduled tasks, and real-time workloads in Python with support for multiple message brokers.
§01

What it is

Celery is the most widely used distributed task queue for Python. It handles background job processing, scheduled periodic tasks, and real-time message processing with support for multiple message brokers (RabbitMQ, Redis, Amazon SQS) and result backends (Redis, PostgreSQL, MongoDB).

It targets Python developers who need to offload time-consuming work (sending emails, processing images, running ML inference, calling external APIs) from their web request cycle into asynchronous background workers.

§02

How it saves time or tokens

Without Celery, long-running operations block web requests, causing timeouts and poor user experience. Celery moves these operations to separate worker processes that execute independently. The web application returns immediately while the task runs in the background. Celery Beat handles periodic scheduling, replacing cron jobs with a more manageable, code-defined system.

§03

How to use

  1. Install Celery: pip install "celery[redis]" (with Redis as both broker and result backend; the quotes keep shells like zsh from expanding the brackets).
  2. Define tasks as decorated functions with @app.task.
  3. Start a worker process with celery -A myapp worker and call tasks with .delay() or .apply_async().
§04

Example

# tasks.py
import time

from celery import Celery

app = Celery('myapp', broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/1')

@app.task
def send_email(to, subject, body):
    time.sleep(5)  # simulate slow email delivery
    return f'Email sent to {to}'

# In your web application:
result = send_email.delay('user@example.com', 'Hello', 'World')
print(result.id)  # Task ID for tracking
print(result.get(timeout=30))  # Wait for result
§05

Common pitfalls

  • Celery tasks must be serializable. Avoid passing database connections, file handles, or other non-serializable objects as task arguments.
  • Memory leaks in long-running workers accumulate over time. Use the --max-tasks-per-child flag to restart workers after a set number of tasks.
  • Celery Beat (the periodic task scheduler) should run as a single instance. Running multiple Beat processes causes duplicate task execution.

Frequently Asked Questions

What message broker should I use with Celery?

RabbitMQ is the recommended production broker. It provides reliable message delivery, supports acknowledgments, and handles high throughput. Redis works well for simpler setups and doubles as a result backend. Amazon SQS is suitable for AWS deployments.

How do I monitor Celery workers?

Flower is the most popular Celery monitoring tool. It provides a web dashboard showing active workers, task history, success/failure rates, and resource usage. Install it with pip install flower and run it alongside your workers.

Can Celery handle periodic scheduled tasks?

Yes. Celery Beat is the built-in periodic task scheduler. Define schedules in your Celery configuration using crontab expressions or timedelta intervals. Beat sends tasks to the queue at the specified times, and workers process them normally.

How does Celery handle task failures?

Celery supports automatic retries with configurable delays and maximum retry counts. Use the retry_backoff and max_retries parameters on tasks. Failed tasks can also be routed to a dead-letter queue for manual inspection.

Is Celery suitable for real-time processing?

Celery handles near-real-time processing well for tasks that take seconds to minutes. For sub-millisecond latency requirements, a message streaming platform like Kafka is more appropriate. Celery excels at background job processing and scheduled task execution.
