Introduction
Celery is the de facto standard for running background tasks in Python applications. When your web app needs to send emails, process images, generate reports, or run ML inference without blocking the HTTP response, Celery handles it: tasks are sent to a message broker (Redis, RabbitMQ) and executed by worker processes.
With over 28,000 GitHub stars, Celery is used by Instagram, Mozilla, AdRoll, and thousands of Django and Flask applications, processing billions of tasks daily across the Python ecosystem.
What Celery Does
Celery distributes work across multiple worker processes or machines. You define tasks as Python functions with the @task decorator, call them asynchronously with .delay() or .apply_async(), and workers pick them up from the message broker. Results can be stored in Redis, databases, or other backends.
Architecture Overview
[Web Application]
Django, Flask, FastAPI
        |
  task.delay(args)
        |
[Message Broker]
    +--------+--------+
    |                 |
 [Redis]         [RabbitMQ]
 Simple, fast    Feature-rich, reliable
        |
[Celery Workers]
Multiple processes/machines
Concurrency: prefork, eventlet, gevent
        |
[Task Execution]
Retries, rate limits, priority, routing
        |
[Result Backend]
Redis, PostgreSQL, Django ORM, MongoDB

Self-Hosting & Configuration
```python
# celery_app.py — production configuration
from celery import Celery
from celery.schedules import crontab

app = Celery("myproject")

app.config_from_object({
    "broker_url": "redis://localhost:6379/0",
    "result_backend": "redis://localhost:6379/1",
    "task_serializer": "json",
    "result_serializer": "json",
    "accept_content": ["json"],
    "task_track_started": True,
    "task_time_limit": 300,        # 5 min hard limit
    "task_soft_time_limit": 240,   # 4 min soft limit
    "worker_prefetch_multiplier": 1,
    "task_acks_late": True,        # ack after execution
})

# Periodic tasks (Celery Beat)
app.conf.beat_schedule = {
    "cleanup-every-hour": {
        "task": "tasks.cleanup_old_records",
        "schedule": crontab(minute=0),
    },
    "daily-report": {
        "task": "tasks.generate_daily_report",
        "schedule": crontab(hour=6, minute=0),
    },
}
```
```python
@app.task(bind=True, max_retries=3, default_retry_delay=60)
def process_order(self, order_id):
    try:
        order = get_order(order_id)
        charge_payment(order)
        send_confirmation(order)
        return {"status": "completed", "order_id": order_id}
    except PaymentError as exc:
        raise self.retry(exc=exc)
```

```shell
# Start worker with concurrency
celery -A celery_app worker --concurrency=8 --loglevel=info

# Start beat scheduler
celery -A celery_app beat --loglevel=info

# Monitor with Flower
pip install flower
celery -A celery_app flower --port=5555
```

Key Features
- Async Tasks — run functions in background workers via .delay()
- Scheduled Tasks — cron-like periodic task execution (Celery Beat)
- Retries — automatic retries via autoretry_for, with optional exponential backoff (retry_backoff)
- Rate Limiting — control task execution rate per worker or globally
- Task Routing — send specific tasks to specific worker queues
- Monitoring — Flower web UI for real-time worker monitoring
- Canvas — chain, group, chord for complex workflow patterns
- Multiple Brokers — Redis, RabbitMQ, Amazon SQS
Comparison with Similar Tools
| Feature | Celery | RQ (Redis Queue) | Dramatiq | Huey | Temporal |
|---|---|---|---|---|---|
| Complexity | Moderate | Very Low | Low | Very Low | High |
| Brokers | Redis, RabbitMQ, SQS | Redis only | Redis, RabbitMQ | Redis, SQLite | Custom |
| Periodic Tasks | Celery Beat | rq-scheduler | APScheduler | Built-in | Built-in |
| Monitoring | Flower | rq-dashboard | Built-in | Minimal | Built-in UI |
| Canvas/Workflows | Yes | No | Middleware | Minimal | Core feature |
| Scale | Very Large | Medium | Large | Small | Very Large |
| Best For | Production Python | Simple jobs | Modern Python | Simple apps | Microservices |
FAQ
Q: Redis or RabbitMQ as broker? A: Redis for simplicity and when you already use Redis. RabbitMQ for reliability guarantees, complex routing, and when task loss is unacceptable. Redis is the more common choice.
Q: How do I handle task failures? A: Use max_retries and default_retry_delay for automatic retries. Use task_acks_late=True so tasks return to the queue if a worker crashes. Monitor failures with Flower or Sentry.
Q: Can Celery handle millions of tasks? A: Yes. Instagram processes billions of Celery tasks. Scale horizontally by adding more workers. Use task routing to dedicate workers to different queues.
Q: Celery vs Temporal — which should I use? A: Celery for straightforward background tasks in Python apps. Temporal for complex, long-running workflows that span multiple services with state management and compensation logic.
Sources
- GitHub: https://github.com/celery/celery
- Documentation: https://docs.celeryq.dev
- Created by Ask Solem
- License: BSD-3-Clause