Celery: Mastering Background Tasks in Django & Python

If you're a backend developer working with Django or Python APIs, you’ve likely faced this problem:

“Why is my API slow when sending emails, processing reports, or calling third-party APIs?”

The answer? You’re doing heavy work inside the request-response cycle.

That’s where Celery comes in.

This blog series will take you from beginner to advanced level in using Celery with Django, Redis, and Docker.

What is Celery?

Celery is a distributed task queue system used to handle asynchronous and background tasks.

Instead of making users wait, Celery:

  • Executes tasks in the background

  • Improves API performance

  • Handles scheduled jobs

  • Scales horizontally

Real-World Example

Without Celery:

def create_user(request):
    user = User.objects.create(...)
    send_welcome_email(user.email)  # Slow: blocks the response
    return Response({"message": "User created"})

The user waits until the email is sent.

With Celery:

def create_user(request):
    user = User.objects.create(...)
    send_welcome_email.delay(user.email)  # Runs in the background
    return Response({"message": "User created"})

The user gets an instant response.
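
For this to work, send_welcome_email must be defined as a Celery task. A minimal sketch (the subject, message, and sender address are illustrative, not from the original):

# app/tasks.py

from celery import shared_task
from django.core.mail import send_mail

@shared_task
def send_welcome_email(email):
    # Runs inside a Celery worker, not the web process
    send_mail(
        subject="Welcome!",
        message="Thanks for signing up.",
        from_email="noreply@example.com",
        recipient_list=[email],
    )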

When Should You Use Celery?

✔ Sending emails
✔ Generating reports (PDF/Excel)
✔ Processing images/videos
✔ Calling third-party APIs
✔ Data syncing
✔ Scheduled tasks (cron jobs)

Setting Up Celery with Django & Redis

Celery needs:

  • A message broker

  • A result backend (optional)

The most common broker is Redis.

Installation

pip install celery redis

Django Project Setup

Inside your Django project:

project/celery.py

import os
from celery import Celery

# Tell Celery where to find the Django settings module
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "project.settings")

app = Celery("project")
# Read all CELERY_* settings from settings.py
app.config_from_object("django.conf:settings", namespace="CELERY")
# Discover tasks.py modules in all installed apps
app.autodiscover_tasks()
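
The Celery docs for Django also recommend importing the app in project/__init__.py so it is loaded whenever Django starts:

# project/__init__.py

from .celery import app as celery_app

__all__ = ("celery_app",)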

settings.py

CELERY_BROKER_URL = "redis://localhost:6379/0"
CELERY_ACCEPT_CONTENT = ["json"]
CELERY_TASK_SERIALIZER = "json"
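
If you want to read task results later (the optional result backend mentioned above), you can point it at Redis as well:

CELERY_RESULT_BACKEND = "redis://localhost:6379/0"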

Create Task

# app/tasks.py

from celery import shared_task

@shared_task
def add(x, y):
    return x + y

Run Worker

celery -A project worker --loglevel=info
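
Then queue the task from any Django code or shell. A minimal sketch, assuming the result backend above is configured:

from app.tasks import add

result = add.delay(4, 6)       # Returns immediately with an AsyncResult
print(result.get(timeout=10))  # 10, fetched from the result backend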

Celery Task States, Retries & Error Handling

Celery task states (a quick way to inspect them follows the list):

  • PENDING

  • STARTED

  • SUCCESS

  • FAILURE

  • RETRY
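
You can check where a task currently sits via its AsyncResult. The task id below is a placeholder; note that STARTED is only reported when the task_track_started setting is enabled:

from celery.result import AsyncResult

res = AsyncResult("some-task-id")  # Id from an earlier .delay() call (illustrative)
print(res.state)   # e.g. "PENDING", "RETRY", "SUCCESS"
print(res.result)  # Return value on success, or the raised exception on failure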

Retry Example

import requests

from celery import shared_task

@shared_task(bind=True, max_retries=3)
def call_api(self):
    try:
        response = requests.get("https://example.com", timeout=10)
        response.raise_for_status()
    except Exception as exc:
        # Re-queue the task, waiting 5 seconds before the next attempt
        raise self.retry(exc=exc, countdown=5)

The task automatically retries up to 3 times, waiting 5 seconds between attempts.
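
Celery can also retry declaratively, without the try/except, via autoretry_for with optional exponential backoff. A sketch (the function name is illustrative):

@shared_task(autoretry_for=(Exception,), retry_backoff=True, max_retries=3)
def call_api_auto():
    # Retries on any exception with exponential backoff (roughly 1s, 2s, 4s)
    requests.get("https://example.com", timeout=10).raise_for_status()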

Periodic Tasks with Celery Beat

Want scheduled jobs like cron?

Use: Celery Beat

Example

CELERY_BEAT_SCHEDULE = {
    "send-daily-report": {
        "task": "app.tasks.daily_report",
        "schedule": 86400.0,  # Every 24 hours, in seconds
    },
}

Run Beat alongside a worker (Beat only queues the scheduled tasks; a worker still executes them):

celery -A project beat
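
For calendar-style schedules, celery.schedules.crontab is clearer than raw seconds. A sketch reusing the task path from the example above:

from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    "send-daily-report": {
        "task": "app.tasks.daily_report",
        "schedule": crontab(hour=7, minute=0),  # Every day at 07:00
    },
}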

Use cases:

  • Daily email summary

  • Auto data cleanup

  • Scheduled backups

Celery + Docker + Redis Production Setup

In real projects, Celery runs in separate containers.

docker-compose.yml

services:
  web:
    build: .
    command: gunicorn project.wsgi
    depends_on:
      - redis

  redis:
    image: redis:7

  celery:
    build: .
    command: celery -A project worker --loglevel=info
    depends_on:
      - redis
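
If you also run Celery Beat, it typically gets its own container following the same pattern (a sketch, not part of the original file):

  beat:
    build: .
    command: celery -A project beat --loglevel=info
    depends_on:
      - redis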

Advanced Celery Concepts

1️⃣ Task Chaining

from celery import chain

# Each task's return value is passed as the first argument to the next
chain(task1.s(), task2.s(), task3.s())()

2️⃣ Groups (Parallel Execution)

from celery import group

# All ten task invocations run in parallel across available workers
group(task.s(i) for i in range(10))()

3️⃣ Chords (Group + Callback)

from celery import chord

# callback runs once, receiving the list of results from the group
chord([task1.s(), task2.s()])(callback.s())

4️⃣ Rate Limiting

@shared_task(rate_limit="10/m")
def send_email():
    pass

Limits execution to 10 tasks per minute, enforced per worker instance (not globally across workers).

5️⃣ Task Time Limits

@shared_task(time_limit=30)
def long_running_task():
    pass

This is a hard limit: the process running the task is killed after 30 seconds. For a catchable alternative, soft_time_limit raises SoftTimeLimitExceeded inside the task instead.

Celery Performance Optimization

✅ Use JSON serializer

✅ Avoid large objects in task arguments

✅ Use Redis memory limits

✅ Scale workers horizontally

✅ Use separate queues

Multiple Queues Example

@shared_task(queue="emails")
def send_email():
    pass

@shared_task(queue="reports")
def generate_report():
    pass

Run worker for specific queue:

celery -A project worker -Q emails
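
Alternatively, routing can live in settings instead of on each task. A sketch using the CELERY_ settings namespace from the setup above, with task paths assumed to match this project's layout:

CELERY_TASK_ROUTES = {
    "app.tasks.send_email": {"queue": "emails"},
    "app.tasks.generate_report": {"queue": "reports"},
}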

