
Job queues with Django, Celery & Redis

Job queues are an essential part of modern web applications: they let you run long-running or resource-intensive tasks asynchronously, outside the request/response cycle. Django, being a popular web framework, has several libraries that make it easy to add a job queue to your application. In this tutorial, we will use Celery as the task queue and Redis as the message broker.

Step 1: Install Required Libraries

The first step is to install the required libraries using pip. In this case, we need Celery and the redis Python client (the Redis server itself is installed separately, in Step 3).

pip install celery redis

Step 2: Configure Celery

Create a file named celery.py in your Django project package (the directory that contains settings.py) and add the following code:

import os
from celery import Celery

# Set the default Django settings module for the celery command-line program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'yourproject.settings')

app = Celery('yourproject')

# Read any CELERY_-prefixed settings from Django's settings.py.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Auto-discover tasks.py modules in every installed Django app.
app.autodiscover_tasks()
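
To make sure this app is loaded when Django starts, the Celery docs also recommend importing it in your project package's __init__.py:

from .celery import app as celery_app

__all__ = ('celery_app',)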

Step 3: Configure Redis

Make sure Redis is installed and running on your system. In your Django project’s settings.py file, add the following settings to configure Redis as both the message broker and the result backend (the result backend is what lets you check task status later, in Step 6):

CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
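
If you don’t have Redis running locally yet, one quick way to start a server is with Docker (assuming Docker is installed; this is just one option):

docker run -d -p 6379:6379 redis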

Step 4: Create a Task

Create a file named tasks.py in your Django app directory and add the following code:

from celery import shared_task

@shared_task
def process_data(data):
    # Do something with the data
    return 'Processed data'
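
Celery tasks can also retry themselves on failure. As a slightly more involved sketch (the task name and the processing step here are purely illustrative):

from celery import shared_task

@shared_task(bind=True, max_retries=3)
def process_data_with_retry(self, data):
    try:
        # Illustrative "processing"; replace with your real work
        return data.upper()
    except Exception as exc:
        # Re-queue this task, waiting 5 seconds between attempts
        raise self.retry(exc=exc, countdown=5)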

Step 5: Trigger the Task

To trigger the task, create a view in your Django app that accepts a POST request and calls the Celery task.

from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt
from .tasks import process_data

@csrf_exempt
def process_data_view(request):
    if request.method == 'POST':
        data = request.POST.get('data')
        # delay() enqueues the task and returns immediately with an AsyncResult
        result = process_data.delay(data)
        return JsonResponse({'task_id': result.id})
    return JsonResponse({'error': 'POST required'}, status=405)
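
delay() is shorthand for apply_async() with default options. If you need more control, for example scheduling the task to run later, apply_async takes extra arguments (a sketch):

result = process_data.apply_async(args=[data], countdown=10)  # run ~10 seconds from now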

Step 6: Monitor the Task

To monitor the task, you can create another view that accepts a GET request and returns the status of the task.

from django.http import JsonResponse
from .tasks import process_data

def task_status_view(request, task_id):
    result = process_data.AsyncResult(task_id)
    if result.successful():
        return JsonResponse({'status': 'done', 'result': result.get()})
    elif result.failed():
        # result.get() would re-raise the task's exception here, so report failure instead
        return JsonResponse({'status': 'failed'})
    return JsonResponse({'status': 'pending'})
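
For the two views to be reachable you also need URL routes, which this tutorial doesn’t show. A minimal urls.py sketch, assuming the views live in your app’s views.py and using arbitrarily chosen paths:

from django.urls import path

from .views import process_data_view, task_status_view

urlpatterns = [
    path('process/', process_data_view),
    path('status/<str:task_id>/', task_status_view),
]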

Step 7: Start the Celery Worker

Finally, start the Celery worker from your project root (the directory containing manage.py) by running the following command in your terminal:

celery -A yourproject worker -l info

That’s it! You now have a working job queue in your Django application. When you trigger the process_data_view view, it will add the task to the Celery queue, and the process_data task will be executed asynchronously. You can then monitor the status of the task using the task_status_view.
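
To try it end to end, assuming the Django dev server is running on port 8000 and you used the example routes from the urls.py sketch above:

curl -X POST -d 'data=hello' http://localhost:8000/process/
# {"task_id": "..."}

curl http://localhost:8000/status/<task_id>/
# {"status": "done", "result": "Processed data"}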
