Setting Up Redis And Celery To Work With Django On Windows (Asynchronous Execution)

Archit Narain G
4 min read · Jan 16, 2022


Hello! In this article I will help you set up Redis and Celery so that your Django app can run tasks asynchronously.

Introduction

Before we get into the steps, it is important to know what Redis and Celery do. Celery is a distributed task queue: it executes work outside the request/response cycle in separate worker processes. Redis acts as the message broker: the Django application pushes messages (describing the work to be done) onto Redis, and the Celery workers pick them up and execute them.

Project

For this article, I will be scraping a table from a website and storing it as a CSV. Celery tasks are typically used for things like sending emails, generating reports, or other long-running jobs.

Installation (Redis)

So first, we will set up Redis. To download Redis, click on the following link - Redis. Once you click the link, a .msi file will be downloaded. Open the file and follow the steps given by the setup wizard. Make a note of the installation location if you change it; the default is C:(or any other drive)\Program Files\Redis.

Now to run the server, open a command prompt, navigate to the Redis install location, and run:

redis-server redis.windows.conf

After this, Redis should be running. If instead you see an error such as
“Creating Server TCP listening socket <port>: bind: No error”

Then run redis-cli.exe and issue the shutdown command at its prompt:

redis-cli.exe
127.0.0.1:<port>> shutdown
not connected>

After you run shutdown, press Ctrl + C to exit the CLI. Now try running the server again and it should start without errors.
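To confirm the server is actually reachable, you can open a second command prompt in the Redis folder and run:

redis-cli ping

A running server replies with PONG.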

Installation (Celery)

Installing Celery is pretty simple as long as you have Python and pip set up. To install Celery, run:

pip install celery

and it should be installed.
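One thing worth noting: because we are using Redis as the broker and result backend, Celery also needs the Redis client library for Python. If the worker later complains that the redis module is missing, install it with:

pip install redis

(or pip install "celery[redis]", which installs both in one go).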

Setting Up Celery In Django

To set up Celery in Django, follow the given steps:

Step 1: Make a file called celery.py in the same folder where settings.py is located and paste the below code into it. Save celery.py once you are done.

from __future__ import absolute_import
import os
from celery import Celery
from django.conf import settings
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', '<app_name>.settings')
app = Celery('<app_name>')
# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

Note: You will have to replace <app_name> with the name of the folder in which settings.py is located.

Step 2: Paste the below code at the end of the settings.py file:

# CELERY
BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_EAGER_PROPAGATES_EXCEPTIONS = True
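A quick note on these names: because celery.py loads the configuration with app.config_from_object('django.conf:settings') and no namespace, Celery reads the old-style setting names such as BROKER_URL. If you prefer the newer convention, pass a namespace instead and prefix every setting with CELERY_ (for example BROKER_URL becomes CELERY_BROKER_URL):

app.config_from_object('django.conf:settings', namespace='CELERY')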

Step 3: Open the __init__.py located in the same folder where you made celery.py and paste the following code into it:

from __future__ import absolute_import

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ('celery_app',)

Step 4: Make a file called tasks.py in the same folder as views.py. This is the file in which we will add the functions that we wish to make asynchronous. In our case, this is where we fetch a table from a website and store it as a CSV.

# tasks.py
from __future__ import absolute_import

import pandas as pd
import requests
from bs4 import BeautifulSoup
from celery import shared_task


@shared_task
def gen_csv():
    url = "https://en.wikipedia.org/wiki/List_of_countries_and_dependencies_by_population"
    html_doc = requests.get(url).text
    soup = BeautifulSoup(html_doc, 'html.parser')
    tables = soup.find_all('table')
    rows = []
    for row in tables[0].tbody.find_all("tr"):
        col = row.find_all('td')
        if col:
            rows.append({
                'Country': col[0].text,
                'Region': col[1].text,
                'Population': col[2].text,
                'Percentage': col[3].text,
            })
    # DataFrame.append has been removed in recent pandas versions,
    # so build the DataFrame from the collected rows in one go.
    list_of_countries = pd.DataFrame(rows, columns=['Country', 'Region', 'Population', 'Percentage'])
    list_of_countries.to_csv('test.csv')
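Once the worker is running (see the Testing section below), you can sanity-check the task from the Django shell (python manage.py shell) before wiring it into a view. A minimal sketch; the 60-second timeout is just an assumption:

from <app_name>.tasks import gen_csv

result = gen_csv.delay()       # queue the task; returns an AsyncResult immediately
print(result.status)           # e.g. PENDING, STARTED, SUCCESS
print(result.get(timeout=60))  # block until the worker finishes (gen_csv returns None)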

Step 5: views.py is the place where we will be calling the function we just made, so open views.py and import the task we made above.

from <app_name>.tasks import gen_csv
from django.http import JsonResponse

Above, instead of <app_name>, use the name of the folder in which views.py is located.

Step 6: To call the function, use the below code:

def main_function(request):
    # .delay() places the task on the Celery queue and returns immediately,
    # so the request does not wait for the CSV to be generated.
    gen_csv.delay()
    return JsonResponse({"message": "Your CSV is being generated"})
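For the view to be reachable from the browser, it also needs a URL pattern. A minimal urls.py sketch could look like the following; the empty route and module names are assumptions, so adjust them to your project layout:

# urls.py
from django.urls import path
from <app_name> import views

urlpatterns = [
    path('', views.main_function),
]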

Testing

First, run the redis-server command on the command prompt to start the server.

Then run python manage.py runserver in the project folder.

Finally, open another command prompt in the project folder and run:

celery --app=<app_name> worker --loglevel=INFO

Note: Replace <app_name> with the name of the folder in which settings.py is located
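Note: Celery 4 and later no longer officially supports Windows, so if the worker starts but the task never seems to execute, try forcing the single-process pool:

celery --app=<app_name> worker --loglevel=INFO --pool=solo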

Then open 127.0.0.1:8000 and call the application. When the URL is hit, the user should only see “Your CSV is being generated”, but in the background the file gets made and saved.

What Users see

Let’s look at the table generated

Looks good. We have successfully made our task asynchronous. I have stored this project on GitHub; if you wish to download it, here is the link. Download the .zip file and extract it. All the changes have been made and the task runs asynchronously.

Thanks For Reading
