Steps for Using Celery in Django

  • 2021-08-17 00:32:41
  • OfStack

(1) Overview

Celery is a simple, flexible, and reliable distributed task queue focused on real-time processing, with support for scheduled tasks as well, and it ships with the tools needed to operate and maintain such a system. The unit of execution is the task (task), and tasks can run concurrently on one or more workers (worker).

Celery communicates through a messaging mechanism: a message broker (broker) mediates between clients and workers. A client sends a message, the broker routes that message to a worker, and the worker is responsible for executing the task.

Celery can run multiple workers and brokers, which improves availability and horizontal scalability.

Celery is written in Python, but its protocol can be implemented in any language: Celery itself for Python/Django, node-celery for Node.js, and celery-php for PHP.

(2) Process and configuration of Celery in Django

Install Celery: pip3 install celery

Create a celery.py file in the directory with the same name as the project (the package that contains settings.py); note in particular that it goes in that directory, not the project root.
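The resulting layout looks roughly like this (proj stands for your project name, app for any Django app in it):

```
proj/
├── manage.py
├── app/
│   └── tasks.py
└── proj/            # the directory with the same name as the project
    ├── __init__.py
    ├── settings.py
    ├── urls.py
    └── celery.py    # the new file
```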

Copy the following content into the file, then modify two things:

Replace proj in os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings') with your project name, and replace pro in app = Celery('pro') with your project name.

import os

from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('pro')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#  should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):
  print(f'Request: {self.request!r}')

Add the following to the __init__.py file in the directory with the same name as the project:


# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ('celery_app',)

Add the configuration to the settings.py file. The keys mean:

CELERY_BROKER_URL: the broker URL; Redis or RabbitMQ can be configured here
CELERY_RESULT_BACKEND: where task results are stored
CELERY_ACCEPT_CONTENT: accepted content formats, json and msgpack; compared with json, msgpack is smaller and transmits faster
CELERY_TASK_SERIALIZER: serialization format of the task payload, e.g. json
CELERY_TIMEZONE: the timezone Celery uses
CELERY_TASK_TRACK_STARTED: whether to track the "started" state of tasks
CELERY_TASK_TIME_LIMIT: task timeout limit, in seconds

# Celery Configure 
CELERY_BROKER_URL = env("CELERY_BROKER_URL")
CELERY_RESULT_BACKEND = env("CELERY_RESULT_BACKEND")
CELERY_ACCEPT_CONTENT = ["json", "msgpack"]
CELERY_TASK_SERIALIZER = "json"
CELERY_TIMEZONE = "Asia/Shanghai"
CELERY_TASK_TRACK_STARTED = True
CELERY_TASK_TIME_LIMIT = 30 * 60
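The env() calls above presumably come from a helper such as django-environ that reads environment variables; without one, the two URLs can be written as plain literals. A sketch assuming a local Redis instance (the database numbers 1 and 2 are arbitrary choices):

```python
# Literal equivalents of the env() lookups (assuming local Redis;
# broker and result backend use separate Redis databases)
CELERY_BROKER_URL = "redis://127.0.0.1:6379/1"
CELERY_RESULT_BACKEND = "redis://127.0.0.1:6379/2"
```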

Create a tasks.py file under the app and define the task function there; the task must be decorated with @shared_task. A task should be a plain module-level function that returns a serializable value, not a view method:


### tasks.py
import time

from celery import shared_task

@shared_task
def sleep(duration):
  time.sleep(duration)
  return "Success"

Create views and routes. Call the task with .delay() so it is sent to the broker and executed by a worker, instead of running synchronously in the request:


### views.py
from rest_framework.response import Response
from rest_framework.generics import GenericAPIView
from .tasks import sleep

class TestView1(GenericAPIView):
  def get(self, request):
    sleep.delay(10)
    return Response("celery test successful")

test_view_1 = TestView1.as_view()

### urls.py
from django.urls import path
from .views import (
  test_view_1
)

urlpatterns = [
  path('celery/', test_view_1, name="test1")
]

Install Redis and start it.

Start the Django project.

Start the Celery worker with the command: celery -A <project name> worker -l info. Startup succeeded if output like the following appears.


celery@AppledeMacBook-Air.local v5.0.3 (singularity)

Darwin-20.1.0-x86_64-i386-64bit 2020-12-05 20:52:17

[config]
.> app:     drf_email_project:0x7f84a0c4ad68
.> transport:  redis://127.0.0.1:6379/1
.> results:   redis://127.0.0.1:6379/2
.> concurrency: 4 (prefork)
.> task events: OFF (enable -E to monitor tasks in this worker)

[queues]
.> celery      exchange=celery(direct) key=celery


[tasks]
 . drf_email_project.celery.debug_task
 . users.tasks.sleep

[2020-12-05 20:52:18,166: INFO/MainProcess] Connected to redis://127.0.0.1:6379/1
[2020-12-05 20:52:18,179: INFO/MainProcess] mingle: searching for neighbors
[2020-12-05 20:52:19,212: INFO/MainProcess] mingle: all alone
[2020-12-05 20:52:19,248: WARNING/MainProcess] /Users/apple/drf-email/lib/python3.7/site-packages/celery/fixups/django.py:204: UserWarning: Using settings.DEBUG leads to a memory
      leak, never use this setting in production environments!

