Django Celery implements asynchronous tasks and runs the worker as a daemon in the background

  • 2021-10-11 18:54:13
  • OfStack

No preamble, straight to the code.

Environmental description:

python3.6

django2.0.5

We use Redis as the Celery task queue. There is a combined package that installs both dependencies at once, so only one package needs to be installed.

Type directly at the terminal


pip install celery-with-redis

This installs the required dependency packages.

We skip project scaffolding and go straight to the Celery configuration.

1. celery configuration.

Our project is named myproject. First, add the following to settings.py:


# celery settings
# Celery broker: redis://<ip of the redis host>:<port>/<database number>
BROKER_URL = 'redis://localhost:6379/3'
# Celery result backend, which can be used to track task results
CELERY_RESULT_BACKEND = 'redis://localhost:6379/3'

# Celery message serialization formats
CELERY_ACCEPT_CONTENT = ['application/json', ]
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'

# Celery time zone: use the same TIME_ZONE as in settings
CELERY_TIMEZONE = TIME_ZONE

Then initialize Celery by creating celery.py in PATH/myproject/myproject/, that is, in the same directory as settings.py.


from __future__ import absolute_import, unicode_literals

from celery import Celery
from django.conf import settings
import os

# Get the name of the current folder, which is the Django project name
project_name = os.path.split(os.path.abspath('.'))[-1]
project_settings = '%s.settings' % project_name

# Set the environment variable
os.environ.setdefault('DJANGO_SETTINGS_MODULE', project_settings)

# Instantiate Celery
app = Celery(project_name)

# Configure Celery from Django's settings file
app.config_from_object('django.conf:settings')

# Let Celery auto-discover tasks in all registered applications
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

The `from __future__` import must remain the very first line of the file, otherwise an error will be raised.
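As a quick sanity check, the folder-name trick used above can be verified with plain Python (the path below is just an example):

```python
import os

# os.path.split() splits a path into (head, tail); [-1] takes the last component,
# so for a project living at /home/user/myproject this yields 'myproject'.
project_name = os.path.split('/home/user/myproject')[-1]
print(project_name)
```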

The Celery app instance will be imported elsewhere; to make importing easy we expose it in __init__.py, so add the following to /myproject/myproject/__init__.py:


from __future__ import absolute_import, unicode_literals
 
#  Introduce celery Instance object 
from .celery import app as celery_app

This also makes Django aware of the celery.py file.

2. Decorate our asynchronous functions with celery.

Create a celery_tasks module under the project root (that is, under PATH/myproject/), then create tasks.py inside it and put the time-consuming code there.


from myproject import celery_app
import time
 
@celery_app.task
def time_consuming_fun():
  for i in range(5):
    time.sleep(1)
    print(i)
  return 'ok'

Decorating a function with the task method of celery_app marks it for asynchronous processing.
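For intuition only, here is a toy decorator (not Celery's implementation) that mimics the `delay()` calling convention with a thread:

```python
import threading

def task(func):
    """Toy stand-in for Celery's @app.task: attaches a delay() that runs func in a thread."""
    def delay(*args, **kwargs):
        result = {}
        def run():
            result['value'] = func(*args, **kwargs)
        t = threading.Thread(target=run)
        t.start()
        t.join()  # a real broker would not block here; this only illustrates the API shape
        return result['value']
    func.delay = delay
    return func

@task
def add(x, y):
    return x + y

print(add(2, 3))        # normal synchronous call
print(add.delay(2, 3))  # call through the delay() wrapper
```

In real Celery, `delay()` returns immediately with an AsyncResult while a separate worker process executes the function.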

3. Call an asynchronous function.

Call it in a view; here we use a Django class-based view.


from celery_tasks.tasks import time_consuming_fun
from django.views import View
from django.http import JsonResponse
 
# Create your views here.
 
class MyView(View):
  def get(self, request):
    # Asynchronous invocation
    time_consuming_fun.delay()
    # Direct (synchronous) call
    # time_consuming_fun()
    return JsonResponse({'msg': 'ok', 'code': 200})

Then configure the URL.
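For completeness, a minimal URL configuration might look like this (the app name `myapp` and the route path are illustrative, not from the original project):

```python
# myproject/urls.py -- app name and route path are illustrative
from django.urls import path
from myapp.views import MyView

urlpatterns = [
    path('my-view/', MyView.as_view()),
]
```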

4. Start celery.

Under the project root directory (the directory containing manage.py), enter the command:

celery -A myproject worker -l info

At this point, celery runs in the terminal window, and closing the terminal celery will stop.

Input command


celery multi start w1 -A myproject -l info --logfile=celerylog.log --pidfile=celerypid.pid

At this point, celery runs as a daemon, and the log is recorded in celerylog.log.

The log file location can be given as a path, e.g. PATH/celerylog.log, and the log file is created there. The pid file works the same way.

To stop or restart, change start in the command above to stop or restart. Note that you need to remember the worker name (w1 here) to be able to stop or restart it later.

Supplement: keeping the Django project running in the background

Method 1:

1. Enter the project directory and run the following command:


nohup python manage.py runserver 0.0.0.0:5008 &

nohup (no hang up): keeps the command running even after the terminal hangs up

&: runs the command in the background


nohup /root/start.sh &

After pressing Enter, the shell prints:


[~]$ appending output to nohup.out  

The program's standard output is redirected to the nohup.out file in the current directory, which serves as a log.

Note: even after nohup starts successfully, closing the terminal with the window's close button breaks the session behind the command, and the process started by nohup is notified to shut down. So after starting a command in the background with nohup, use exit to leave the shell normally; this ensures the command keeps running in the background.

Method 2: a more elegant approach, using screen

1. Install screen


yum install screen

2. Create a new screen


screen -S xiedi

This opens a new window, in which we execute the command:


python manage.py runserver 0.0.0.0:5008

3. Open another terminal and list all screen sessions, as follows:


[root@docker ~]# screen -ls
There are screens on:
    3029.xiedi  (Attached)

4. To reattach to this session, just run:


screen -r xiedi
