How to activate the process queue in "django-background-tasks"


I am new to Django and the django-background-tasks package.

I am facing an issue: my background tasks do not run unless I manually run the process_tasks command, i.e. python manage.py process_tasks. I want to start background tasks without running the process_tasks command myself.

settings.py

MAX_ATTEMPTS=1
BACKGROUND_TASK_RUN_ASYNC = True

tasks.py

from background_task import background
# included necessary packages for SMTP

@background(schedule=5)
def test():
    # send mail to some ids
    pass

views.py

from django.http import HttpResponse
from .tasks import test

def index(request):
    test(schedule=5)
    return HttpResponse("Hello, world.")

Ignore my logic.


There are 7 answers

Answer from Faiya (0 votes)

You could make a bash script that runs the process_tasks command and then call that script from Python.

To call the script from Python you can use:

import subprocess
subprocess.call("./process_tasks.sh", shell=True)

as explained here: Running bash script from within python

Your bash script may look something like this:

#!/usr/bin/env bash
cd your/virtualenv/directory
source virtualenv/bin/activate
exec python /path/to/manage.py process_tasks

Don't forget to make process_tasks.sh executable (chmod +x process_tasks.sh).

Answer from Pragya Mittal (0 votes)

Run process_tasks in the background, then start your application server:

python manage.py process_tasks &

gunicorn --workers=3 app.wsgi -b 0.0.0.0:8080

Answer from Jed (1 vote)

You can call the manage.py command directly from within your Django project using the code below:

from django.core.management import call_command

call_command('process_tasks')

Check out the docs here.
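Note that call_command('process_tasks') blocks until the runner exits, so you would not call it from a view. One way to use this, sketched below under the assumption that you want the web process itself to consume tasks, is to start the runner in a daemon thread from an AppConfig.ready() hook (myapp and MyAppConfig are placeholder names, not from the original answer):

# myapp/apps.py -- illustrative sketch only
import threading

from django.apps import AppConfig
from django.core.management import call_command

class MyAppConfig(AppConfig):
    name = "myapp"

    def ready(self):
        # Run the task runner in a daemon thread so it does not block
        # the web process and exits together with it.
        threading.Thread(
            target=call_command,
            args=("process_tasks",),
            daemon=True,
        ).start()

Keep in mind that ready() can run more than once (for example under the development autoreloader), so in practice you may want a guard against starting the thread twice.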

Answer from chadgh (4 votes)

That is how django-background-tasks works: you have to run the process_tasks command for the task runner to start processing scheduled tasks.

https://github.com/arteria/django-background-tasks#running-tasks

Answer from clg123 (0 votes)

In case anybody else is trying to get this working: I call a shell script from my Dockerfile that runs two commands, one to start the web server (gunicorn or runserver) and one to run the manage.py process_tasks process. The important thing is to run process_tasks as a background process.

At the bottom of my Dockerfile I have: CMD ["./run_app.sh"]

And my run_app.sh file looks like:

#!/usr/bin/env bash

# start background tasks 
python manage.py process_tasks &

gunicorn --workers=3 app.wsgi -b 0.0.0.0:8080

Note the trailing & on the process_tasks command. This lets the shell script continue and run the gunicorn command.

Hope this helps someone.

Answer from addmoss (0 votes)

You must run the process_tasks command to execute tasks that have been scheduled; there is no other way. process_tasks checks the database regularly for scheduled tasks and executes them in the background.

In a local environment

Open a terminal, cd into your app folder and launch the command python manage.py process_tasks. If you are using a virtualenv make sure you activate it first.
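For example (paths are placeholders; the activation line applies only if you use a virtualenv):

cd /path/to/appname
source /path/to/virtualenv/bin/activate
python manage.py process_tasks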

In a production environment

Option 1: Run a cron job

One solution is to launch the process_tasks command using a cron job. Beware that cron launches a new process at the scheduled interval without caring about the previous processes. You must make sure the process ends by the time the new cron call is scheduled. Otherwise you will be wasting resources on parallel instances of process_tasks. Happily this check is fairly easy to do.

To launch process_tasks for a limited time you can use its duration parameter. For example, to launch it with a life-span of 10 hours you call:

python manage.py process_tasks --duration 36000   

Configure the cron job: In cPanel you can use the Cron Jobs app. Configure the cron job to run hourly and add the following lines as your cron command:

cd /home/username/appname && /home/username/virtualenv/bin/python /home/username/appname/manage.py process_tasks --duration 3540 

This will cd into your app folder and launch the process_tasks command, taking into consideration your virtual environment and using a life-span of 59 minutes for the process_tasks command.

Update the paths to your app and virtual environment accordingly, and adjust the cron job interval and duration parameter to fit your needs.
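Outside cPanel, the equivalent raw crontab entry would look something like this (same placeholder paths as above):

# run hourly; each run lives for at most 59 minutes
0 * * * * cd /home/username/appname && /home/username/virtualenv/bin/python /home/username/appname/manage.py process_tasks --duration 3540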

Benefits:

  • If the command fails it will get restarted by the cron job later.
  • You don't have to worry about resource/memory leaks too much.

Option 2: Launch the command at server startup

Basically you will have to create a script file containing the same launch command and configure your server to run it at startup.

You lose the benefit that your process is restarted from time to time, so you will probably have to monitor its health. If you are going down this path, a good approach is to use supervisord to monitor your process.
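A minimal supervisord program section for this could look like the sketch below (the paths reuse the placeholder layout from the cron example and are assumptions, not part of the original answer):

[program:process_tasks]
command=/home/username/virtualenv/bin/python manage.py process_tasks
directory=/home/username/appname
autostart=true
autorestart=true
stopasgroup=true

With autorestart=true, supervisord brings the worker back up if it crashes, which covers most of the monitoring concern mentioned above.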

Answer from Asif Khan (1 vote)

I am also trying to use django-background-tasks. Everything works perfectly when I run python manage.py process_tasks manually, but my background task never triggers automatically according to @background(schedule=60) (or whatever schedule time I set) unless python manage.py process_tasks is running. What is the point of setting a schedule time for a task if running python manage.py process_tasks is mandatory to trigger all background tasks?