I am trying to code some periodic background tasks with Django and Celery.
My tasks work fine: I manage to run them periodically by launching my Celery worker with the beat scheduler:
celery -A parifex worker -B -l info
[Screenshot: tasks discovered by the worker]
As you can see in the screenshot above, my tasks are discovered correctly. However, autodiscovery over INSTALLED_APPS was not working as expected, so I had to pass the path of the app whose tasks I want to find via the include argument when instantiating Celery.
Requirements
pip freeze |grep "celery\|Django"
celery==4.4.1
Django==3.1.2
django-celery==3.3.1
In the base app, where the settings live:
celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'parifex.settings')
# Define the Celery configuration and include <app>.tasks to discover it
app = Celery('parifex', include=['device.tasks'])
# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS, force=True)
@app.task(bind=True)
def debug_task(self):
    print(f'Request: {self.request!r}')
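To double-check what the worker actually picks up from include and autodiscover_tasks, the task registry on the app can be inspected directly; a minimal sketch, assuming it is run from python manage.py shell inside the parifex project:
# List every task registered on the Celery app, skipping Celery's internal ones.
from parifex.celery import app

print([name for name in app.tasks if not name.startswith('celery.')])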
settings.py
import djcelery

djcelery.setup_loader()
CELERYD_HIJACK_ROOT_LOGGER = False
# http://celery.readthedocs.org/en/latest/configuration.html#celery-redirect-stdouts-level
CELERY_REDIRECT_STDOUTS = True  # default
CELERY_REDIRECT_STDOUTS_LEVEL = 'DEBUG'
CELERY_BROKER_URL = 'redis://:[email protected]:6234'
# store schedule in the DB:
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
CELERY_RESULT_BACKEND = 'redis://:[email protected]:6234'
CELERY_TASK_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['json'] # Ignore other content
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Europe/Paris'
CELERY_ENABLE_UTC = True
CELERY_TRACK_STARTED = True
CELERYD_POOL_RESTARTS = True
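Since config_from_object is called with namespace='CELERY', a hardcoded schedule could also live in these settings; a minimal sketch, assuming the standard CELERY_BEAT_SCHEDULE setting and the device.tasks.get_devices_state task shown below:
from celery.schedules import crontab

# Hardcoded beat entry: run get_devices_state every minute.
CELERY_BEAT_SCHEDULE = {
    'get-devices-state-every-minute': {
        'task': 'device.tasks.get_devices_state',
        'schedule': crontab(minute='*/1'),
    },
}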
In an app called device:
tasks.py
from celery import shared_task
from celery.schedules import crontab
from celery.task import periodic_task
from django.core.serializers import serialize
from djcelery.schedulers import DatabaseScheduler, ModelEntry
from .models import Device
from .api.tools import TestDeviceApi
@shared_task
def count_devices():
    return Device.objects.count()

@shared_task(serializer='json', bind=True)
def get_devices_state(self):
    device_status = {}
    print(serialize('json', Device.objects.all()))
    for device in Device.objects.all():
        status = TestDeviceApi(device).ping()
        device_status[device.id] = (serialize('json', [device]), status)
    return device_status

@periodic_task(run_every=(crontab(minute='*/1')))
def print_random():
    print(15)
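The celery.task.periodic_task API is deprecated in Celery 4.x (and removed in 5.x); the same schedule can also be registered through app.add_periodic_task. A minimal sketch, assuming it is added to the parifex/celery.py shown above and that the default beat schedule storage is used:
from celery.schedules import crontab

@app.on_after_finalize.connect
def setup_periodic_tasks(sender, **kwargs):
    # Import inside the handler so Django's app registry is ready
    # before device.models gets imported.
    from device.tasks import print_random
    # Same schedule as the periodic_task decorator: every minute.
    sender.add_periodic_task(crontab(minute='*/1'), print_random.s(),
                             name='print random every minute')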
Well, as I said, my worker discovers the tasks fine, but the Django admin page does not see them.
First question: how can I register my tasks so that I can see them in the admin page when creating periodic tasks there? Or is that useless?
[Screenshot: registered tasks my admin page can see]
Second question: is it possible to see hardcoded periodic tasks in the admin page without installing any other module? Is there a 'built-in' way to do this? Or do I have to save them myself each time the task is called?
My goal is to give the user the ability to change the scheduling of tasks on their own, with a default behaviour in place.
Thanks in advance for your help! I hope I am being clear.
I was having the same problem.
I fixed it by adding these lines to proj/proj/__init__.py:
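The lines in question are presumably the standard snippet from the Celery documentation's Django guide:
# proj/proj/__init__.py
# Make sure the Celery app is always imported when Django starts,
# so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ('celery_app',)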
I am unsure whether this is the best solution or just a workaround, but apparently it makes sure the app is imported when Django starts so that shared_task can use it.