I have created a custom management command in a Django application to delete all but one record in a database table:
from django.core.management.base import BaseCommand

from tokenizer.models import OauthToken


class Command(BaseCommand):
    help = 'Deletes all but the most recent oauthtoken'

    def handle(self, *args, **options):
        latest_token_id = OauthToken.objects.latest("gen_time").id
        OauthToken.objects.exclude(id=latest_token_id).delete()
and it works as expected when run manually, like so:
python manage.py oauth_table_clearout
However, when I have a Celery task execute it, the task appears to be picked up and succeeds, but the records are not deleted from the db and there are no obvious errors.
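For reference, the task is a thin wrapper around the management command, along these lines (a sketch; the exact body of my tasks.py may differ slightly, but it boils down to a call_command invocation):

from celery import shared_task
from django.core.management import call_command


@shared_task
def oauth_db_clearout_task():
    # Run the same management command that works when invoked manually.
    call_command("oauth_table_clearout")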
I am running docker-compose like so:
version: '3.7'

services:
  redis:
    image: redis:alpine
  django:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    env_file:
      - ./.env
    depends_on:
      - redis
  celery:
    build: .
    command: celery -A token_generator worker -l debug --without-gossip --without-mingle --without-heartbeat -Ofair --pool=solo
    volumes:
      - .:/usr/src/app/
    depends_on:
      - redis
    env_file:
      - ./.env
  celery-beat:
    build: .
    command: celery -A token_generator beat -l debug
    volumes:
      - .:/usr/src/app/
    depends_on:
      - redis
    env_file:
      - ./.env
Note that I have already tried appending '--without-gossip --without-mingle --without-heartbeat -Ofair' to the worker command (which seems to be what solved this particular problem for everyone else!).
The logs look like:
celery-beat_1 | [2020-11-26 21:51:00,049: DEBUG/MainProcess] beat: Synchronizing schedule...
celery-beat_1 | [2020-11-26 21:51:00,056: INFO/MainProcess] Scheduler: Sending due task oauth_task (token_generator.tasks.oauth_db_clearout_task)
celery-beat_1 | [2020-11-26 21:51:00,065: DEBUG/MainProcess] token_generator.tasks.oauth_db_clearout_task sent. id->ad97bbc7-0dcf-4a82-a97b-e2ce7dbd817f
celery-beat_1 | [2020-11-26 21:51:00,067: DEBUG/MainProcess] beat: Waking up in 59.92 seconds.
celery_1 | [2020-11-26 21:51:00,070: INFO/MainProcess] Received task: token_generator.tasks.oauth_db_clearout_task[ad97bbc7-0dcf-4a82-a97b-e2ce7dbd817f]
celery_1 | [2020-11-26 21:51:00,076: DEBUG/MainProcess] TaskPool: Apply <function _fast_trace_task at 0x7f32013b3c10> (args:('token_generator.tasks.oauth_db_clearout_task', 'ad97bbc7-0dcf-4a82-a97b-e2ce7dbd817f', {'lang': 'py', 'task': 'token_generator.tasks.oauth_db_clearout_task', 'id': 'ad97bbc7-0dcf-4a82-a97b-e2ce7dbd817f', 'shadow': None, 'eta': None, 'expires': None, 'group': None, 'retries': 0, 'timelimit': [None, None], 'root_id': 'ad97bbc7-0dcf-4a82-a97b-e2ce7dbd817f', 'parent_id': None, 'argsrepr': '()', 'kwargsrepr': '{}', 'origin': 'gen1@328b6b324d84', 'reply_to': '0513ed80-806d-33c4-aa3f-83f942c27d0d', 'correlation_id': 'ad97bbc7-0dcf-4a82-a97b-e2ce7dbd817f', 'hostname': 'celery@6735220ff248', 'delivery_info': {'exchange': '', 'routing_key': 'celery', 'priority': 0, 'redelivered': None}, 'args': [], 'kwargs': {}}, b'[[], {}, {"callbacks": null, "errbacks": null, "chain": null, "chord": null}]', 'application/json', 'utf-8') kwargs:{})
celery_1 | [2020-11-26 21:51:00,077: DEBUG/MainProcess] Task accepted: token_generator.tasks.oauth_db_clearout_task[ad97bbc7-0dcf-4a82-a97b-e2ce7dbd817f] pid:1
celery_1 | [2020-11-26 21:51:00,106: INFO/MainProcess] Task token_generator.tasks.oauth_db_clearout_task[ad97bbc7-0dcf-4a82-a97b-e2ce7dbd817f] succeeded in 0.028364189998683287s: None
The celery.py file in my app:
import os
from celery import Celery
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "token_generator.settings")
app = Celery("token_generator")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()
and the Celery-related settings in settings.py:
from celery.schedules import crontab

CELERY_BROKER_URL = "redis://redis:6379"
CELERY_RESULT_BACKEND = "redis://redis:6379"

CELERY_BEAT_SCHEDULE = {
    "oauth_task": {
        "task": "token_generator.tasks.oauth_db_clearout_task",
        "schedule": crontab(minute="*/1"),
    },
}
The output of celery report:
software -> celery:5.0.2 (singularity) kombu:5.0.2 py:3.8.2
billiard:3.6.3.0 py-amqp:5.0.2
platform -> system:Linux arch:64bit
kernel version:5.4.0-53-generic imp:CPython
loader -> celery.loaders.default.Loader
settings -> transport:amqp results:disabled
deprecated_settings: None
and Django is 3.1.3.
I have discovered the answer myself. Each container was getting its own copy of the SQLite db: the commands were actually being executed, but only against the copy of the db inside the celery container, while the db my IDE was inspecting lived in a different container and was therefore untouched. (Looking back at the compose file, the django service mounts the code at /code while the celery services mount it at /usr/src/app/, so presumably whichever path did not match the image's working directory left that container running against the code, and db, baked in at build time.)
I fixed it by adding an extra postgres service to my docker-compose configuration and adding a .dockerignore file so that the SQLite db is no longer copied into the images.
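For illustration, the additions looked roughly like this; the image tag, credentials, and volume name below are placeholders rather than my exact values:

  # added under services: in docker-compose.yml
  postgres:
    image: postgres:13-alpine
    environment:
      - POSTGRES_DB=token_generator
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    volumes:
      - postgres_data:/var/lib/postgresql/data

volumes:
  postgres_data:

and a .dockerignore in the build context (assuming the default SQLite filename):

db.sqlite3

The django, celery, and celery-beat services then get a depends_on entry for postgres, and DATABASES in settings.py is pointed at the postgres host instead of SQLite, so every container talks to the same database.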