I am using Celery 3.1.16 with RabbitMQ as the broker and multiple Celery workers, each daemonized through supervisor. The problem is with updating tasks: when I change my tasks.py file, the Celery workers keep running the old code.
Celery launch command:
/home/my_project/bin/celery -B --autoreload --app=my_app.celery:app worker --loglevel=INFO
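For reference, the supervisor entry is along these lines (a minimal sketch; the program name, directory, and the extra options are illustrative placeholders, only the command line above is real):

[program:celery]
command=/home/my_project/bin/celery -B --autoreload --app=my_app.celery:app worker --loglevel=INFO
directory=/home/my_project
autostart=true
autorestart=true
stopwaitsecs=60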
I include the tasks module in my Django settings.py:
CELERY_IMPORTS = [
'my_app.tasks'
]
pyinotify is installed and appears to work, since the worker does detect the change. An excerpt from the Celery logs:
[2014-12-16 20:56:00,016: INFO/MainProcess] Task my_app.tasks.periodic_update_task_statistic[175c2557-7c07-43c3-ac70-f4e115344134] succeeded in 0.00816309102811s: 'ok!'
[2014-12-16 20:56:11,157: INFO/MainProcess] Detected modified modules: ['my_app.tasks']
[2014-12-16 20:57:00,001: INFO/Beat] Scheduler: Sending due task my_app.tasks.periodic_update_task_statistic (my_app.tasks.periodic_update_task_statistic)
[2014-12-16 20:57:00,007: INFO/MainProcess] Received task: my_app.tasks.periodic_update_task_statistic[f22998a9-dcb4-4c29-8086-86dd6e57eae1]
So, my question: how do I get Celery to pick up and run the new task code after tasks.py is modified?
I have this same problem. While I don't like it, I do the following: first remove any compiled .pyc files anywhere under my current directory, then restart all the workers.
find . -name "*.pyc" -exec rm {} \;
supervisorctl restart all
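Wrapped as a small deploy helper it might look like this (a sketch; the script itself and the assumption that supervisor manages every Celery process are mine, not part of the original setup):

#!/bin/sh
# Remove stale compiled bytecode so the workers cannot import old .pyc files,
# then restart every supervisor-managed process, including the Celery workers.
find . -name "*.pyc" -exec rm -f {} \;
supervisorctl restart all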
It seems strange that the --autoreload flag does nothing, but it doesn't in my case.