I have a Celery system running with 4 queues, using RabbitMQ as the broker.
Currently this system uses the gevent worker pool, and it works fine when starting the worker from the command line like this:
celery -A app worker -Q celery -P gevent -c 100
Now I want to start this from Python instead of from the command line, because I want to make the queues, worker pool, and concurrency configurable from Python.
When starting the celery workers using:
worker = app.Worker(
    name=settings.APP + "_" + pool + "_" + str(concurrency) + "_@%h",
    queues=queues,
    pool=pool,
    concurrency=concurrency,
    loglevel="INFO",
    task_events=True,
)
worker.start()
I get a normally running worker when using pool=prefork, but with pool=gevent the worker freezes after starting up. It looks like it receives some tasks, but never starts executing them. I suspect this has to do with Celery's gevent monkey patching, which patches threads among other things, but I don't know for sure, and I also don't know how to control it.
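For reference, when the CLI is given -P gevent it applies gevent's monkey patching before anything else is imported, which an embedded worker.start() does not get for free. A minimal sketch of doing the same at the top of an embedded entrypoint (the app import shown in the comments is a placeholder):

```python
# The celery CLI applies gevent monkey patching before importing the app;
# an embedded entrypoint has to do the same, at the very top of the module,
# before celery or the application code is imported.
from gevent import monkey

monkey.patch_all()  # patches threading, socket, time, ssl, ...

# Only after patching is it safe to import Celery and start the worker, e.g.:
#   from app import app
#   worker = app.Worker(pool="gevent", concurrency=100, loglevel="INFO")
#   worker.start()
```

The ordering is the important part: if celery (or anything that imports threading/socket) is imported before patch_all(), the gevent pool can end up blocked on real OS threads.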
I can get gevent workers to run from Python using subprocess.Popen(cli_args) instead of worker.start(), but since this will run in Kubernetes in production, I want the worker to be the main process for health checks and the like.
Does anybody know how to start a Celery worker process, with the gevent worker pool, directly from Python?
EDIT: For now I am using subprocess.call(cli_args) to get a synchronous call to the worker process, but I would still like to know why worker.start() does not work.
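As a sketch, the workaround just rebuilds the same argv as the command line at the top and runs it synchronously, so the worker settings stay configurable from Python (the -A app name and the default queue/pool/concurrency values are illustrative):

```python
import subprocess


def build_worker_args(app="app", queues="celery", pool="gevent", concurrency=100):
    # Assemble the same argv as the CLI invocation:
    #   celery -A app worker -Q celery -P gevent -c 100
    return [
        "celery", "-A", app, "worker",
        "-Q", queues,
        "-P", pool,
        "-c", str(concurrency),
    ]


def run_worker_sync(**kwargs):
    # subprocess.call blocks until the worker exits, so the Python
    # process stays alive for as long as the worker runs.
    return subprocess.call(build_worker_args(**kwargs))
```

The downside remains that the worker is a child process rather than the main one, which is what makes Kubernetes health checks awkward.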