Question:
I’m looking for a straight-forward way to run Celery on an Elastic Beanstalk environment. Does this exist, or do I need to use SQS instead?
I have tried putting a line in the .config file without good results. This is my .config file:
container_commands:
  01_syncdb:
    command: "django-admin.py syncdb --noinput"
    leader_only: true
  02_collectstatic:
    command: "./manage.py collectstatic --noinput"
  03_migrate:
    command: "./manage.py migrate --noinput"
  04_start_celery:
    command: "./manage.py celery worker &"
When I ssh to the EC2 instance and run ps -ef | grep celery, it shows that Celery isn't running.
Any help appreciated. Thanks!
Answer:
Celery doesn't show up because container commands run before the web server is restarted during deployment. In other words, any worker you start in a container command is wiped out when the instance restarts, which is why ps finds nothing afterwards.
I would suggest starting Celery from a post-deployment hook instead, so it launches after the restart.
See http://junkheap.net/blog/2013/05/20/elastic-beanstalk-post-deployment-scripts/ and the Stack Overflow question "How do you run a worker with AWS Elastic Beanstalk?"
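As a rough sketch of that approach, an .ebextensions config can drop a script into the post-deploy hooks directory. The file name celery.config, the script name 99_start_celery.sh, and the /opt/python/... paths (typical of the older Amazon Linux Python platform) are assumptions, not something from the original post; adjust them for your environment:

```yaml
# .ebextensions/celery.config -- a sketch; script name and paths are assumed
files:
  "/opt/elasticbeanstalk/hooks/appdeploy/post/99_start_celery.sh":
    mode: "000755"
    owner: root
    group: root
    content: |
      #!/usr/bin/env bash
      # Runs after the app server restarts, so the worker survives deployment.
      cd /opt/python/current/app
      source /opt/python/run/venv/bin/activate
      python manage.py celery worker --detach
```

Because the hook fires after the restart, the worker started here isn't killed the way one started from container_commands is.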