Running Celery with Django on AWS Elastic Beanstalk using environment variables

2024-04-19

I want to run Celery alongside my Django application on AWS Elastic Beanstalk. I followed this great answer by @yellowcap (How do you run a worker with AWS Elastic Beanstalk? https://stackoverflow.com/questions/14761468/how-do-you-run-a-worker-with-aws-elastic-beanstalk), so my supervisord.conf looks like this:

files:
  "/opt/elasticbeanstalk/hooks/appdeploy/post/run_supervised_celeryd.sh":
    mode: "000755"
    owner: root
    group: root
    content: |
      #!/usr/bin/env bash

      # Get django environment variables
      celeryenv=`cat /opt/python/current/env | tr '\n' ',' | sed 's/export //g' | sed 's/$PATH/%(ENV_PATH)s/g' | sed 's/$PYTHONPATH//g' | sed 's/$LD_LIBRARY_PATH//g'`
      celeryenv=${celeryenv%?}

      # Create celery configuration script
      celeryconf="[program:celeryd]
      ; Set full path to celery program if using virtualenv
      command=/opt/python/run/venv/bin/celery worker -A myappname --loglevel=INFO

      directory=/opt/python/current/app
      user=nobody
      numprocs=1
      stdout_logfile=/var/log/celery-worker.log
      stderr_logfile=/var/log/celery-worker.log
      autostart=true
      autorestart=true
      startsecs=10

      ; Need to wait for currently executing tasks to finish at shutdown.
      ; Increase this if you have very long running tasks.
      stopwaitsecs = 600

      ; When resorting to send SIGKILL to the program to terminate it
      ; send SIGKILL to its whole process group instead,
      ; taking care of its children as well.
      killasgroup=true

      ; if rabbitmq is supervised, set its priority higher
      ; so it starts first
      priority=998

      environment=$celeryenv"

      # Create the celery supervisord conf script
      echo "$celeryconf" | tee /opt/python/etc/celery.conf

      # Add configuration script to supervisord conf (if not there already)
      if ! grep -Fxq "[include]" /opt/python/etc/supervisord.conf
          then
          echo "[include]" | tee -a /opt/python/etc/supervisord.conf
          echo "files: celery.conf" | tee -a /opt/python/etc/supervisord.conf
      fi

      # Reread the supervisord config
      supervisorctl -c /opt/python/etc/supervisord.conf reread

      # Update supervisord in cache without restarting all services
      supervisorctl -c /opt/python/etc/supervisord.conf update

      # Start/Restart celeryd through supervisord
      supervisorctl -c /opt/python/etc/supervisord.conf restart celeryd
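For clarity, this is roughly what the env-parsing line at the top of the script produces. The file contents below are made up for illustration; only the pipeline itself comes from the script above:

#!/usr/bin/env bash
# Made-up stand-in for /opt/python/current/env (variable names and values are examples only)
cat > /tmp/example_env <<'EOF'
export DJANGO_SETTINGS_MODULE="myappname.settings"
export RDS_HOSTNAME="example.rds.amazonaws.com"
export PATH="/opt/python/run/venv/bin:$PATH"
EOF

# Same pipeline as in run_supervised_celeryd.sh, applied to the stand-in file
celeryenv=`cat /tmp/example_env | tr '\n' ',' | sed 's/export //g' | sed 's/$PATH/%(ENV_PATH)s/g' | sed 's/$PYTHONPATH//g' | sed 's/$LD_LIBRARY_PATH//g'`
celeryenv=${celeryenv%?}   # strip the trailing comma left by tr

echo "$celeryenv"
# DJANGO_SETTINGS_MODULE="myappname.settings",RDS_HOSTNAME="example.rds.amazonaws.com",PATH="/opt/python/run/venv/bin:%(ENV_PATH)s"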

His code worked fine until I decided to migrate some of the variables from settings.py into my Elastic Beanstalk environment properties.
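For reference, this is roughly how I set the properties; the key name and value below are placeholders. On the instance each property ends up as an extra export line in /opt/python/current/env, which the script above then picks up:

# Hypothetical example of setting an Elastic Beanstalk environment property
# with the EB CLI; the key name and value are placeholders.
eb setenv DJANGO_SECRET_KEY='replace-with-the-real-secret'

# The same properties can also be set through the Elastic Beanstalk web console.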

In fact, the following error shows up when the script is called:

for 'environment' is badly formatted'>: file: /usr/lib64/python2.7/xmlrpclib.py line: 800
celeryd: ERROR (no such process)

Thanks for your help.


This is caused by the way Supervisor parses its configuration files [1].

Your environment settings contain an unescaped % character, most likely coming from your Django SECRET_KEY.

The following worked for me: try appending | sed 's/%/%%/g' to the end of this pipe chain:

celeryenv=`cat /opt/python/current/env | tr '\n' ',' | sed 's/export //g' | sed 's/$PATH/%(ENV_PATH)s/g' | sed 's/$PYTHONPATH//g' | sed 's/$LD_LIBRARY_PATH//g'`

Resulting line:

celeryenv=`cat /opt/python/current/env | tr '\n' ',' | sed 's/export //g' | sed 's/$PATH/%(ENV_PATH)s/g' | sed 's/$PYTHONPATH//g' | sed 's/$LD_LIBRARY_PATH//g' | sed 's/%/%%/g'`
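To see what the extra sed step does, here is a quick check with a made-up value containing a % (for example from a SECRET_KEY). Supervisor performs %-style expansion on configuration values, so a lone % makes it reject the environment entry as badly formatted, while a doubled %% is read back as a literal %:

# Made-up environment string containing a bare % character
echo 'SECRET_KEY="ab%cd3ef",OTHER_VAR="plain"' | sed 's/%/%%/g'
# Prints: SECRET_KEY="ab%%cd3ef",OTHER_VAR="plain"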

[1] https://github.com/Supervisor/supervisor/issues/291
