Configure Docker Compose startup order for Django, REST Framework and Celery/RabbitMQ/Redis application

Hello, World!

For one of my projects, where I use Django, REST Framework, and Celery with RabbitMQ and Redis, I have a Docker Compose configuration with 6 containers:

1. Postgres
2. Redis
3. RabbitMQ
4. Web (Python/Django)
5. Load Balancer (HAProxy)
6. Worker (Celery)

UPDATE: As an example, you can refer to the following GitHub project. It is a base skeleton for Django and Celery projects with PostgreSQL as the backend, RabbitMQ as the message broker, and Redis as the result storage: https://github.com/KenanBek/django-celery-skeleton

The initial configuration for all of these looks like the following:
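
A minimal sketch of such a docker-compose.yml; image tags, service names, and the Celery app name are placeholders rather than the project's actual values:

version: "3"

services:
  postgres:
    image: postgres:11
    environment:
      - POSTGRES_USER=app
      - POSTGRES_PASSWORD=app
  redis:
    image: redis:5
  rabbitmq:
    image: rabbitmq:3-management
  web:
    build: .
    depends_on:
      - postgres
      - redis
      - rabbitmq
  worker:
    build: .
    command: celery -A app worker -l info
    depends_on:
      - postgres
      - redis
      - rabbitmq
  lb:
    image: haproxy:1.9        # HAProxy config mount omitted for brevity
    ports:
      - "80:80"
    depends_on:
      - web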

I had an issue when I ran

sudo docker-compose up

With this configuration the Web container usually starts before the Postgres container is ready and fails with a connection error because the database is not yet accepting connections. You need to bring the containers down and run them again to get it fixed, and sometimes it takes a few attempts to finally get all containers up. It was not a DevOps-friendly solution.

As per Docker’s documentation:

However, Compose does not wait until a container is “ready” (whatever that means for your particular application) — only until it’s running. There’s a good reason for this.

The problem of waiting for a database (for example) to be ready is really just a subset of a much larger problem of distributed systems. In production, your database could become unavailable or move hosts at any time. Your application needs to be resilient to these types of failures.

After a little research I came up with the following solution:
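
A sketch of how the web service could be wired up, again with placeholder names; the two scripts are assumed to live in the image's working directory, and "postgres" is the Compose service name used as the database host:

  web:
    build: .
    depends_on:
      - postgres
      - redis
      - rabbitmq
    # Wait for the database first, then hand over to the normal entrypoint.
    entrypoint: ["./wait-for-postgres.sh", "postgres", "./entrypoint.sh"]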

In this case I use a custom bash script (wait-for-postgres.sh) which waits until Postgres is ready and then proceeds to entrypoint.sh, which runs the Django-related commands.

Code for entrypoint.sh:
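
A minimal sketch of what such an entrypoint could contain; the gunicorn command and the app.wsgi module path are assumptions, not the project's actual setup:

#!/bin/bash
set -e

# Run the Django-related commands: apply migrations and collect static files.
python manage.py migrate --noinput
python manage.py collectstatic --noinput

# Start the application server.
exec gunicorn app.wsgi:application --bind 0.0.0.0:8000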

Code for wait-for-postgres.sh:
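
A sketch of the script, adapted from Docker's documentation and using the POSTGRES_USER and POSTGRES_PASSWORD environment variables discussed below:

#!/bin/bash
set -e

host="$1"
shift

# Keep trying until Postgres accepts a connection.
# PGPASSWORD is set so psql does not ask for the password interactively.
until PGPASSWORD=$POSTGRES_PASSWORD psql -h "$host" -U "$POSTGRES_USER" -c '\q'; do
  >&2 echo "Postgres is unavailable - sleeping"
  sleep 1
done

>&2 echo "Postgres is up - executing command"
# Hand over to whatever command was passed after the host (entrypoint.sh in this setup).
exec "$@"

Note that this only works if the psql client is installed in the Web image.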

I have added comments so you can follow what is going on. But a little explanation about this part:

...
until PGPASSWORD=$POSTGRES_PASSWORD psql -h "$host" -U "$POSTGRES_USER" -c '\q'; do
  >&2 echo "Postgres is unavailable - sleeping"
  sleep 1
done
...

It is taken from Docker's official documentation, but the original misses the PGPASSWORD=$POSTGRES_PASSWORD setting, which psql needs to connect; otherwise you have to enter the password manually in interactive mode (which is also not DevOps friendly). And, as you probably understand, to have the POSTGRES_PASSWORD environment variable set in the Web container we need to put it in the Docker Compose file:
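
A sketch of that part of docker-compose.yml, re-using the same placeholder values for both services so that psql in the Web container can authenticate against Postgres:

services:
  postgres:
    image: postgres:11
    environment:
      - POSTGRES_USER=app       # placeholder values
      - POSTGRES_PASSWORD=app
  web:
    build: .
    environment:
      - POSTGRES_USER=app       # must match the postgres service
      - POSTGRES_PASSWORD=app   # needed by psql in wait-for-postgres.sh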

UPDATE: I submitted a pull request to Docker's documentation for the issue described here, and it was accepted by Docker's team.

Related links:

Docker Compose Startup Order (official documentation with my addition)
Start containers automatically (official documentation)
Docker Compose wait for container X before starting Y (Stackoverflow)
Re-using environmental variables in docker-compose yml (Stackoverflow)

