I’m having a bit of difficulty figuring out the best way to have a Docker setup that lets me integrate Redash into my Django app, which already uses a Postgres database. I started making my app and added some data. I started off using what seems to be the standard Docker way to make a Django app as of today: here

I’ve noticed that this guide uses version 3 of the docker-compose.yml file format. However, I’m trying to make it work similarly to the guide on setting up a Redash instance with Docker, as in here

Any tips? Ideally I’d like Redash to connect to the same Postgres db I already have set up. I also have the code for my Django app as a bind mount, as in the Django+Docker tutorial above. Is this even possible?

What exactly are you having difficulty with? You can either expose your Postgres port to the host and then create your data source using the host address, or you may want to use an attachable overlay network to interconnect containers in different Docker Compose projects.
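For the overlay option, it would look roughly like this (just a sketch; the network name is a placeholder, and attachable overlay networks need swarm mode enabled, so on a single host you could instead create a plain user-defined bridge network with docker network create shared_net and reference it the same way):

docker network create --driver overlay --attachable shared_net

# then reference it from each project's docker-compose.yml
# ("shared_net" is a placeholder name):
networks:
  default:
    external:
      name: shared_net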

I’m having issues figuring out how to write the docker-compose.yml file such that it works with Redash. I’ve attached a version of my docker-compose.yml so far. It seems my Django app is loading.
version: '3'

services:
  db:
    image: postgres
    ports:
     - "5432"
    environment:
      PYTHONUNBUFFERED: 0
      REDASH_LOG_LEVEL: "INFO"
      REDASH_REDIS_URL: "redis://redis:6379/0"
      REDASH_DATABASE_URL: "postgresql://postgres@postgres/postgres"

  redis:
    image: redis:3.0-alpine
    restart: unless-stopped

  web:
    build: .
    command: python3 manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
      - redis

    environment:
      PYTHONUNBUFFERED: 0
      REDASH_LOG_LEVEL: "INFO"
      REDASH_REDIS_URL: "redis://redis:6379/0"
      REDASH_DATABASE_URL: "postgresql://postgres@postgres/postgres"

However, the Redash tutorial for launching Redash on Docker also contains these lines, but it’s written in version 2 of the docker-compose.yml format

  server:
    build: .
    command: dev_server
    depends_on:
      - db
      - redis
    ports:
      - "5000:5000"
    volumes:
      - ../redash_repo/redash:/app
    environment:
      PYTHONUNBUFFERED: 0
      REDASH_LOG_LEVEL: "INFO"
      REDASH_REDIS_URL: "redis://redis:6379/0"
      REDASH_DATABASE_URL: "postgresql://postgres@postgres/postgres"

  worker:
    build: .
    command: scheduler
    volumes:
      - ../redash_repo/redash:/app
    depends_on:
      - server
    environment:
      PYTHONUNBUFFERED: 0
      REDASH_LOG_LEVEL: "INFO"
      REDASH_REDIS_URL: "redis://redis:6379/0"
      REDASH_DATABASE_URL: "postgresql://postgres@postgres/postgres"
      QUEUES: "queries,scheduled_queries,celery"
      WORKERS_COUNT: 2

I haven’t been successful in figuring out how to essentially have the same functionality as in the Redash tutorial but with Redash reading from the Postgres db I already have set up. Additionally, I have the repo with all the Redash code in a directory above the working directory of my Django app, which is why I do something like "../redash_repo/redash:/app".

Could you explain exposing the Postgres port a bit more? That seems like the simpler and more feasible solution.

Thank you so much for a speedy response!

As of now, my terminal shows the following error when I combine the two docker-compose file fragments above. I haven’t been able to make sense of it, though.

ERROR: for rerates_django_app_worker_1  Cannot start service worker: OCI runtime create failed: container_linux.go:348: starting container process caused "exec: \"scheduler\": executable file not found in $PATH": unknown

ERROR: for worker  Cannot start service worker: OCI runtime create failed: container_linux.go:348: starting container process caused "exec: \"scheduler\": executable file not found in $PATH": unknown
ERROR: Encountered errors while bringing up the project.

Don’t mix your app compose file with the Redash one. Just use the docker-compose.production.yml that they provide to fire up a Redash instance, then use the separate docker-compose.yml for your app. You already have your Postgres port exposed in your app’s compose file, so you should be able to connect to it from Redash using your-hostname:5432. You might need to set up additional roles for Postgres, or at least set a password for the postgres role, as the default PostgreSQL Docker image won’t allow you to connect from the outside with no password.
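For example, since your database already has data in it, POSTGRES_PASSWORD won’t be applied retroactively (the official image only uses it when initializing a fresh data directory), so you can set a password on the running container with something like this (container name and password are placeholders):

# container name and password below are just examples
docker exec -it your_db_container psql -U postgres -c "ALTER USER postgres WITH PASSWORD 'some-password';"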

Gotcha. I think I have a basic idea of how to do that. Seems like there are two parts here: 1) configuring my Postgres service to be accessible from the IP pertaining to the Redash service, and 2) telling Redash to connect to the database at the port Postgres is being hosted on.

Here’s my current working understanding of how this is done.

  1. Configuring the Postgres service to be accessible to Redash means enabling remote connections to the Postgres service as well as exposing a certain port on it.

So I could do something like this for configuring my database service that my Django app uses.

version: '2'

services:
  db:
    image: postgres
    ports:
     - "5433:5432"
    command: "postgres -c config_file=etc/public_ip_ps.conf"
    restart: unless-stopped
    environment:
      POSTGRES_PASSWORD: helloworld

My understanding is that the ports mapping above, "5433:5432", is saying that whenever the local machine connects to port 5433, it is actually communicating with port 5432 inside the Docker container running the Postgres database. Additionally, POSTGRES_PASSWORD: helloworld tells the service to create the default user "postgres" with the password "helloworld" so that a specific user can log in to this Postgres service from the outside.
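If that’s right, I should be able to sanity-check the mapping from my host with something like this (assuming psql is installed on the host):

psql -h localhost -p 5433 -U postgres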

Additionally, setting

command: "postgres -c config_file=etc/public_ip_ps.conf"

is needed to point Postgres at a custom config file. Right now this file has just one line in it: listen_addresses = '*'
This configures the Postgres service to accept connections from any IP address. The file is in a directory that gets bind-mounted into the container.
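The bind mount for that would look roughly like this in the db service (the host path is just where I happen to keep the file; the config_file value in the command then needs to point at the same path inside the container, e.g. /etc/public_ip_ps.conf):

    volumes:
      # host path is an example; adjust to wherever the file actually lives
      - ./pg_config/public_ip_ps.conf:/etc/public_ip_ps.conf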

  2. I’m a bit lost on this part. I’m not sure how to tell Redash to connect to the Postgres service here exactly. The Redash docker-compose.yml file also seems to create its own database service, so I’m not sure how to tell Redash to connect to the Postgres service associated with my Django app.

For instance, an older example of this seems to recommend doing it with the links: definition. Another example from Stack Overflow also recommends this. However, it seems that the links configuration is deprecated, at least according to the official Docker reference page. There, Docker recommends using user-defined networks instead.
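If I went the user-defined network route, I imagine it would look something like this (just a sketch, untested, and the network name redash_shared is made up): create the network once on the host, then reference it as external in both projects’ compose files.

docker network create redash_shared

# then, in each compose file, attach the relevant services to it
# (db here; it would be server/worker on the Redash side):
services:
  db:
    networks:
      - redash_shared

networks:
  redash_shared:
    external: true

Then Redash could presumably reach the database by service name over that shared network instead of by IP.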

Hi,

I believe you don’t need to provide an additional config to Postgres, as the Docker image is already set up to listen on all interfaces.

As for connecting, just use your host IP address when adding the data source in Redash. You don’t need to add anything to the Redash compose file.
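On a default Linux setup that’s typically the docker0 bridge address (often 172.17.0.1); you can check it with:

ip addr show docker0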

Ah wow, yes, that all makes a lot more sense! I’m able to load up the Redash container and play with the web interface at localhost:5000. However, I’m having issues figuring out the right configuration for adding the Postgres data source to Redash. I figured it would be as easy as getting the IP address of the container with the Postgres database my Django app is using. The container holding my database is rerates_django_app_db_1. This container is associated with both the bridge network and the rerates_django_app_default network. Each network shows a different IP address for the container when I run docker network inspect on it, which lists the containers on that network and their IP addresses. Neither of these IP addresses seems to work.

Never mind! Finally got it! Thank you :slight_smile:

I just had to do a docker inspect rerates_django_app_db_1 and use the "Gateway" IP address I saw. That worked for setting up my data source from the Redash UI on localhost:5000. Thank you so much for pointing me in the right direction! I would have been lost a lot longer without the nudge!
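For anyone else who lands here, the gateway can also be pulled straight out of the inspect output with a format string, something like:

docker inspect -f '{{ range .NetworkSettings.Networks }}{{ .Gateway }}{{ end }}' rerates_django_app_db_1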
