Issue Summary

After setting up a local instance of Redash from the Docker images, I’m having trouble pulling CSV files over HTTPS from our internal (development) servers. The internal development servers use certificates signed by an internally generated CA, i.e. the certificates are valid internally but don’t chain up to the root CAs on the public internet.

After defining a new data source pointing to an internal server (which the “Test Connection” button says works), trying to create a query that uses it fails:

Error running query: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)

There doesn’t seem to be a way to tell whichever component of Redash does the querying to ignore the SSL verification error (an option in the URL query definition window would be useful).

I guess what needs to be done instead is installing the root certificate chain into the Docker image(s) somehow, so that verification succeeds.

The question is… how?
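A sketch of what I imagine might work (assuming the image is Debian-based so update-ca-certificates is available; the image tag and the internal-ca.crt filename here are placeholders, not tested):

```dockerfile
# Sketch only: derive from the official image and install the internal CA
# into the system trust store. Assumes a Debian/Ubuntu base image.
FROM redash/redash:5.0.0.b4754

# internal-ca.crt is a placeholder name for the internally generated
# root CA certificate, PEM-encoded.
COPY internal-ca.crt /usr/local/share/ca-certificates/internal-ca.crt
RUN update-ca-certificates
```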

Technical details:

  • Redash Version: 5.0.0+b4754
  • Browser/OS: Firefox 60.4.0esr (64-bit) / CentOS 7 x64
  • How did you install Redash: From Docker image

This is the URL query runner: https://github.com/getredash/redash/blob/master/redash/query_runner/url.py. And actually it doesn’t currently support CSV files. CSV file support will be implemented here: https://github.com/getredash/redash/pull/2478.

True. Currently the URL data source just takes the URL; it’s about time we upgraded it a bit … :slight_smile:

:man_shrugging: to be honest, not familiar enough with Docker to answer this. But there is nothing Redash specific about this, so I’m pretty sure there should be some guide/blog post about this somewhere :slight_smile:

Ahhh, no worries. I thought I was doing something wrong with the CSV weirdness too, as even after temporarily setting up a non-HTTPS server just to see what happened… the CSV still didn’t seem to work.

Not sure why I thought CSV was supported. Maybe I saw that issue when half asleep and info overloaded, not realising it’s still WIP. :wink:

I’ll probably switch to trying with JSON then. Still getting the hang of how everything fits together, so there’ll probably be some further newbie mistakes and learning experiences on the way though. :slight_smile:

The goal I’m looking to accomplish at the moment is adding support for DBHub.io to Redash, i.e. so that when people upload their data to our servers via the GUI, anyone with a Redash instance can pull the data across and process/visualise/etc. it as needed.

Starting out with JSON seems like a reasonable idea. Optimisation can come later. :wink:

Hmmm. While putting together a local development setup for Redash on my workstation, I stumbled over the currently public redash docs showing “Supported Data Sources”:

https://redash.io/help/data-sources/supported-data-sources

That shows CSV (from a URL) as supported, both for Hosted Redash and Self Hosted Redash.

Does that mean the docs are incorrect in this instance, or am I not quite understanding something? :slight_smile:

The docs are incorrect :pensive: Fixed though:

1 Like

Hi, I’ve just gone through this and was able to install self-signed certificates on Docker containers. Please PM me if you need help with this (but please allow 10 days after this Friday, since I’ll be traveling with my kids).

Bye

Arnaldo

That’s pretty awesome. Any interest in writing up the steps somewhere (wiki)? :smile:

1 Like

I solved it for my self-hosted environment with the following steps:

  1. On the host Linux system, add your CA certificate to /etc/ssl/certs/ca-certificates.crt. It should look like this:

     -----BEGIN CERTIFICATE-----
     MIIEGDCCAwCgAwIBAgIBATANBgkqh...
     -----END CERTIFICATE-----

  2. Tell Python where to look for certificates. Add the following line to /opt/redash/env:

     REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt

  3. Mount the host certificate store into the containers. Add the following lines to your docker-compose.yml for the scheduled_worker and adhoc_worker services:

     volumes:
       - /etc/ssl/certs/:/etc/ssl/certs/

  4. Restart your containers:

     docker-compose down
     docker-compose up -d
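As an aside on step 2: the env var works because requests consults the environment when no explicit CA bundle is given. A rough, simplified sketch of that lookup order (this illustrates the behaviour, it is not requests’ actual code):

```python
import os

def resolve_verify(verify_arg=None, env=None):
    """Illustration (NOT requests' real implementation) of how a requests
    call picks its CA bundle: an explicit verify= argument wins, then the
    REQUESTS_CA_BUNDLE environment variable, then CURL_CA_BUNDLE, and
    finally the certifi bundle shipped with requests."""
    env = os.environ if env is None else env
    if verify_arg:
        return verify_arg
    return env.get("REQUESTS_CA_BUNDLE") or env.get("CURL_CA_BUNDLE") or "certifi default"
```

So once REQUESTS_CA_BUNDLE is set in the worker’s environment, every requests call made by the query runner verifies against that file, with no code changes needed.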

The cool thing is: now you can just add certificates on your host system and the container instances are updated automatically.

3 Likes

Awesome, that should help heaps. :smile:

How would you get Python to look for multiple certificates? Till now, I have been appending all my self-signed certs into one file. Surely, there must be a better way?

In theory, anything appended to the bundle file in that directory (/etc/ssl/certs/ca-certificates.crt) should be picked up, since that’s the file REQUESTS_CA_BUNDLE points at.
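For what it’s worth, appending certs into one file is essentially what ca-certificates.crt itself is: a concatenation of trusted PEM certificates (on Debian-based systems, update-ca-certificates rebuilds it from the certs dropped into /usr/local/share/ca-certificates/). A minimal sketch of doing the same by hand (hypothetical helper, not part of Redash or any tool):

```python
from pathlib import Path

def build_bundle(cert_dir, bundle_path):
    """Concatenate every *.pem / *.crt file in cert_dir into one bundle
    file. Returns the number of certificates appended."""
    pem_files = sorted(Path(cert_dir).glob("*.pem")) + sorted(Path(cert_dir).glob("*.crt"))
    count = 0
    with open(bundle_path, "w") as bundle:
        for pem in pem_files:
            if Path(pem) == Path(bundle_path):
                continue  # don't include the bundle in itself
            text = pem.read_text()
            if "BEGIN CERTIFICATE" in text:  # skip stray non-PEM files
                bundle.write(text.rstrip() + "\n")
                count += 1
    return count
```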

Did you need to make any changes to your nginx config file to make this work?