BigQuery key file decoding fails


#1

Hello,

I’m trying to set up Redash to run on Google Kubernetes Engine, and I want it to automatically create a BigQuery data source when the container starts for the first time.

I’m trying to do this by sending a CLI command to the container, but I’m running into a problem with the key file.

When running:

```
python manage.py ds new bq-connection --type bigquery --options projectID=redacted-project-id; jsonKeyFile=/app/secrets/credentials.json; loadSchema=True; useStandardSql=True; location=EU; totalMBytesProcessedLimit=200;
```

this will throw:

```
ValueError: No JSON object could be decoded
```

The key file is a valid credentials file created in GCP and mounted into the container through a Kubernetes volume. It works fine elsewhere.

I did some digging in the source code and noticed that Redash might expect the key file to be Base64-encoded (if I understood it correctly). I tried encoding credentials.json to Base64, but after that Redash gives me an incorrect-padding error when testing the connection.

I’m starting to run out of ideas about what might be wrong here.
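For reference, an “incorrect padding” error usually means the Base64 string was truncated or altered somewhere along the way: a valid Base64 string always has a length that is a multiple of four, and losing even one character (for example to stray newlines when the value is pasted or templated) breaks decoding. A minimal sketch of that failure mode (the JSON payload below is just a placeholder, not a real key):

```python
import base64
import binascii

payload = b'{"type": "service_account"}'
key = base64.b64encode(payload).decode("ascii")

# The intact string round-trips cleanly.
assert base64.b64decode(key) == payload

# Dropping a single character leaves a length that is not a multiple
# of four, which is what triggers the padding error.
try:
    base64.b64decode(key[:-1])
except binascii.Error as e:
    print(e)
```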


#2

jsonKeyFile doesn’t take a path to the file, but its contents encoded in Base64.
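To make that concrete, here is a small sketch of producing the value (the helper name and the mount path are just placeholders taken from the post above, not part of Redash):

```python
import base64

def encode_key_file(path: str) -> str:
    """Return the Base64-encoded contents of a service-account key file,
    suitable for passing as the jsonKeyFile option value."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("ascii")

# Example (assumes the mount path from the original post):
# encoded = encode_key_file("/app/secrets/credentials.json")
# ...then pass `encoded` as the jsonKeyFile value instead of the path.
```

On the shell side, GNU coreutils’ `base64 -w 0 /app/secrets/credentials.json` produces the same value without the line wrapping that the tool otherwise inserts by default.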


#3

So that’s what it was. Thank you, got it working.