Hello,
I’m trying to integrate Redash with Databricks.
One thing that is very important for me is restricting database access (for example, people in the “accounts” user group in Redash should have access only to the accounts database in Databricks).
Unfortunately, no matter how I configure the Databricks data source in Redash, anyone who has access to Databricks has access to all of its databases.
How can I fix this problem?
The attached images show the Redash data source configuration, the tables the user has access to in Redash, and the tables in Databricks.



Regardless of which database you connect to, Redash doesn’t limit which parts of the connected schema may be queried. The idea is to use your database’s own permission controls instead.

One way to do this in OSS Redash is as follows:

  • Create multiple access tokens for Databricks, and grant each token specific access rights on the cluster. Perhaps one token has access to the accounts database while another can only access customers, or similar.
  • Create one Redash data source for each token. If you have three tokens, there will be three data sources in Redash. They can all connect to the same endpoint, but each uses a different token.
  • Use Redash group membership to control which Redash groups can query which data sources (there’s a short API sketch after this list).
  • Add users to the appropriate groups based on their required permission level.
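
Here’s a minimal sketch of that wiring using Redash’s REST API, assuming an admin API key. Everything below is a placeholder — the instance URL, tokens, group IDs, host, and HTTP path — and the option keys (`host`, `http_path`, `http_password`) are my assumption based on the Databricks connector’s settings form, so they may differ between Redash versions:

```python
import requests

REDASH_URL = "https://redash.example.com"  # placeholder Redash instance
REDASH_API_KEY = "..."                     # an admin user's API key

# One entry per Databricks token. The group IDs are placeholders for the
# Redash groups you already created (Settings -> Groups).
SOURCES = [
    {"name": "Databricks (accounts)",  "token": "dapiXXXXaccounts",  "group_id": 2},
    {"name": "Databricks (customers)", "token": "dapiXXXXcustomers", "group_id": 3},
]

session = requests.Session()
session.headers["Authorization"] = f"Key {REDASH_API_KEY}"

for src in SOURCES:
    # Create one data source per token. All of them point at the same
    # endpoint; only the token (http_password) differs.
    resp = session.post(
        f"{REDASH_URL}/api/data_sources",
        json={
            "name": src["name"],
            "type": "databricks",
            "options": {
                "host": "my-workspace.gcp.databricks.com",     # placeholder
                "http_path": "sql/protocolv1/o/0/placeholder",  # placeholder
                "http_password": src["token"],  # the per-scope token
            },
        },
    )
    resp.raise_for_status()
    ds_id = resp.json()["id"]

    # Attach the new data source to its matching group so only that
    # group's members can query it.
    session.post(
        f"{REDASH_URL}/api/groups/{src['group_id']}/data_sources",
        json={"data_source_id": ds_id},
    ).raise_for_status()
```

One caveat: Redash attaches newly created data sources to the default group, so you may also need to detach each one from that group (via DELETE /api/groups/<group_id>/data_sources/<data_source_id>) before access is actually limited to the intended group.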

FWIW, this is a strong use case for moving from OSS Redash to Databricks SQL, which includes a heavily customised version of Redash that is tied directly into Databricks’ authentication APIs (and a lot of other improvements that aren’t possible in the OSS version).


Thank you for the response.
Maybe it’s because I’m new to Databricks, but I don’t see any way to create a token with specific database permissions. When I generate a token, it automatically has permission to all of the databases.
How can I create a token with specific access rights?
I’m using the standard version of Databricks on GCP (from what I understand, Databricks SQL is not supported on Google Cloud).
Thank you.

DBSQL is coming to GCP, thankfully :smile:

I’d speak with someone at Databricks about how to obtain such a token, or search their documentation.
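
One thing worth knowing in the meantime: as far as I understand, a Databricks personal access token never carries its own permissions — it simply inherits whatever the identity that created it is allowed to do. So the usual pattern is to create one workspace user (or service principal) per scope, restrict that identity with table access control (e.g. GRANT USAGE, SELECT ON DATABASE accounts TO that user), and then generate a token while authenticated as that identity. A rough sketch using the Token API, where the workspace host and the bootstrap token are placeholders:

```python
import requests

WORKSPACE_URL = "https://my-workspace.gcp.databricks.com"  # placeholder host

# Authenticate AS the restricted identity (e.g. a per-database service
# account whose access was limited with GRANT statements). The new token
# has no permissions of its own -- it inherits this identity's permissions.
bootstrap_token = "..."  # a token that identity generated in the UI

resp = requests.post(
    f"{WORKSPACE_URL}/api/2.0/token/create",
    headers={"Authorization": f"Bearer {bootstrap_token}"},
    json={"comment": "redash-accounts", "lifetime_seconds": 90 * 24 * 3600},
)
resp.raise_for_status()
# token_value is what you paste into the Redash data source configuration.
print(resp.json()["token_value"])
```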


Hey Shahar, did you solve the issue you mentioned above? If yes, could you please share some insight into how you solved it?

Thanks in advance.