Importing queries & visualisations

My company would like to export queries and visualisations from Redash instance A and import them into Redash instance B. Basically, we are shifting all queries and visualisations from one account to another.

I have managed to export all queries into .sql files on my local desktop, but how do I import them back into another account? Is it possible to bring the visualisations and dashboards over too?
The script provided on GitHub didn’t work.

Extra question: I used the provided Python export script and it exports all of my 100-plus queries. How do I limit the export to only the queries I want, say by favourites, query ID, or title?
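For reference, one way to do this is to filter the list returned by get_queries client-side before passing it to save_queries. A minimal sketch — the filter criteria and the sample data below are illustrative assumptions, not part of the original script:

```python
def filter_queries(queries, ids=None, name_contains=None):
    # Keep a query if its id is in `ids`, or if its name contains
    # `name_contains` (case-insensitive). Pass only the criteria you need.
    selected = []
    for q in queries:
        if ids and q['id'] in ids:
            selected.append(q)
        elif name_contains and name_contains.lower() in q['name'].lower():
            selected.append(q)
    return selected

# Stand-in data shaped like the /api/queries/my results:
sample = [{'id': 1, 'name': 'Daily revenue'},
          {'id': 2, 'name': 'Signups'},
          {'id': 3, 'name': 'Revenue by region'}]
print([q['id'] for q in filter_queries(sample, name_contains='revenue')])  # [1, 3]
```

You would then call `save_queries(filter_queries(export_queries, name_contains='revenue'))` instead of saving everything.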

Thanks!

Here’s the code. The import part isn’t doing anything; I believe the problem lies in upload_queries and get_query_content.

import click
import requests
import os
import json

template = u"""/*
Name: {name}
Data source: {data_source}
Created By: {created_by}
Last Update At: {last_updated_at}
*/
{query}"""

def get_queries(url, api):
    queries = []
    headers = {'Authorization': 'Key {}'.format(api)}
    path = '{}/api/queries/my'.format(url)
    more_page = True
    page = 1

    while more_page:
        response = requests.get(path, headers=headers, params={'page': page}).json()
        queries.extend(response['results'])
        # Keep paging until all `count` results have been fetched
        more_page = page * response['page_size'] + 1 <= response['count']
        page += 1

    return queries

def save_queries(queries):
    for query in queries:
        filename = 'query_{}.sql'.format(query['id'])
        # 'wb' so the UTF-8 encoded bytes write the same way on Python 2 and 3
        with open(filename, 'wb') as f:
            # Prepend the metadata header to the SQL
            content = template.format(name=query['name'],
                       data_source=query['data_source_id'],
                       created_by=query['user']['name'],
                       last_updated_at=query['updated_at'],
                       query=query['query'])
            f.write(content.encode('utf-8'))


def upload_queries(url, api):
    headers ={'Authorization': 'Key {}'.format(api), 'Content-Type': 'application/json'}
    # Retrieves all files in directory
    files = [f for f in os.listdir('.') if os.path.isfile(f)]

    for f in files:
        if f.startswith('query_') and f.endswith('.sql'):
            start = f.index('_') + 1
            end = f.index('.')
            query_id = f[start:end]
            path = "{}/api/queries/{}".format(url, query_id)
            query_content = get_query_content(f)
            query_info = {'query': query_content, 'id': int(query_id)}
            response = requests.post(path, headers = headers, data = json.dumps(query_info))

def get_query_content(filename):
    query = ''
    # Read the file, skipping the metadata header added on export
    with open(filename, 'r') as f:
        lines = f.readlines()
        # The header occupies lines 0-5 ('/*' ... '*/'); the query starts at index 6
        for i in range(6, len(lines)):
            query += lines[i]
    return query

@click.command()
@click.option('--choice', help = 'Export or Import')
@click.option('--export-url', help = 'Redash Export URL')
@click.option('--export-api-key', help = 'Export User API Key')
@click.option('--import-url', help = 'Redash Import URL')
@click.option('--import-api-key', help = 'Import User API Key')

def main(choice, export_url, export_api_key, import_url, import_api_key):
    choice = click.prompt('Export[1]/Import[2]')
    if choice == '1':
        #Exporting
        export_url = click.prompt('Paste Export Redash URL')
        export_api_key = click.prompt('Paste Export User API Key')
        export_queries = get_queries(export_url, export_api_key)
        save_queries(export_queries)

    if choice == '2':
        #Importing
        import_url = click.prompt('Paste Import Redash URL')
        import_api_key = click.prompt('Paste Import User API Key')
        upload_queries(import_url, import_api_key)
        
if __name__ == '__main__':
    main()

    

I’m getting a 404 response. My guess is that a query with the same ID has to already exist for the import to work? I am testing with 70 queries, and the new account has 0 queries created.

Is it because it is only possible to import back into the same account you exported from?
Is it possible to import into a whole new account?
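For what it’s worth, POST /api/queries/{id} appears to update an existing query, which would explain the 404 on an instance that has no queries yet. A sketch of creating the query instead via POST /api/queries — the exact payload fields may differ across Redash versions, the data_source_id must be one that exists on the target instance, and newly created queries typically arrive as unpublished drafts:

```python
import json
import requests

def build_create_request(url, name, query, data_source_id):
    # POST to /api/queries (no id in the path) asks the target instance to
    # create a brand-new query and assign its own id, rather than trying to
    # update a query id carried over from the old instance.
    path = '{}/api/queries'.format(url)
    payload = {'name': name,
               'query': query,
               'data_source_id': data_source_id,  # must exist on the TARGET instance
               'options': {}}
    return path, payload

def create_query(url, api_key, name, query, data_source_id=1):
    headers = {'Authorization': 'Key {}'.format(api_key),
               'Content-Type': 'application/json'}
    path, payload = build_create_request(url, name, query, data_source_id)
    response = requests.post(path, headers=headers, data=json.dumps(payload))
    response.raise_for_status()   # surface a 4xx/5xx instead of failing silently
    return response.json()['id']  # the id assigned by the target instance
```

The import loop would then call create_query with the SQL read from each file, ignoring the old query id embedded in the filename.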

Hi @nicholas, I’ve been playing around with the API a bit and wondered if you had any success with this. I’d like to help if I can.

I’m trying to upload an .sql file that I downloaded with save_queries.
I’m using upload_queries with the same URL and API key as the save function.
I’m getting 200 OK, but I can’t see any new query on my queries page.

def upload_queries(url, api_key):
    headers = {'Authorization': 'Key {}'.format(
        api_key), 'Content-Type': 'application/json'}
    # Retrieves all files in directory
    files = [f for f in os.listdir('./')]
    # files = [f for f in os.listdir('./') if os.path.isfile(f)]

    for f in files:
        if f.startswith('query_') and f.endswith('.sql'):
            print(f)
            start = f.index('_') + 1
            end = f.index('.')
            query_id = f[start:end]
            path = "{}/api/queries/{}".format(url, query_id)
            query_content = get_query_content(f)
            query_info = {'query': query_content, 'id': int(query_id)}
            print(query_info)
            response = requests.post(
                path, headers=headers, data=json.dumps(query_info))
            print(response)

def get_query_content(filename):
    query = ''
    # Read the file, skipping the metadata header added on export
    with open("./" + filename, 'r') as f:
        lines = f.readlines()
        # The header occupies lines 0-5 ('/*' ... '*/'); the query starts at index 6
        for i in range(6, len(lines)):
            query += lines[i]
    return query
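Counting header lines is fragile if the template ever changes; splitting on the closing */ delimiter is more robust. A small sketch, assuming the header format produced by the export template above:

```python
def split_header(content):
    # The export template ends its comment header with '*/' on its own line;
    # everything after that line is the SQL itself. If no header is found,
    # return the content unchanged.
    marker = '*/\n'
    pos = content.find(marker)
    return content[pos + len(marker):] if pos != -1 else content

sql = split_header('/*\nName: test\n*/\nSELECT 1;')
print(sql)  # SELECT 1;
```

get_query_content could then read the whole file and call split_header instead of slicing by line index.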

Thanks for your response @shay. What platform are you running this on? Windows?

I’m running on Ubuntu using Python 2.7.