Guys,

The folder below contains a lot of heavy files (each more than 1 GB in size), occupying 40 GB in total:

/var/lib/postgresql/9.3/main/base/

Can you help me understand what exactly is getting saved in this path?

Thanks,
Rohan

That path is where PostgreSQL stores its actual table and index data files. Debugging further, we found that the query_results table in the Redash PostgreSQL database is occupying all the space.
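
For anyone hitting the same thing, a quick way to confirm which tables are taking the space is to ask PostgreSQL directly. This is a minimal sketch, assuming psycopg2 is available and the default local Redash credentials (adjust dbname, user, and host to your deployment):

# Sketch: list the ten largest tables in the Redash metadata database.
import psycopg2

conn = psycopg2.connect(dbname="redash", user="redash", host="localhost")
cur = conn.cursor()
cur.execute("""
    SELECT relname, pg_size_pretty(pg_total_relation_size(relid))
    FROM pg_catalog.pg_statio_user_tables
    ORDER BY pg_total_relation_size(relid) DESC
    LIMIT 10
""")
for table_name, total_size in cur.fetchall():
    print(table_name, total_size)
cur.close()
conn.close()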

Any idea how to clean it up? What will the impact be?

There is a periodic task that cleans up unused query results. You should probably change your settings so that it runs more aggressively.
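
For reference, the knobs this task reads live in Redash's settings and come from environment variables. This is approximately how redash/settings.py defines them (names and defaults are from memory, so verify against your version; parse_boolean is a small helper defined in that same file):

import os

QUERY_RESULTS_CLEANUP_ENABLED = parse_boolean(
    os.environ.get("REDASH_QUERY_RESULTS_CLEANUP_ENABLED", "true"))
QUERY_RESULTS_CLEANUP_COUNT = int(
    os.environ.get("REDASH_QUERY_RESULTS_CLEANUP_COUNT", "100"))  # rows deleted per run
QUERY_RESULTS_CLEANUP_MAX_AGE = int(
    os.environ.get("REDASH_QUERY_RESULTS_CLEANUP_MAX_AGE", "7"))  # minimum age in days

Raising REDASH_QUERY_RESULTS_CLEANUP_COUNT and/or lowering REDASH_QUERY_RESULTS_CLEANUP_MAX_AGE in the worker's environment makes each run reclaim more space.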

You would probably also want to run it manually in a loop to clean up the current backlog.
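
As a rough sketch of such a loop: from a Python shell inside the Redash environment (depending on your version you may need to set up a Flask app context first), call the task function synchronously until a pass finds nothing left to delete:

# Sketch: drain the backlog by invoking the cleanup task repeatedly.
# Assumes the redash package is importable in this Python environment.
from redash import models, settings
from redash.tasks import cleanup_query_results

while True:
    remaining = models.QueryResult.unused(
        settings.QUERY_RESULTS_CLEANUP_MAX_AGE).count()
    if remaining == 0:
        break
    cleanup_query_results()  # deletes at most QUERY_RESULTS_CLEANUP_COUNT rows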

This is how the cleanup task currently looks in Redash. Can you please help us understand what needs to be changed for the periodic cleanup?

@celery.task(name="redash.tasks.cleanup_query_results")
def cleanup_query_results():
    """
    Job to cleanup unused query results -- such that no query links to them anymore, and older than
    settings.QUERY_RESULTS_MAX_AGE (a week by default, so it's less likely to be open in someone's browser and be used).

    Each time the job deletes only settings.QUERY_RESULTS_CLEANUP_COUNT (100 by default) query results so it won't choke
    the database in case of many such results.
    """

    logging.info("Running query results clean up (removing maximum of %d unused results, that are %d days old or more)",
                 settings.QUERY_RESULTS_CLEANUP_COUNT, settings.QUERY_RESULTS_CLEANUP_MAX_AGE)

    unused_query_results = models.QueryResult.unused(settings.QUERY_RESULTS_CLEANUP_MAX_AGE).limit(settings.QUERY_RESULTS_CLEANUP_COUNT)
    deleted_count = models.QueryResult.query.filter(
        models.QueryResult.id.in_(unused_query_results.subquery())
    ).delete(synchronize_session=False)
    models.db.session.commit()
    logger.info("Deleted %d unused query results.", deleted_count)

Could you elaborate on where to run it, and how to run it manually?

This will depend entirely on how you deployed Redash.

We deployed via the newer one-click setup, the container-based image on AWS.
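
In that case the Redash processes run in Docker containers, so you would exec into the worker container and run the drain loop from earlier in the thread: find the container name with docker ps, open a Python shell inside it with docker exec -it <worker-container> python (the placeholder name is an assumption; the exact name depends on your compose setup), and paste the loop there.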