Question regarding large metadata tables

Hey! I’m an engineer on a team that provides Redash as a service for over 100 customers! We appreciate the open source model and have made some changes to fit our use cases. One such change is retaining unused query results in the query_results metadata table. We want to perform analytics on the queries our customers are writing and identify the most- and least-used tables and columns. To do that, we’ve modified the delete in the maintenance.py file so that, instead of deleting unused query result rows, it updates the ‘data’ column to an empty string (since the actual query results are irrelevant to us).
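For context, here’s roughly the effect of our change, sketched against a throwaway SQLite table. This is illustrative only: the real Redash schema lives in PostgreSQL and the actual maintenance.py code differs, so the table and column names here are simplified stand-ins.

```python
import sqlite3

# Illustrative stand-in for the query_results metadata table
# (the real Redash schema is richer and lives in PostgreSQL).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE query_results (
        id INTEGER PRIMARY KEY,
        query TEXT,        -- the SQL text we want to keep for analytics
        data TEXT,         -- the (potentially large) result payload
        retrieved_at TEXT
    )
""")
conn.executemany(
    "INSERT INTO query_results (query, data, retrieved_at) VALUES (?, ?, ?)",
    [
        ("SELECT * FROM orders", '{"rows": [...]}', "2020-01-01"),
        ("SELECT id FROM users", '{"rows": [...]}', "2020-01-02"),
    ],
)

# Stock maintenance would delete unused rows outright:
#   DELETE FROM query_results WHERE <unused>;
# Our change keeps each row (and its query text) but drops the payload:
conn.execute(
    "UPDATE query_results SET data = '' WHERE retrieved_at < '2020-01-02'"
)
conn.commit()

# The query text survives for analytics; the bulky data column is emptied.
for row in conn.execute("SELECT query, data FROM query_results ORDER BY id"):
    print(row)
```

The upshot is that the table keeps one row per result forever (minus the payload), which is what prompts the question below.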

I’ve seen topics here mentioning issues with large query_results tables, but my question is: can a large metadata table affect Redash performance for our customers? I realize rows are inserted into that table every time a customer runs a query, but I want to make sure customers won’t face any issues as a result of this change.