Redash Migrations Issue (Docker v3 to v5)


I had a working v3 build of Redash running on Docker.

To get to the latest v5 I stopped all my Docker instances and ran

docker-compose run --rm server manage db upgrade

When I start docker-compose again I get this stack trace:

worker_1  | [2018-09-27 03:15:14,806][PID:1][ERROR][MainProcess] Task redash.tasks.refresh_queries[d1658aea-325c-43f2-8bbc-55d57dc09760] raised unexpected: ProgrammingError('(psycopg2.ProgrammingError) column queries.search_vector does not exist\nLINE 1: ...ule_failures, queries.options AS queries_options,\n                                                             ^\n',)
worker_1  | Traceback (most recent call last):
worker_1  |   File "/usr/local/lib/python2.7/dist-packages/celery/app/", line 240, in trace_task
worker_1  |     R = retval = fun(*args, **kwargs)
worker_1  |   File "/app/redash/", line 71, in __call__
worker_1  |     return TaskBase.__call__(self, *args, **kwargs)
worker_1  |   File "/usr/local/lib/python2.7/dist-packages/celery/app/", line 438, in __protected_call__
worker_1  |     return*args, **kwargs)
worker_1  |   File "/app/redash/tasks/", line 275, in refresh_queries
worker_1  |     for query in models.Query.outdated_queries():
worker_1  |   File "/app/redash/", line 1010, in outdated_queries
worker_1  |     for query in queries:
worker_1  |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/", line 2925, in __iter__
worker_1  |     return self._execute_and_instances(context)
worker_1  |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/", line 2948, in _execute_and_instances
worker_1  |     result = conn.execute(querycontext.statement, self._params)
worker_1  |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/", line 948, in execute
worker_1  |     return meth(self, multiparams, params)
worker_1  |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/sql/", line 269, in _execute_on_connection
worker_1  |     return connection._execute_clauseelement(self, multiparams, params)
worker_1  |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/", line 1060, in _execute_clauseelement
worker_1  |     compiled_sql, distilled_params
worker_1  |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/", line 1200, in _execute_context
worker_1  |     context)
worker_1  |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/", line 1413, in _handle_dbapi_exception
worker_1  |     exc_info
worker_1  |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/util/", line 203, in raise_from_cause
worker_1  |     reraise(type(exception), exception, tb=exc_tb, cause=cause)
worker_1  |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/", line 1193, in _execute_context
worker_1  |     context)
worker_1  |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/", line 507, in do_execute
worker_1  |     cursor.execute(statement, parameters)
worker_1  | ProgrammingError: (psycopg2.ProgrammingError) column queries.search_vector does not exist
worker_1  | LINE 1: ...ule_failures, queries.options AS queries_options,
worker_1  |                                                              ^
worker_1  |  [SQL: 'SELECT queries.query AS queries_query, queries.updated_at AS queries_updated_at, queries.created_at AS queries_created_at, AS queries_id, queries.version AS queries_version, queries.org_id AS queries_org_id, queries.data_source_id AS queries_data_source_id, queries.latest_query_data_id AS queries_latest_query_data_id, AS queries_name, queries.description AS queries_description, queries.query_hash AS queries_query_hash, queries.api_key AS queries_api_key, queries.user_id AS queries_user_id, queries.last_modified_by_id AS queries_last_modified_by_id, queries.is_archived AS queries_is_archived, queries.is_draft AS queries_is_draft, queries.schedule AS queries_schedule, queries.schedule_failures AS queries_schedule_failures, queries.options AS queries_options, queries.search_vector AS queries_search_vector, queries.tags AS queries_tags, AS query_results_1_id, query_results_1.retrieved_at AS query_results_1_retrieved_at \nFROM queries LEFT OUTER JOIN query_results AS query_results_1 ON = queries.latest_query_data_id \nWHERE queries.schedule IS NOT NULL ORDER BY'] (Background on this error at:
worker_1  | [2018-09-27 03:15:14,814][PID:1][INFO][MainProcess] Task redash.tasks.cleanup_tasks[5048af86-525d-4c1c-85af-18d87f7ed370] succeeded in 0.00293156299995s: None

The Queries Table


When I start a new database instance from scratch (without migrating), everything seems to work fine, but I need to migrate my existing database.
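One way to see how far the schema actually got: Redash's manage script wraps Flask-Migrate, so (assuming your build exposes the stock subcommands, which is an assumption on my part) you can ask the database which Alembic revision it is at and compare it to the newest revision shipped in the image:

```shell
# Assumption: standard docker-compose setup with a "server" service.
# "db current" / "db heads" are Flask-Migrate's stock revision queries.
docker-compose run --rm server manage db current   # revision the DB is at
docker-compose run --rm server manage db heads     # newest revision in the image
```

If `current` is behind `heads`, the `db upgrade` never completed (or ran against the wrong database).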



docker run -it redash/redash:latest sh


cat redash/

and I can see search_vector in the Query class.

So I believe the Docker image itself is upgraded properly.
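That matches the error: the model has the column but the table does not. A quick way to check the database side directly is to query Postgres for the column (the service name, user, and database below are the defaults from the standard docker-compose setup and may differ in yours):

```shell
# Assumption: compose service named "postgres" with the default
# postgres user and database; adjust to your setup.
docker-compose exec postgres psql -U postgres -c \
  "SELECT column_name FROM information_schema.columns
   WHERE table_name = 'queries' AND column_name = 'search_vector';"
# Zero rows back => the migration that adds search_vector never ran.
```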


It seems the v3 → v5 migration has issues. Here is my


Hi @parasiil and @srinathganesh,

I just did the same upgrade an hour ago.
I had to run a two-step upgrade: first upgrade to Redash v4 and run the database upgrade, then move to v5 and upgrade the DB again.
Everything went fine.


Thanks, I’ll try that.


How did you pull v4 for the upgrade? Is there a specific --channel argument?


Following this doc:
I changed the tag of the Redash image in my docker-compose.yml from latest to 4.0.0.b3168,
and afterwards I changed the tag from 4.0.0.b3168 to 5.0.0.b4754.
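Put together, the two-step path can be sketched like this. The tag values come from the post above; the stand-in docker-compose.yml and the sed edits are my assumptions about where the tag lives in your file:

```shell
# Stand-in for the relevant part of docker-compose.yml (assumption:
# your real file pins the image on the server and worker services).
cat > docker-compose.yml <<'EOF'
services:
  server:
    image: redash/redash:latest
  worker:
    image: redash/redash:latest
EOF

# Step 1: pin the last v4 release everywhere the image is referenced...
sed -i 's|redash/redash:latest|redash/redash:4.0.0.b3168|' docker-compose.yml
# ...then (outside this sketch): docker-compose up -d
#                                docker-compose run --rm server manage db upgrade

# Step 2: move on to v5 and run the migrations again the same way.
sed -i 's|redash/redash:4.0.0.b3168|redash/redash:5.0.0.b4754|' docker-compose.yml
grep 'image:' docker-compose.yml
```

The point is simply that the v5 migrations assume the v4 schema is already in place, so each tag change is followed by its own `manage db upgrade`.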

And here are the available Docker image tags:


Yeah, that’s easy with Docker. I am using a direct install and have forgotten how to pull earlier versions.


found it


v3 -> v4 -> v5 works


How did you upgrade from 3 to 4? I installed 3 with the provisioning script, with a separate psql database. When I run bin/upgrade it wants to install 5 straight away, and it fails.


Hope you did a backup!

I use Docker, so my docker-compose.yaml file has something like

    image: redash/redash:5.0.0.b4754
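On the backup point: a minimal sketch, assuming the default compose service and user names from the standard setup, is to dump the database from the postgres container before touching the image tag:

```shell
# Assumption: compose service "postgres" with the default postgres user/db.
docker-compose exec postgres pg_dump -U postgres postgres > redash_backup.sql

# Restore later with:
#   cat redash_backup.sql | docker-compose exec -T postgres psql -U postgres postgres
```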


I’m trying the upgrade on a snapshot of the DB, so there’s plenty of room for failures.


Also, I had this problem: I was messing with the upgrade and my copy of the database somehow got corrupted, so the upgrades failed. Then I restored the database and the upgrade worked fine. See the logs above.