What's the problem when I use SQL with Chinese?


#1

Hello all,
When I use a Hive data source (Spark Thrift Server) with Redash and my SQL contains Chinese words, I get an encoding error, but the same query works fine against HiveServer2 databases. Can you help me figure out what the problem is?
```
worker_1 | [2018-11-15 07:06:59,656][PID:1][ERROR][MainProcess] Task redash.tasks.execute_query[b2d23c56-7799-4f8b-bac2-f18fda0b5ccf] raised unexpected: QueryExecutionError(u"'ascii' codec can't encode characters in position 36-41: ordinal not in range(128)",)
worker_1 | Traceback (most recent call last):
worker_1 |   File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 240, in trace_task
worker_1 |     R = retval = fun(*args, **kwargs)
worker_1 |   File "/app/redash/worker.py", line 71, in __call__
worker_1 |     return TaskBase.__call__(self, *args, **kwargs)
worker_1 |   File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 438, in __protected_call__
worker_1 |     return self.run(*args, **kwargs)
worker_1 |   File "/app/redash/tasks/queries.py", line 528, in execute_query
worker_1 |     scheduled_query).run()
worker_1 |   File "/app/redash/tasks/queries.py", line 470, in run
worker_1 |     raise result
worker_1 | QueryExecutionError: 'ascii' codec can't encode characters in position 36-41: ordinal not in range(128)
```
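For context, this is the classic Python 2 behavior where an implicit conversion to a byte string uses the `ascii` codec, which cannot represent Chinese characters. A minimal sketch of the failure mode (written in Python 3 syntax; in Redash's Python 2.7 workers the encode happens implicitly rather than via an explicit `.encode()` call):

```python
# Sketch of the failure: the 'ascii' codec cannot encode Chinese
# characters, while UTF-8 handles them fine.
text = "message=数据获取成功"  # a literal like the one in the failing query

try:
    text.encode("ascii")
except UnicodeEncodeError as exc:
    print("encode failed:", exc)  # 'ascii' codec can't encode characters ...

# Encoding with UTF-8 round-trips cleanly:
print(text.encode("utf-8").decode("utf-8") == text)  # True
```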


#2

Hi.

Can you tell us your Redash version and query?


#3

Hello, my Redash version is 5.0.2, and the query is:

```sql
SELECT to_date(create_time) AS day,
       CASE WHEN return_data LIKE '%message=数据获取成功%' THEN '成功' ELSE '失败' END,
       count(*) AS count
FROM test
GROUP BY to_date(create_time),
         CASE WHEN return_data LIKE '%message=数据获取成功%' THEN '成功' ELSE '失败' END
```

(数据获取成功 = "data fetched successfully"; 成功 = "success"; 失败 = "failure")
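A small observation: the error reports positions 36–41, a span of six characters, which happens to match the length of the Chinese literal 数据获取成功 in the query. That is consistent with the Chinese text being what trips the `ascii` codec. A quick check:

```python
# The Chinese literal from the query is six characters long -
# the same width as the span (36-41) the UnicodeEncodeError reports.
literal = "数据获取成功"  # "data fetched successfully"
print(len(literal))  # 6
```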


#4

What about replacing the curly quotes (‘ ’) with straight ASCII quotes (')?


#5

It shouldn't be a problem with that symbol: the query executes fine if I just remove the Chinese words.


#6

Sorry for the late reply. I couldn't reproduce the error.