dribble
December 18, 2019, 7:18pm
The export to CSV/XLS works fine unless I am exporting a very large file, which produces a 502 error. I assume it is just timing out. Is there an env variable that I need to change? I can’t find anything on this forum about this issue.
Thanks.
This also happens to me when I try to export a very large file to XLS, but it always works fine if I export to CSV. I am using hosted Redash.
dribble
December 19, 2019, 6:04am
My guess is that we have the same issue; your CSV export only succeeds because the file is small enough to download before the timeout.
Hi @dribble,
Did you check Gunicorn’s configuration?
I guess that’s one of the reasons you are stuck with the 502 error.
Here is my blog entry on the topic (it’s written in Japanese, but Google Translate should help):
http://ariarijp.hatenablog.com/entry/2019/06/07/234851
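For self-hosted Docker setups, the fix in the blog post amounts to raising Gunicorn’s worker timeout. One way to do this without editing the image is Gunicorn’s own `GUNICORN_CMD_ARGS` environment variable (supported since Gunicorn 19.7). Whether the Redash entrypoint launches the server through plain `gunicorn` (which is what makes this variable take effect) depends on the image version, so treat this as a sketch:

```yaml
# docker-compose.yml (fragment) -- raise the worker timeout to 5 minutes.
# GUNICORN_CMD_ARGS is read by Gunicorn itself; verify that your Redash
# image starts the server via gunicorn before relying on this.
services:
  server:
    environment:
      GUNICORN_CMD_ARGS: "--timeout 300"
```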
Rob
March 12, 2020, 1:04pm
Thank you so much @ariarijp !
This solved our 502 issue regarding large xlsx files!
Changing the environment variables in the docker-compose.yml file only raises the limit up to 60 seconds, because the request is also limited by nginx (whose `proxy_read_timeout` defaults to 60s). So if you want to export a file that takes more than 60 seconds to generate, you also need to rebuild the nginx image with a larger `proxy_read_timeout`:
```nginx
upstream redash {
  server redash:5000;
}

server {
  listen 80 default;

  gzip on;
  gzip_types *;
  gzip_proxied any;

  location / {
    proxy_read_timeout 300;
    proxy_set_header Host $http_host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $http_x_forwarded_proto;
    proxy_pass http://redash;
  }
}
```
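Rebuilding the nginx image is not strictly necessary; bind-mounting a custom config file achieves the same result. A sketch (the host path and service name are assumptions — adjust them to your compose file):

```yaml
# docker-compose.yml (fragment) -- mount a custom nginx config instead of
# rebuilding the image. ./nginx/redash.conf is a hypothetical host path
# containing the server block above with proxy_read_timeout 300.
services:
  nginx:
    image: nginx:latest
    ports:
      - "80:80"
    volumes:
      - ./nginx/redash.conf:/etc/nginx/conf.d/default.conf:ro
```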
Related GitHub issue (opened 02:36PM - 02 Oct 21 UTC, closed 03:08PM - 02 Oct 21 UTC):
### Issue Summary
I created a query that returned about 320M rows. The query executed successfully and showed the correct result on the page, but when I try to export the result to a CSV file, it fails after exactly 30 seconds every time and shows 502 Bad Gateway in the Chrome Dev Tools. Is there any way to improve exporting of large CSV files, or a workaround?
### Steps to Reproduce
1. Create a query that returns many rows, and execute it.
![image](https://user-images.githubusercontent.com/8188177/135721108-55204f67-a9fe-4cad-997c-a0396acd38a9.png)
2. Export the result to CSV format.
![image](https://user-images.githubusercontent.com/8188177/135721065-fdc4abef-8884-4ff2-99bf-44e1769cd14a.png)
3. Here are the logs from the nginx and server containers.
```bash
2021/10/02 13:39:50 [error] 31#31: *1 upstream prematurely closed connection while reading response header from upstream, client: XXX.XXX.XXX.XXX, server: , request: "GET /api/queries/XXX/results/XXXX961.csv HTTP/1.1", upstream: "http://XXX.XXX.XXX.XXX:5000/api/queries/XXX/results/XXXX961.csv", host: "XXX.XXX.XXX.XXX"
XXX.XXX.XXX.XXX - - [02/Oct/2021:13:39:50 +0000] "GET /api/queries/XXX/results/XXXX961.csv HTTP/1.1" 502 559 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/94.0.4606.61 Safari/537.36" "-"
```
```bash
[2021-10-02 13:39:50,388][PID:243][INFO][metrics] method=GET path=/api/queries/XXX/results/XXXX961.csv endpoint=query_result status=500 content_type=? content_length=-1 duration=30300.16 query_count=10 query_duration=1415.56
[2021-10-02 13:39:50 +0000] [243] [INFO] Worker exiting (pid: 243)
```
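The `duration=30300.16` in the server log lines up with Gunicorn’s default worker `--timeout` of 30 seconds: the sync worker is killed while it is still serializing the result, and nginx turns the dropped connection into a 502. As an illustrative sketch (not Redash’s actual code), generating the CSV incrementally with a generator is the usual pattern for keeping memory bounded on very large exports; the worker timeout still needs to be raised for slow serializations:

```python
import csv
import io

def iter_csv(rows, header):
    """Yield a large CSV one line at a time instead of building it in memory.

    A framework can stream these chunks straight to the client, so the
    response starts flowing before the whole file is serialized.
    """
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(header)
    yield buf.getvalue()
    for row in rows:
        buf.seek(0)
        buf.truncate(0)
        writer.writerow(row)
        yield buf.getvalue()

# Usage: consume the stream chunk by chunk.
for chunk in iter_csv([(1, "a"), (2, "b")], ["id", "name"]):
    print(chunk, end="")
```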
### Technical details:
* Redash Version: V10 (b50363)
* Browser/OS: Chrome 94
* How did you install Redash: Docker