supabase / supabase

The open source Firebase alternative. Supabase gives you a dedicated Postgres database to build your web, mobile, and AI applications.
https://supabase.com
Apache License 2.0

Unable to Upload large zip file of zip files over 5gb on production #27959

Closed · j-mendez closed this 2 months ago

j-mendez commented 4 months ago

Bug Report

Uploading a zip file against the local Supabase instance works fine. On the hosted service it does not; the request just spins forever. How can we get the GUI to upload large zip files?

All of the settings that allow large uploads are enabled (both file upload limits: the global one and the bucket-level one).
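For context, a minimal sketch of how those two limits and a plain upload are usually exercised with supabase-js. The project URL, key, bucket name `archives`, file path, and the 6 GB value below are placeholders, not the reporter's actual configuration; the global limit itself can only be changed in the dashboard's storage settings.

```ts
import { createClient } from '@supabase/supabase-js'
import { readFile } from 'node:fs/promises'

// Hypothetical project URL and service-role key.
const supabase = createClient('https://YOUR_PROJECT.supabase.co', 'SERVICE_ROLE_KEY')

async function main() {
  // Bucket-level file size limit (one of the two settings mentioned above).
  await supabase.storage.updateBucket('archives', {
    public: false,
    fileSizeLimit: '6GB',
  })

  // Plain (non-resumable) upload. Loading a multi-GB file into memory is only
  // for illustration; this single-request path is the one that hangs / 502s here.
  const data = await readFile('./sites.zip')
  const { error } = await supabase.storage
    .from('archives')
    .upload('exports/sites.zip', data, { contentType: 'application/zip', upsert: true })

  if (error) console.error('upload failed:', error.message)
}

main()
```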

Hallidayo commented 4 months ago

Hi @j-mendez - Thank you for opening this one. A couple of questions:

  • When you try the upload do you get any errors?
  • What's the upload setting you have here?

j-mendez commented 4 months ago

Hi, no problem. Using the UI I don't get any errors; with the CLI I get a 502 (Cloudflare). I have a 6 GB upload limit set. The connection is fiber, and the project was scaled to a very large compute size to rule out a resource issue.

With the 502/520 errors from Cloudflare, uploads may need to be handled differently on the service side so the configured upload limit can actually be used. Locally, the same upload took about 15 seconds against the local instance.

Let me know if you need any more details.

j-mendez commented 4 months ago

Email support replicated the issue. I'm not sure whether they were going to file the bug here or contact the devs directly; if they did, excuse the duplicate ticket.

Hallidayo commented 4 months ago

Hi @j-mendez - Ahh cool, no problem. We'll keep it open, but could you update us when support gets back to you?

j-mendez commented 4 months ago

Yes, will do.

j-mendez commented 3 months ago

The storage side seems to be very flaky. I tried writing a script to recursively list and delete files, and it works until it has done too much; there seems to be a throughput limit, and after about 1 GB it slows down or fails.
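For context, a rough sketch of the kind of script meant here, using supabase-js with a hypothetical bucket `archives`. Treating entries that come back without an `id` as folder placeholders is a common heuristic, not an official contract, and deletes are batched because single huge `remove()` calls are where things tend to fall over.

```ts
import { createClient } from '@supabase/supabase-js'

const supabase = createClient('https://YOUR_PROJECT.supabase.co', 'SERVICE_ROLE_KEY')
const BUCKET = 'archives' // hypothetical bucket name

// Recursively collect every object path under a prefix, paging through list().
async function listAll(prefix: string): Promise<string[]> {
  const paths: string[] = []
  let offset = 0
  const limit = 100

  while (true) {
    const { data, error } = await supabase.storage.from(BUCKET).list(prefix, { limit, offset })
    if (error) throw error
    if (!data || data.length === 0) break

    for (const entry of data) {
      const full = prefix ? `${prefix}/${entry.name}` : entry.name
      // Entries without an id are (in practice) folder placeholders; recurse into them.
      if (!entry.id) {
        paths.push(...(await listAll(full)))
      } else {
        paths.push(full)
      }
    }
    offset += data.length
  }
  return paths
}

async function deleteFolder(prefix: string) {
  const paths = await listAll(prefix)
  // Delete in small batches rather than one giant remove() call.
  for (let i = 0; i < paths.length; i += 100) {
    const { error } = await supabase.storage.from(BUCKET).remove(paths.slice(i, i + 100))
    if (error) console.error('batch failed:', error.message)
  }
}

deleteFolder('amazon.com')
```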

At what point do we just skip Supabase Storage and grab the S3 key directly? If I have to go through Cloudflare and get rate limited each time, I guess that's the only workaround?
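On the "grab the S3 key directly" point: Supabase Storage does expose an S3-compatible endpoint, so a standard multipart upload with the AWS SDK is one way around single-request limits. A sketch, assuming S3 access keys have been created in the project's Storage settings; the endpoint, region, keys, bucket, and paths below are placeholders.

```ts
import { S3Client } from '@aws-sdk/client-s3'
import { Upload } from '@aws-sdk/lib-storage'
import { createReadStream } from 'node:fs'

// Placeholders: endpoint, region, and keys come from the project's S3 connection
// settings; 'archives' is a hypothetical bucket.
const s3 = new S3Client({
  endpoint: 'https://YOUR_PROJECT.supabase.co/storage/v1/s3',
  region: 'us-east-1',
  credentials: { accessKeyId: 'S3_ACCESS_KEY_ID', secretAccessKey: 'S3_SECRET_ACCESS_KEY' },
  forcePathStyle: true,
})

// Multipart upload streams the file in parts, so no single request has to carry
// the whole 5+ GB body through the load balancer in one go.
const upload = new Upload({
  client: s3,
  params: {
    Bucket: 'archives',
    Key: 'exports/sites.zip',
    Body: createReadStream('./sites.zip'),
    ContentType: 'application/zip',
  },
  partSize: 50 * 1024 * 1024, // 50 MB parts
  queueSize: 4,               // up to 4 parts in flight at a time
})

upload.on('httpUploadProgress', (p) => console.log('uploaded bytes:', p.loaded))
await upload.done()
```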

alaister commented 3 months ago

Hi @j-mendez,

Regarding uploading large files through the dashboard specifically: We're now using the TUS protocol in our storage explorer, which should make uploading large files reliable.

Please give it another go and let me know if you're still running into issues!
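For anyone hitting the same limits outside the dashboard, the resumable path can also be driven directly with a TUS client. A browser-side sketch, assuming tus-js-client and Supabase's documented `/storage/v1/upload/resumable` endpoint; the project URL, key, input element, bucket, and object name are placeholders.

```ts
import * as tus from 'tus-js-client'

// Placeholders: project URL, anon key (or a signed-in user's access token),
// and a hypothetical <input type="file" id="zip-input"> element.
const PROJECT_URL = 'https://YOUR_PROJECT.supabase.co'
const ANON_KEY = 'SUPABASE_ANON_KEY'

const input = document.querySelector<HTMLInputElement>('#zip-input')!
const file = input.files![0]

const upload = new tus.Upload(file, {
  endpoint: `${PROJECT_URL}/storage/v1/upload/resumable`,
  retryDelays: [0, 3000, 5000, 10000, 20000],
  headers: {
    authorization: `Bearer ${ANON_KEY}`,
    'x-upsert': 'true',
  },
  uploadDataDuringCreation: true,
  metadata: {
    bucketName: 'archives',        // hypothetical bucket
    objectName: 'exports/sites.zip',
    contentType: 'application/zip',
  },
  // Supabase's resumable-upload docs call for 6 MB chunks.
  chunkSize: 6 * 1024 * 1024,
  onError: (err) => console.error('upload failed:', err),
  onProgress: (sent, total) => console.log(`uploaded ${sent} of ${total} bytes`),
  onSuccess: () => console.log('upload complete'),
})

upload.start()
```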

j-mendez commented 3 months ago

Hi yes, still getting issues.

Screen.Recording.2024-07-17.at.6.35.28.AM.mov: https://github.com/user-attachments/assets/9595e6df-f628-4d87-a959-afedea8f45bd

j-mendez commented 3 months ago

There are only 2 files in the folder amazon.com, and it took about 20 seconds for the delete to process and then fail. On the delete experience: it would be nice to be able to clear a whole folder at once. I've spent the past 14 hours cleaning up data from the dashboard, with extreme flakiness and very little clarity about what is going on. You are using Cloudflare for load balancing, which causes major issues with uploads. Why not just use Cloudflare R2, instead of adding a Cloudflare layer that makes S3 unusable for any real workload beyond a few kilobytes or megabytes?

j-mendez commented 3 months ago

I realized this ticket is about uploading, not deleting records, so I made another ticket for that. I will test this out.

Hallidayo commented 2 months ago

Hi @j-mendez - Just checking in here: are you still experiencing the error with uploading?