forscht / ddrive

Discord as a filesystem.
https://ddrive.forscht.dev
MIT License

Files larger than 100MB are not supported. #49

Open idanyas opened 1 year ago

idanyas commented 1 year ago

I use a user token that has Nitro and can upload files up to 500 MB. Uploads through Discord itself work fine.

But in ddrive, when I try to upload a file with 100MB+ chunks, I get an error.

When configured like this: const chunkSize = 104857601

  http POST/file3 +3s
  discordFS >> [ADD] in progress : /file3 +3s
  error === Begin Error ===
  error ---
  error Error: Request entity too large
  error method : POST
  error url : /file3
  error Stack: DiscordAPIError[40005]: Request entity too large
  error     at SequentialHandler.runRequest (/root/.idanya/node_modules/@discordjs/rest/dist/index.js:667:15)
  error     at processTicksAndRejections (node:internal/process/task_queues:96:5)
  error     at async SequentialHandler.queueRequest (/root/.idanya/node_modules/@discordjs/rest/dist/index.js:464:14)
  error     at async REST.request (/root/.idanya/node_modules/@discordjs/rest/dist/index.js:910:22)
  error     at async DiscordFS.createFileChunk (/root/.idanya/node_modules/@forscht/ddrive/src/discordFS/index.js:265:25)
  error     at async StreamChunker.chunkProcessor (/root/.idanya/node_modules/@forscht/ddrive/src/discordFS/index.js:212:27)
  error ---
  error === End Error === +12s

At the same time, if I set the chunkSize to exactly 100MB, the files are uploaded successfully.

When configured like this: const chunkSize = 104857600

  http POST/file3 +3s
  discordFS >> [ADD] in progress : /file3 +3s
  discordFS >> [ADD] completed   : /file3 +21s
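
For reference, the cutoff in plain byte arithmetic (a quick sketch, constant names are just for illustration):

  // 100 MiB is the largest chunk size that currently goes through
  const MiB = 1024 * 1024
  const chunkSize = 100 * MiB        // 104857600 -> upload completes
  // const chunkSize = 100 * MiB + 1 // 104857601 -> "Request entity too large"
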
forscht commented 1 year ago

Can you test with a 495MB chunk size?

idanyas commented 1 year ago

Using const chunkSize = 519045120, I get:

On the server:

  discordFS >> [ADD] in progress : /bigfile-2gb +7s
  error === Begin Error ===
  error ---
  error Error: 
  error method : POST
  error url : /bigfile-2gb
  error Stack: HTTPError
  error     at SequentialHandler.runRequest (/root/.idanya/node_modules/@discordjs/rest/dist/index.js:660:13)
  error     at processTicksAndRejections (node:internal/process/task_queues:96:5)
  error     at async SequentialHandler.queueRequest (/root/.idanya/node_modules/@discordjs/rest/dist/index.js:464:14)
  error     at async REST.request (/root/.idanya/node_modules/@discordjs/rest/dist/index.js:910:22)
  error     at async DiscordFS.createFileChunk (/root/.idanya/node_modules/@forscht/ddrive/src/discordFS/index.js:265:25)
  error     at async StreamChunker.chunkProcessor (/root/.idanya/node_modules/@forscht/ddrive/src/discordFS/index.js:212:27)
  error ---
  error === End Error === +0ms

On the client:

root@client ~ # curl -X POST -T bigfile-2gb -u admin:password http://ip:port
Internal server error

The file I'm uploading is exactly 2GB. I'm running Node.js v16.18.1 and the latest version of ddrive from the repository.

forscht commented 1 year ago

Try with chunkSize = 400000000

idanyas commented 1 year ago

The results are exactly the same... 😕

forscht commented 1 year ago

Maybe Discord is using a different API for Nitro users. Can we connect over Discord to discuss this further?

UnlimitedBytes commented 1 year ago

The issue here is as follows: DDrive currently uses webhooks (https://discord.com/api/webhooks/{webhook-id}/{webhook-token}) to upload files. Like all Discord routes, this route sits behind Cloudflare (https://cloudflare.com), a security service Discord uses. Cloudflare enforces a 100 MB request limit, which blocks every request to Discord whose content is larger than 100MB.
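
To make the limit concrete, here is a minimal sketch of the webhook upload path with a size guard. It assumes Node 18+ (global fetch/FormData/Blob); uploadChunkViaWebhook and WEBHOOK_BODY_LIMIT are illustrative names, not DDrive's actual code, and my understanding is that the webhook execute endpoint accepts multipart form-data with files[n] fields and ?wait=true to return the created message.

  // Illustrative only: shows why any chunk at or above ~100 MB dies in front of Discord.
  const WEBHOOK_BODY_LIMIT = 100 * 1024 * 1024

  async function uploadChunkViaWebhook(webhookUrl, chunkBuffer, filename) {
      if (chunkBuffer.length >= WEBHOOK_BODY_LIMIT) {
          throw new Error('Chunk exceeds the ~100 MB Cloudflare request limit on webhook uploads')
      }
      const form = new FormData()
      form.append('files[0]', new Blob([chunkBuffer]), filename)
      const res = await fetch(`${webhookUrl}?wait=true`, { method: 'POST', body: form })
      if (!res.ok) throw new Error(`Webhook upload failed: ${res.status}`)
      return res.json() // message object containing the attachment URL
  }
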

The Discord Nitro upload uses a different approach than bot/webhook uploads: it uploads directly to a Google Cloud storage bucket instead of tunneling through Discord. So in order to support this, DDrive needs an entirely new upload backend that can switch between the two upload methods, as sketched below.
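
A rough sketch of what that switch could look like (names and return values are purely illustrative, not a DDrive API):

  const CLOUDFLARE_LIMIT = 100 * 1024 * 1024

  // Decide which upload path a chunk should take.
  function chooseUploadMethod({ chunkSize, hasNitroUserToken }) {
      if (chunkSize > CLOUDFLARE_LIMIT) {
          if (!hasNitroUserToken) {
              throw new Error('Chunks above 100 MB need the Nitro attachment flow')
          }
          return 'nitro-attachments' // direct upload to the Google Cloud bucket
      }
      return 'webhook' // current DDrive path, proxied through Discord/Cloudflare
  }

  // e.g. chooseUploadMethod({ chunkSize: 400000000, hasNitroUserToken: true }) === 'nitro-attachments'
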

Here is a description of the Nitro upload sequence:

1. Send a POST request to https://discord.com/api/v9/channels/{channel-id}/attachments. The request needs to be authorized with a Nitro user token and contain an application/json payload listing every file you want to upload. An example payload:

  [
    {
      "filename": "{file-name}",
      "file_size": {file-size in bytes},
      "id": "{id of the upload}"
    }
  ]

2. You will then receive the information required for the upload to the Google Cloud Storage bucket, looking something like this:

  [
    {
      "id": "{id of the upload}",
      "upload_url": "https://discord-attachments-uploads-prd.storage.googleapis.com/{file-token}?upload_id={access-token}",
      "upload_filename": "{file-name with path on google cloud}"
    }
  ]

3. Finally, send a PUT request with the file's content to the upload_url.
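
Putting the three steps together, a minimal sketch (assuming Node 18+ global fetch, that the raw user token goes in the Authorization header, and that the request/response shapes are exactly as described above; the follow-up step of sending a message that references upload_filename is not shown):

  async function nitroUploadChunk({ channelId, userToken, filename, chunkBuffer }) {
      // 1) Ask Discord for a pre-signed Google Cloud Storage upload URL.
      const res = await fetch(`https://discord.com/api/v9/channels/${channelId}/attachments`, {
          method: 'POST',
          headers: {
              Authorization: userToken,
              'Content-Type': 'application/json',
          },
          // Payload shape as described above; the real API may wrap this array in an object.
          body: JSON.stringify([{ filename, file_size: chunkBuffer.length, id: '1' }]),
      })
      if (!res.ok) throw new Error(`attachments request failed: ${res.status}`)
      const [attachment] = await res.json()
      const { upload_url: uploadUrl, upload_filename: uploadFilename } = attachment

      // 2) PUT the chunk's bytes straight to the bucket, bypassing Cloudflare's 100 MB cap.
      const put = await fetch(uploadUrl, { method: 'PUT', body: chunkBuffer })
      if (!put.ok) throw new Error(`bucket upload failed: ${put.status}`)

      // 3) upload_filename is what a later message/attachment record would reference.
      return uploadFilename
  }
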