diced / zipline

A ShareX/file upload server that is easy to use, packed with features, and with an easy setup!
https://zipline.diced.sh/
MIT License

Bug: Fails to upload to Cloudflare R2 S3-compatible Storage #553

Closed jontybrook closed 4 months ago

jontybrook commented 4 months ago

What happened?

This is more of an FYI than a bug report, as I believe the solution lies on Cloudflare's end.

When attempting to use Cloudflare R2 for storage, uploads from the front-end UI seem to hang and never complete. Inspecting the logs, I see the following:

S3Error
    at parseError (/zipline/node_modules/minio/dist/main/internal/xml-parser.ts:26:13)
    at Object.parseResponseError (/zipline/node_modules/minio/dist/main/internal/xml-parser.ts:75:11)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at Client.makeRequestStreamAsync (/zipline/node_modules/minio/dist/main/internal/client.ts:629:19) {
  amzRequestid: undefined,
  amzId2: undefined,
  amzBucketRegion: undefined
}
2024-05-09 07:41:30,411 PM info  [server::response] POST /api/upload -> 500

I believe this is due to a compatibility issue between the minio SDK and Cloudflare R2. See the related post here: https://community.cloudflare.com/t/r2-multipart-uploads-not-working-with-minio-client/396536/17
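
If the multipart path is indeed what breaks, a quick way to isolate it outside Zipline is a probe like the sketch below. The endpoint, credentials, bucket and region are placeholders, and I am assuming that a small upload with an explicit size stays on the minio SDK's single-PUT path rather than multipart:

import * as Minio from 'minio';

// Illustrative placeholders only; this is not Zipline's code.
const client = new Minio.Client({
  endPoint: '<ACCOUNT_ID>.r2.cloudflarestorage.com',
  useSSL: true,
  accessKey: '<ACCESS_KEY_ID>',
  secretKey: '<ACCESS_KEY_SECRET>',
  region: '<REGION>', // e.g. 'auto'
});

// A tiny Buffer with a known size should avoid the multipart code path, so if
// this succeeds while Zipline's uploads fail, multipart is the likely culprit.
const body = Buffer.from('hello r2');
client.putObject('<BUCKET_NAME>', 'probe.txt', body, body.length)
  .then(() => console.log('single-part upload ok'))
  .catch((err) => console.error('upload failed:', err));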

Version

latest (ghcr.io/diced/zipline or ghcr.io/diced/zipline:latest)

What browser(s) are you seeing the problem on?

Chromium-based (Chrome, Edge, Brave, Opera, mobile chrome/chromium based, etc)

Zipline Logs

S3Error
    at parseError (/zipline/node_modules/minio/dist/main/internal/xml-parser.ts:26:13)
    at Object.parseResponseError (/zipline/node_modules/minio/dist/main/internal/xml-parser.ts:75:11)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at Client.makeRequestStreamAsync (/zipline/node_modules/minio/dist/main/internal/client.ts:629:19) {
  amzRequestid: undefined,
  amzId2: undefined,
  amzBucketRegion: undefined
}
2024-05-09 07:41:30,411 PM info  [server::response] POST /api/upload -> 500

Browser Logs

Failed to load resource: the server responded with a status of 500 ()
file:1 Uncaught SyntaxError: Unexpected token 'I', "Internal S"... is not valid JSON
    at JSON.parse (<anonymous>)
    at XMLHttpRequest.<anonymous> (file-97515cca25c3c76c.js:1:11944)

Additional Info

No response

wraithy commented 4 months ago

I use R2 on my instance of Zipline without any issues, so it's more than likely an issue with your config. Can you provide your config so that I can look for potential issues? Be sure to remove any sensitive credentials, such as your core secret.

diced commented 4 months ago

As said above, the issue is probably due to a misconfiguration, and it has also been 2 weeks with no response, so I will be closing this as stale.

billyriantono commented 1 month ago

Hey, sorry for reopening this issue. I got the same error.

[screenshot of the error]

Here is my environment:

      - DATASOURCE_S3_ENDPOINT=<r2_account_id>.r2.cloudflarestorage.com
      - DATASOURCE_S3_ACCESS_KEY_ID=<r2_token_id>
      - DATASOURCE_S3_SECRET_ACCESS_KEY=<r2_token>
      - DATASOURCE_S3_BUCKET=<bucket_name>
      - DATASOURCE_S3_PORT=443
      - DATASOURCE_S3_FORCE_S3_PATH=true
      - DATASOURCE_S3_REGION=us-east-1
      - DATASOURCE_TYPE=s3

Is it the correct one?

Thank you

wraithy commented 1 month ago


Set DATASOURCE_S3_REGION to ENAM and DATASOURCE_S3_FORCE_S3_PATH to false (this only needs to be true if you want to specify a folder in the bucket that the files get uploaded to). DATASOURCE_S3_PORT is not needed, and I have listed all of the required variables below.

- DATASOURCE_TYPE=s3
- DATASOURCE_S3_ACCESS_KEY_ID=<ACCESS_KEY_ID>
- DATASOURCE_S3_SECRET_ACCESS_KEY=<ACCESS_KEY_SECRET>
- DATASOURCE_S3_BUCKET=<BUCKET_NAME>
- DATASOURCE_S3_ENDPOINT=<ACCOUNT_ID>.r2.cloudflarestorage.com
- DATASOURCE_S3_REGION=ENAM
- DATASOURCE_S3_FORCE_S3_PATH=false
- DATASOURCE_S3_USE_SSL=true

If you are still having issues after modifying your config with the changes mentioned above, feel free to reply here.
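
For reference, a rough sketch of how those variables could map onto a minio client (option names follow the minio JS SDK; the placeholders and the bucketExists probe are purely illustrative, not Zipline's actual code):

import * as Minio from 'minio';

// Construct the client the way the variables above suggest.
const client = new Minio.Client({
  endPoint: '<ACCOUNT_ID>.r2.cloudflarestorage.com', // DATASOURCE_S3_ENDPOINT
  useSSL: true,                                      // DATASOURCE_S3_USE_SSL
  accessKey: '<ACCESS_KEY_ID>',                      // DATASOURCE_S3_ACCESS_KEY_ID
  secretKey: '<ACCESS_KEY_SECRET>',                  // DATASOURCE_S3_SECRET_ACCESS_KEY
  region: 'ENAM',                                    // DATASOURCE_S3_REGION
  pathStyle: false,                                  // DATASOURCE_S3_FORCE_S3_PATH
});

// Quick connectivity probe: if this resolves to true, the endpoint, credentials
// and region are good enough for uploads to go through.
client.bucketExists('<BUCKET_NAME>')
  .then((exists) => console.log('bucket reachable:', exists))
  .catch((err) => console.error('R2 connection failed:', err));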

billyriantono commented 1 month ago

It works smoothly.

Thanks @wraithy for the help :D