ttreder-explorance opened this issue 2 months ago
Hi, thanks for the report. Our limit is 128 KiB (see docs), but for technical reasons it can be slightly lower than that in practice. Could you try the same with a limit of 122880 bytes (120 KiB) and see if you still get this error? I'll push to get a better estimate of what the practical limit is and update the docs if that works.
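For reference, a quick way to check the payload size on the client before sending is to count UTF-8 bytes rather than string characters (a minimal Node.js sketch; `payloadBytes` and the sample batch are illustrative, not part of the deepl-node API):

```ts
const LIMIT_BYTES = 122880; // 120 KiB, per the suggestion above

// String.length counts UTF-16 code units; Buffer.byteLength counts the actual
// UTF-8 bytes sent over the wire, which is what the server limit applies to.
function payloadBytes(texts: string[]): number {
  return texts.reduce((sum, t) => sum + Buffer.byteLength(t, 'utf8'), 0);
}

const batch = ['こんにちは世界', 'مرحبا بالعالم']; // 3 bytes per Japanese char, 2 per Arabic letter
console.log(payloadBytes(batch), payloadBytes(batch) > LIMIT_BYTES);
```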
Hi @JanEbbing,
I temporarily reduced my payload limit to 125000 bytes and so far it has been working for me. I'll reduce it even more to 122880 just to be sure while waiting for your estimate.
Thank you for the quick follow-up!
Great to hear that fixed it - I'll try to get an estimate and update the docs, then close this issue.
Guys, please, can the library handle all of these details? I have no desire to pay big bucks for DeepL and then also have to invent complicated logic that splits text into small chunks because of some limits, and then rebuild the responses one by one to get the full translated text. As a customer I want a simple API that takes text of any size; if you need to split it into multiple queries internally for whatever reason, be my guest.
As a client, I have only one desire: to call deepl.translate(...) with any text and use this package as simply as possible. Something like the sketch below would work.
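Until the library does this itself, a client-side wrapper along these lines could approximate that behavior (a minimal sketch assuming the deepl-node `Translator` API; `splitByBytes`, `translateLargeText`, and the 120 KiB cap are illustrative assumptions, and the splitter naively ignores single sentences that are individually oversized):

```ts
import * as deepl from 'deepl-node';

const MAX_BYTES = 120 * 1024; // assumed safe cap, below the documented 128 KiB limit

// Split text into chunks whose UTF-8 size stays under maxBytes,
// cutting on sentence-ending punctuation so chunks stay translatable.
function splitByBytes(text: string, maxBytes: number): string[] {
  const chunks: string[] = [];
  let current = '';
  // Lookbehind split keeps the punctuation and following whitespace intact,
  // so joining the chunks back with '' reconstructs the original text.
  for (const sentence of text.split(/(?<=[.!?。！？])/)) {
    const candidate = current + sentence;
    if (Buffer.byteLength(candidate, 'utf8') > maxBytes && current) {
      chunks.push(current);
      current = sentence;
    } else {
      current = candidate;
    }
  }
  if (current) chunks.push(current);
  return chunks;
}

async function translateLargeText(
  translator: deepl.Translator,
  text: string,
  targetLang: deepl.TargetLanguageCode,
): Promise<string> {
  const chunks = splitByBytes(text, MAX_BYTES);
  const results = await Promise.all(
    chunks.map((chunk) => translator.translateText(chunk, null, targetLang)),
  );
  // Stitch the per-chunk translations back into one string.
  return results.map((r) => r.text).join('');
}
```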
Describe the bug
When using the deepl-node library to translate batches of text in languages written with multi-byte characters (Japanese, Arabic, Chinese, etc.), we still get 413 errors despite making sure the payload stays under 130000 bytes.
To Reproduce
Steps to reproduce the behavior:
Expected behavior
The encoding/decoding of the texts should be consistent, avoiding any 413 errors.
Additional context
Code example to reproduce:
Note: using decodeURIComponent on the server might be the cause of the issue, as it does not behave the same way as URLSearchParams.
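For illustration, the size and decoding mismatches the note describes can be seen with Node built-ins (a sketch; how the server actually decodes the body is an assumption):

```ts
const text = '日本語のテキスト'; // 8 characters

console.log(text.length);                         // 8  (UTF-16 code units)
console.log(Buffer.byteLength(text, 'utf8'));     // 24 (UTF-8 bytes)
console.log(encodeURIComponent(text).length);     // 72 (percent-encoded form, 3x the byte count)

// URLSearchParams encodes spaces as '+', while encodeURIComponent uses '%20';
// decodeURIComponent does not turn '+' back into a space.
const params = new URLSearchParams({ text: 'a b' });
console.log(params.toString());                      // 'text=a+b'
console.log(decodeURIComponent(params.toString()));  // 'text=a+b' (the '+' survives)
```

So a client-side check against the raw string length or even the UTF-8 byte length can pass while the encoded request body still exceeds the server's limit.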