Previous conversation
https://discord.com/channels/807028371451936838/807028371451936841/917479264109076512
Context
I am using your read-only REST API token in my web app. It is potentially used by thousands of untrusted clients concurrently.
They all share the same token.
There is a risk that one of these clients goes rogue and starts hammering the API. This could lead to unexpected costs, and because the API is rate-limited at the database level it could also degrade service quality for the whole user base.
I have experienced this before with another backend-as-a-service provider that used a similar auth setup: multiple individual web clients went rogue and blasted the API, which impacted service quality and led to unexpected costs.
One way this could happen is, for example, through a rogue Chrome extension.
The feature request
One way to solve this could be to rate-limit/throttle/burst-protect at the client level.
Each client could, for example, be throttled to x requests per second.
Here is an example from AWS: https://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-request-throttling.html (doesn't need to be that fancy of course ;))
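To make the idea concrete, here is a minimal sketch of what per-client throttling could look like, assuming a token-bucket keyed by some per-client identifier (e.g. the client IP). The names and limits are illustrative assumptions, not an existing API:

```ts
// Hypothetical sketch: per-client token-bucket throttling.
// RATE_PER_SECOND and BURST are illustrative values, not real config.

interface Bucket {
  tokens: number;      // remaining request allowance
  lastRefill: number;  // timestamp of last refill, in ms
}

const RATE_PER_SECOND = 5;  // sustained requests per second per client
const BURST = 10;           // short bursts allowed above the sustained rate

const buckets = new Map<string, Bucket>();

// Returns true if the request from `clientId` should be allowed.
function allowRequest(clientId: string, now: number = Date.now()): boolean {
  const bucket = buckets.get(clientId) ?? { tokens: BURST, lastRefill: now };

  // Refill tokens proportionally to the time elapsed since the last request.
  const elapsedSeconds = (now - bucket.lastRefill) / 1000;
  bucket.tokens = Math.min(BURST, bucket.tokens + elapsedSeconds * RATE_PER_SECOND);
  bucket.lastRefill = now;

  if (bucket.tokens < 1) {
    buckets.set(clientId, bucket);
    return false; // throttled: respond with HTTP 429
  }

  bucket.tokens -= 1;
  buckets.set(clientId, bucket);
  return true;
}

// Usage example (hypothetical request handler):
// if (!allowRequest(request.ip)) respond(429, "Too Many Requests");
```

A rogue client would quickly exhaust its own bucket and get 429s, while all other clients sharing the same token keep their normal service quality.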
Thanks for being open to feedback!
Yanick