nuwave / lighthouse

A framework for serving GraphQL from Laravel
https://lighthouse-php.com
MIT License

Point based rate limiting #1745


stayallive commented 3 years ago

What problem does this feature proposal attempt to solve?

A clear, point-based rate limit that is fairer than a throttle-based one (where node x can only be retrieved once per time window), because it takes into account the number of (nested) queries that need to be executed; read on for more details.

Which possible solutions should be considered?

I have found that GitHub has a very nice approach to rate limiting, at least for queries. I will not try to explain it better than they do; instead, here is their excellent documentation on how the calculation is handled:

https://docs.github.com/en/graphql/overview/resource-limitations

This is the schema they use to expose rate limit information to the client; we could consider injecting it into the schema when the rate limiting features are enabled:

    type Query {
        "The client\'s rate limit information."
        rateLimit(
            "If true, calculate the cost for the query without evaluating it."
            dryRun: Boolean = false
        ): RateLimit
    }

    "Represents the client\'s rate limit."
    type RateLimit {
      "The point cost for the current query counting against the rate limit."
      cost: Int!

      "The maximum number of points the client is permitted to consume in a 60 minute window."
      limit: Int!

      "The maximum number of nodes this query may return."
      nodeCount: Int!

      "The number of points remaining in the current rate limit window."
      remaining: Int!

      "The time at which the current rate limit window resets in UTC epoch seconds."
      resetAt: DateTime!

      "The number of points used in the current rate limit window."
      used: Int!
    }

The nice thing is that I believe the current query complexity system can be re-used for a large part, or maybe it already provides enough information. It already calculates complexity based on the number of objects requested, and that value might be all we need for the cost of a query (we can still divide by 100 like GitHub does to keep the numbers smaller and more rounded).
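
As a rough sketch of that normalization (assuming the computed complexity is available as an integer; the helper name is hypothetical):

    // Hypothetical helper: map a raw complexity score to rate limit
    // points, mirroring GitHub's "divide by 100, minimum cost of 1".
    function costFromComplexity(int $complexity): int
    {
        return max(1, (int) ceil($complexity / 100));
    }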

I also believe we can lean heavily on the Laravel rate limiting features, or at least re-use a large part of their backbone, to actually implement the limiting.
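
For example, here is a minimal sketch of spending points from a per-user window through the Cache facade (Laravel's `RateLimiter::hit()` only increments by one per call, so this goes to the cache directly; the key format, the hourly window, and the 5000 point default borrowed from GitHub are all assumptions):

    use Illuminate\Support\Facades\Cache;

    // Minimal sketch: spend $cost points from a per-user hourly window.
    function spendPoints(string $userKey, int $cost, int $limit = 5000): bool
    {
        $key = "graphql-rate-limit:{$userKey}";

        // Start a fresh window if none exists, expiring after an hour.
        Cache::add($key, 0, now()->addHour());

        // Consume the points and check against the limit.
        $used = Cache::increment($key, $cost);

        return $used !== false && $used <= $limit;
    }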

Features to consider:

Things to research:


I did try to make a little start on this, but I do not have enough experience or time at the moment. I also believe Lighthouse does not currently have a feature that injects a query into the schema with its own internal resolver, so I got stuck there already... happy to discuss ideas and see if we can make some headway on this if there is interest in having it in core.
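
One possible way around that injection problem might be Lighthouse's `BuildSchemaString` event combined with the `@field` directive, something like the sketch below (the resolver class is hypothetical, and the `RateLimit` type from the snippet above would need to be added the same way):

    use Illuminate\Support\Facades\Event;
    use Nuwave\Lighthouse\Events\BuildSchemaString;

    // E.g. in a service provider: append extra SDL when the schema is built.
    Event::listen(BuildSchemaString::class, function (): string {
        return <<<'GRAPHQL'
        extend type Query {
            "The client's rate limit information."
            rateLimit(dryRun: Boolean = false): RateLimit
                @field(resolver: "App\\GraphQL\\Queries\\RateLimitQuery")
        }
        GRAPHQL;
    });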

spawnia commented 3 years ago

Great writeup!

In terms of implementation, we can probably use a validation rule for it that mirrors the built-in rules of webonyx/graphql-php. A good source of inspiration could be https://github.com/webonyx/graphql-php/blob/master/src/Validator/Rules/QueryComplexity.php
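
For instance, a stripped-down rule in that spirit could simply record a cost instead of rejecting the query (one point per field is a placeholder cost model; `QueryComplexity` weighs nesting and list fields properly):

    use GraphQL\Language\AST\NodeKind;
    use GraphQL\Validator\Rules\ValidationRule;
    use GraphQL\Validator\ValidationContext;

    final class QueryCost extends ValidationRule
    {
        /** Accumulated point cost of the validated query. */
        public int $cost = 0;

        public function getVisitor(ValidationContext $context): array
        {
            return [
                // Placeholder cost model: one point per selected field.
                NodeKind::FIELD => function (): void {
                    $this->cost++;
                },
            ];
        }
    }

Running such a rule alongside the default rules through `DocumentValidator::validate()` would yield the cost before the query is executed.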

stayallive commented 3 years ago

I just found another approach that is equally interesting. I want to add it to the discussion for future reference because it is possibly simpler to implement than the GitHub model.

Possibly a mix between the two is the "right" way for Lighthouse.

https://shopify.engineering/rate-limiting-graphql-apis-calculating-query-complexity

I think this is simpler because the underlying GraphQL lib already performs query complexity calculations, so that value might be usable 1:1.
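
To illustrate, a rough sketch of the Shopify-style weighting applied to the query AST (the argument names checked and the flat one point fallback are placeholders):

    use GraphQL\Language\AST\FieldNode;
    use GraphQL\Language\AST\IntValueNode;

    // Rough sketch: a connection field costs as many points as the
    // page size it requests, any other field costs a flat point.
    function fieldCost(FieldNode $field): int
    {
        foreach ($field->arguments as $argument) {
            $isPaginationArg = in_array($argument->name->value, ['first', 'last'], true);

            if ($isPaginationArg && $argument->value instanceof IntValueNode) {
                return (int) $argument->value->value;
            }
        }

        return 1;
    }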