n1ru4l / envelop

Envelop is a lightweight library that allows developers to easily develop, share, collaborate on and extend their GraphQL execution layer. Envelop is the missing GraphQL plugin system.
https://envelop.dev
MIT License

[useResponseCache] cached Async-Iterable result breaks clients #2236

Open santino opened 2 months ago

santino commented 2 months ago


Description

Caching async-iterable results is extremely useful, but GraphQL clients expect queries using @defer and @stream to be executed in streaming mode. Instead, the implementation of this plugin delivers the whole cached result within a single response.

This entirely breaks Relay, because it doesn't look within the initial response for any fragment marked as @defer; instead, it keeps waiting for subsequent parts to deliver the deferred fragments.

Additional information on Relay's behaviour is available in this GitHub issue.
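
For illustration (a hypothetical example, not taken from the plugin's actual output), here is the difference between the sequence a streaming client expects for a @defer query and what the cached replay currently delivers:

// Illustrative payload shapes only (not actual plugin output).
// A live execution of a query using @defer yields a sequence of payloads:
const streamedPayloads = [
  { data: { account: { id: '1' } }, hasNext: true },
  { incremental: [{ path: ['account'], data: { events: [] } }], hasNext: false }
]

// The cached replay instead delivers one merged object with no `hasNext`,
// so Relay keeps waiting for deferred fragments that never arrive:
const cachedPayload = { data: { account: { id: '1', events: [] } } }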

Workaround

A workaround is to add an extension to the response to let clients know that the operation is complete. In the case of Relay this must be extensions: { is_final: true }. The casing looks odd, so it is possible that other clients expect isFinal instead.

This workaround can be implemented just by adding the following line of code before line 615:

result.extensions = { ...extensions, is_final: true };

With this extension Relay does process all the fragments available in the response, even though it is not delivered in streaming mode. I hope you will consider accepting this fix quickly, since this is my last blocker for migrating from Helix to Yoga.
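
As a minimal sketch of the idea (the function name and surrounding names are illustrative, not the plugin's actual internals around line 615): given the single merged object rebuilt from a cached async-iterable execution, the marker is attached to its extensions before the result is handed back.

// Illustrative only, not the plugin's actual code:
function markCachedIncrementalResultAsFinal (cachedResult) {
  const { extensions } = cachedResult
  return {
    ...cachedResult,
    // The proposed workaround: tell clients like Relay the payload is complete.
    extensions: { ...extensions, is_final: true }
  }
}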

Proper solution

Even with the workaround above, Relay still issues the following warning:

Warning: RelayModernEnvironment: Operation `AccountEventsNewPage_Query` contains @defer/@stream directives but was executed in non-streaming mode. See https://fburl.com/relay-incremental-delivery-non-streaming-warning.

This tells me that GraphQL clients expect streaming mode. With this in mind, the only proper way to solve this problem is to cache the response in parts, deliver the first part straight away, and then yield the following parts sequentially.
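
A rough sketch of that idea (illustrative names, not the plugin's actual code): the parts captured while consuming the original async-iterable result would be replayed as an async iterable on a cache hit.

// Illustrative sketch: `cachedParts` is assumed to be the array of payloads
// captured while the original async-iterable result was being consumed.
async function * replayCachedParts (cachedParts) {
  for (const part of cachedParts) {
    // Each part keeps its original shape ({ data | incremental, hasNext, ... }),
    // so the client sees the same sequence a live execution would produce.
    yield part
  }
}

// On a cache hit the plugin would then return an async iterable again, e.g.:
// setResult(replayCachedParts(cacheEntry.parts))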

This will take longer to implement, so the workaround is still valid to avoid breaking clients.

santino commented 2 months ago

Spent more time on this. The workaround I proposed is the only thing that really works. To summarise, it just consists of adding the following line of code before line 615 in the plugin:

result.extensions = { ...extensions, is_final: true };

I also tried adding an additional plugin to inject the required is_final property, like below

import { isAsyncIterable } from '@envelop/core'

export const useAsyncIteratorIsFinal = () => {
  return {
    onResultProcess ({ result, setResult }) {
      // Only touch single (non-incremental) results that were served from the
      // response cache.
      if (
        result[Symbol.for('servedFromResponseCache')] &&
        !Array.isArray(result) &&
        !isAsyncIterable(result) &&
        !Object.hasOwn(result, 'hasNext')
      ) {
        result.extensions = {
          ...result.extensions,
          is_final: true
        }
        setResult(result)
      }
    }
  }
}

This doesn't work because it still affects standard queries that are not meant to be executed in streaming mode (hence not using @defer / @stream). The is_final property should be added only to those cached results that were actually async iterators; that is why the fix placed before line 615 works, as it addresses only those cases.

Finally, I can address this directly within Relay's Network Layer, just by changing the behaviour of the next function on the Observable, like below:

next: data => data?.incremental
  // Forward each incremental payload to Relay individually.
  ? data?.incremental.forEach(part => sink.next(part))
  // For plain payloads without `hasNext`, mark the result as final so Relay
  // stops waiting for deferred fragments.
  : sink.next(Object.hasOwn(data, 'hasNext') ? data : { ...data, extensions: { ...data.extensions, is_final: true } }),

This seems to work, but I need to spend more time on additional tests.
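
For context, a minimal sketch of where that next handler would sit in a Relay network layer; the streamingFetch helper is purely hypothetical (a stand-in for whatever parses the multipart/incremental HTTP response), while the relay-runtime APIs are the real ones.

import { Environment, Network, Observable, RecordSource, Store } from 'relay-runtime'

const fetchFn = (operation, variables) =>
  Observable.create(sink => {
    // `streamingFetch` is a hypothetical helper that POSTs the operation and
    // emits one parsed payload per chunk of the incremental response.
    streamingFetch('/graphql', { query: operation.text, variables }).subscribe({
      next: data => data?.incremental
        ? data.incremental.forEach(part => sink.next(part))
        : sink.next(Object.hasOwn(data, 'hasNext') ? data : { ...data, extensions: { ...data.extensions, is_final: true } }),
      error: error => sink.error(error),
      complete: () => sink.complete()
    })
  })

export const environment = new Environment({
  network: Network.create(fetchFn),
  store: new Store(new RecordSource())
})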

It would probably unblock me; however, I believe the plugin should provide the solution from a server-side POV. Adding an is_final property seems like a good idea: it cannot produce weird behaviours, and it might provide useful information to GraphQL clients (like Relay). Potentially you could also make the key configurable, so one can easily choose is_final, isFinal or something else.
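
For instance, a configurable key could look something like the sketch below; the option name finalExtensionKey is purely hypothetical and does not exist in the plugin today.

import { useResponseCache } from '@envelop/response-cache'

const plugins = [
  useResponseCache({
    session: () => null,
    // Hypothetical option (not part of the plugin today): lets each consumer
    // pick the extension key their client expects, e.g. 'is_final' for Relay.
    finalExtensionKey: 'is_final'
  })
]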

EmrysMyrddin commented 1 month ago

Thank you for the report!

@ardatan What do you think? I don't like the idea of adding client-specific things like this, but given the very low impact, I think we can add it? And make it configurable as suggested?