**Closed** · rodrigopavezi closed this pull request 2 weeks ago
This pull request introduces several changes aimed at improving the performance and efficiency of the GraphQL client and its associated queries. Key changes include a custom fetch function with caching, adjusted refetching logic in hooks to prevent unnecessary polling, and `@cached` directives added to several GraphQL queries. Together, these updates optimize data fetching and reduce network calls.
| Files | Change Summary |
|---|---|
| `src/lib/graphQlClient.ts` | Added a custom fetch function that incorporates caching with a revalidation period of 60 seconds. |
| `src/lib/hooks/use-latest-payments.tsx` | Modified `refetchInterval` to disable refetching when `pollInterval` is zero. |
| `src/lib/hooks/use-latest-requests.tsx` | Modified `refetchInterval` to disable refetching when `pollInterval` is zero. |
| `src/lib/queries/address-transactions.ts` | Added the `@cached` directive to `ADDRESS_TRANSACTIONS_QUERY` to enable caching. |
| `src/lib/queries/channel.ts` | Added the `@cached` directive to `CHANNEL_QUERY` to enable caching. |
| `src/lib/queries/request-payments.ts` | Added the `@cached` directive to `REQUEST_PAYMENTS_QUERY` to enable caching. |
src/lib/hooks/use-latest-payments.tsx (1)

`43-43`: **LGTM!** Setting `refetchInterval` to `false` when `pollInterval` is explicitly 0 prevents unnecessary polling and avoids redundant network calls.

src/lib/hooks/use-latest-requests.tsx (1)

`48-48`: **LGTM!** Same optimization as above: `refetchInterval` is set to `false` when `pollInterval` is 0, preventing unnecessary polling.

src/lib/queries/request-payments.ts (1)

`12-12`: **LGTM, but verify the impact of caching.** Adding the `@cached` directive to `RequestPaymentsQuery` is a sound optimization: caching reduces how often the query is executed against the backend, improving response times and lowering server load. To verify the impact, consider the following:

1. Compare query execution logs or metrics with and without caching; expect fewer executions with caching enabled.
2. Measure response times with and without caching; expect improvements, especially for repeated queries with the same parameters.
3. Monitor server load and resource utilization before and after enabling caching; expect a reduction, since redundant executions are avoided.
4. Validate that cached results stay consistent with the underlying data, and that the cache is updated when that data changes.
5. Test the application under cache hits, cache misses, and cache invalidation, verifying each scenario is handled gracefully.

These checks help ensure the directive achieves the intended optimization without compromising data accuracy or application functionality.
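The `refetchInterval` change reviewed in the two hooks above boils down to a small conditional. A minimal sketch (the helper name `toRefetchInterval` is illustrative; the option shape follows React Query's `useQuery`, where `false` disables refetching):

```typescript
// Sketch of the polling option passed to React Query's useQuery, assuming
// `pollInterval` is a duration in milliseconds and 0 means "do not poll".
function toRefetchInterval(pollInterval: number): number | false {
  return pollInterval === 0 ? false : pollInterval;
}

// e.g. inside use-latest-payments.tsx (illustrative):
// useQuery({ queryKey, queryFn, refetchInterval: toRefetchInterval(pollInterval) });
```

Without this guard, a `pollInterval` of 0 could be passed straight through, rather than explicitly disabling polling.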
**Change:**

Note: I could not add the `@cached` directive to the payments queries because it causes the requests to intermittently return 502 errors. I have asked the Hasura team on Discord to see what they say about it.
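For context, Hasura's `@cached` directive is applied at the query root; the queries in the change summary were presumably updated along these lines (the selection set below is illustrative only; the real queries live in `src/lib/queries/*`):

```graphql
# Illustrative shape only, not the actual query.
query RequestPaymentsQuery @cached {
  # Hasura's @cached also accepts a TTL argument, e.g. @cached(ttl: 60),
  # to control how long a response is served from cache.
  payments {
    id
    amount
  }
}
```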
**Summary by CodeRabbit**

- **New Features**
  - Added the `@cached` directive to multiple GraphQL queries, improving response times by enabling result caching.
- **Bug Fixes**