twentyhq / twenty

Building a modern alternative to Salesforce, powered by the community.
https://twenty.com
GNU Affero General Public License v3.0

I can run intensive mutations without getting an error #6025

Closed Weiko closed 3 days ago

Weiko commented 1 month ago

Scope & Context

Currently we have a limit on the API, MUTATION_MAXIMUM_RECORD_AFFECTED, defined as an env variable, which specifies the maximum number of records that can be affected by a single GraphQL mutation (specifically update and delete mutations; create mutations are not affected, for some reason). This comes from a pg_graphql limitation; see "atMost" in the documentation at https://supabase.github.io/pg_graphql/api. If the variable is not set, it falls back to the default value (60). Such a cap is also good practice to avoid overloading the server, and in this case the DB. It also tells the user: "We handle this many records, so you can expect an acceptable timeframe for your request; anything more might result in degraded performance or timeouts."
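
For illustration, here is a minimal sketch of how that cap surfaces in a pg_graphql-style delete mutation through the "atMost" argument mentioned above. The collection name, filter shape, and variable names are assumptions for the example, not Twenty's actual schema or code.

    // Read the limit from the environment, falling back to the documented default of 60.
    const MUTATION_MAXIMUM_RECORD_AFFECTED = parseInt(
      process.env.MUTATION_MAXIMUM_RECORD_AFFECTED ?? '60',
      10,
    );

    // Hypothetical pg_graphql-style delete mutation: "atMost" caps how many rows
    // a single call may affect; exceeding it makes the call fail with an error.
    const deletePeopleMutation = `
      mutation DeletePeople($ids: [UUID!]) {
        deleteFromPeopleCollection(
          filter: { id: { in: $ids } }
          atMost: ${MUTATION_MAXIMUM_RECORD_AFFECTED}
        ) {
          affectedCount
        }
      }
    `;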

Current behavior

For example, when we try to delete more than {MUTATION_MAXIMUM_RECORD_AFFECTED} records, we get an error. That is fine from the API point of view, but the frontend should handle it properly. This is even more of an issue UX-wise when people check the box at the top of the table, which implies selecting all records so that the mutation affects all of them (even those that are not loaded yet).
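
As one possible illustration of what "handle this properly" could mean on the client, the sketch below catches the mutation error and shows a user-facing message instead of letting the raw GraphQL error surface. It assumes an Apollo-based client; deleteManyRecords and showSnackbar are hypothetical helpers, not existing Twenty functions.

    import { ApolloError } from '@apollo/client';

    // Hypothetical wrapper around a bulk delete: if the API rejects the call because
    // too many records would be affected, surface a readable message to the user.
    const deleteSelectedRecords = async (
      deleteManyRecords: (ids: string[]) => Promise<void>, // assumed mutation helper
      showSnackbar: (message: string) => void,             // assumed notification helper
      ids: string[],
    ) => {
      try {
        await deleteManyRecords(ids);
      } catch (error) {
        if (error instanceof ApolloError) {
          showSnackbar(
            `Could not delete ${ids.length} records: the API limits how many records one mutation may affect.`,
          );
          return;
        }
        throw error;
      }
    };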

Expected behavior

We still want to keep a limit on the API, but we should handle intensive mutations on the client with some strategies:

Technical inputs

greptile-apps[bot] commented 1 month ago

Implementation Steps

  1. Expose MUTATION_MAXIMUM_RECORD_AFFECTED in getClientConfig endpoint:

    • File: api/src/config/config.service.ts
    • Add MUTATION_MAXIMUM_RECORD_AFFECTED to the configuration object returned by getClientConfig.
      getClientConfig() {
        return {
          // ...existing client config keys
          // parseInt keeps the exposed value numeric; fall back to the default of 60.
          MUTATION_MAXIMUM_RECORD_AFFECTED: parseInt(
            process.env.MUTATION_MAXIMUM_RECORD_AFFECTED ?? '60',
            10,
          ),
        };
      }
  2. Modify frontend to handle batching of mutations:

    • File: frontend/src/hooks/useMutations.ts
    • Update the mutation logic to batch requests based on MUTATION_MAXIMUM_RECORD_AFFECTED.
      const batchMutations = async (records) => {
        // Chunk the records so that each request stays within the API limit.
        const limit = config.MUTATION_MAXIMUM_RECORD_AFFECTED;
        for (let i = 0; i < records.length; i += limit) {
          const batch = records.slice(i, i + limit);
          await mutateBatch(batch); // Assume mutateBatch is a function that handles the mutation
        }
      };
  3. Handle UI updates for batching and async jobs:

    • File: frontend/src/components/MutationProgress.tsx
    • Implement a progress bar or pagination view to reflect the mutation progress.
      // Renders completion as a percentage; ProgressBar is assumed to come from the existing UI kit.
      const MutationProgress = ({ total, completed }) => (
        <ProgressBar now={(completed / total) * 100} />
      );
    • Integrate the MutationProgress component in the relevant UI parts where mutations are triggered (see the usage sketch after this list).
  4. Handle async job initiation for intensive mutations:

    • File: frontend/src/hooks/useMutations.ts
    • Add logic to initiate an async job if the number of records exceeds a certain threshold.
      const handleIntensiveMutation = async (records) => {
        // Above the threshold, hand the work off to a server-side job; otherwise batch on the client.
        if (records.length > INTENSIVE_MUTATION_THRESHOLD) {
          await initiateAsyncJob(records); // Assume initiateAsyncJob is a function that starts the async job
        } else {
          await batchMutations(records);
        }
      };
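
As a follow-up to steps 2 and 3, here is a minimal usage sketch (hypothetical wiring, not Twenty's actual code) of how the batching loop could drive the MutationProgress component:

    import { useState } from 'react';

    // Hypothetical hook tying the batching loop (step 2) to the progress UI (step 3).
    // deleteBatch and the limit are assumed to be provided by the surrounding code.
    const useBatchedMutation = (
      deleteBatch: (ids: string[]) => Promise<void>,
      limit: number,
    ) => {
      const [total, setTotal] = useState(0);
      const [completed, setCompleted] = useState(0);

      const run = async (ids: string[]) => {
        setTotal(ids.length);
        setCompleted(0);
        for (let i = 0; i < ids.length; i += limit) {
          await deleteBatch(ids.slice(i, i + limit));
          setCompleted(Math.min(i + limit, ids.length));
        }
      };

      return { run, total, completed };
    };

    // <MutationProgress total={total} completed={completed} /> can then render the
    // progress while `run` walks through the batches.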

References

Weiko commented 1 month ago

See https://github.com/twentyhq/twenty/issues/6023 as a "temporary" solution.

Bonapara commented 3 weeks ago

Is #6039 fixing this issue? If not, could you add some tags, @Weiko? :)

Weiko commented 3 weeks ago

Only part of it; this should go with #5169

Weiko commented 3 weeks ago

@lucasbordeau is working on it

lucasbordeau commented 3 days ago

Closed by https://github.com/twentyhq/twenty/pull/5320