## Context

I needed to update a field on the `companies` collection by filtering on elements of the `group` collection. With the current `DataMigration` architecture, the target collection must be the one we want to update. In this case, that proved counter-productive: the join was too big and the script crashed.

## Sample script
```ts
import { MongoBulkDataMigration } from "@360-l/mongo-bulk-data-migration";
import { DataMigrationProcess } from "@backend/utils/env/load/script/datamigration";
import { $nex } from "@backend/utils/mongo/utils";
import type { Company } from "@backend/utils/mongo/definitions";

const MIGRATION_ID =
  "migration_20240527T110000Z-activateMagicLinkForNonCustomizedUrlCompanies";

const processHandler = new DataMigrationProcess({ name: MIGRATION_ID }, buildMigration);
void processHandler.handleMigration();

export function buildMigration() {
  return new MongoBulkDataMigration(
    // TODO
  );
}
```
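Since the architecture forces the target collection to be the one being updated, one way to sidestep the oversized join is to resolve the `group` filter up front and then update `companies` with the resulting ids, chunked so each `$in` clause stays small. The sketch below is an illustration of that two-step idea, not the actual migration: the field name `groupId`, the chunk size, and the helper names are all assumptions.

```typescript
// Sketch of a two-step workaround for a join that is too big to run in one pass.
// `groupId` and the chunk size of 1000 are illustrative assumptions.

// Split a large id list into bounded batches so each $in stays small.
export function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// Build the `companies` filter for one batch of group ids: a cheap $in
// instead of a cross-collection join inside the migration itself.
export function buildCompanyFilter(groupIds: string[]): { groupId: { $in: string[] } } {
  return { groupId: { $in: groupIds } };
}

// Usage sketch (driver calls elided; collection and query names are assumed):
//   const groupIds = await db.collection("groups").distinct("_id", groupQuery);
//   for (const batch of chunk(groupIds, 1000)) {
//     await db.collection("companies").updateMany(buildCompanyFilter(batch), update);
//   }
```

The trade-off is that the id list must fit in memory on the script side, which is usually far cheaper than the server-side join that crashed here.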