I've checked the GraphQL queries/fragments generated by gatsby-graphql-toolkit
and run them in the Craft GraphiQL playground, and they perform as expected: they return the correct number of entries, with no duplicates and all with correct ids/data.
Sounds like something @monachilada ran into a while back; he ended up losing some nodes altogether because entries were being dropped between the pages that gatsby-graphql-toolkit
fetches behind the scenes.
Can you try setting the orderBy parameter to "slug" using the sourcing params? That's what worked for him.
That's correct. The issue could be traced all the way down to the database level: the auto-incrementing indexes were all over the place for whatever reason, so the order of results would change seemingly randomly between queries. Enforcing an orderBy
parameter indeed fixed it for me.
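To illustrate the failure mode in isolation (a standalone sketch, not code from either plugin): when pages are fetched with limit/offset but the underlying order is not stable, the same entry can land on two pages while another never appears on any page.

// Standalone sketch: offset paging over an unstable order duplicates some ids
// and silently drops others, which is what later shows up as "missing" entries.
const rows = Array.from({ length: 10 }, (_, i) => ({ id: i + 1 }));

// Simulate "order changes between requests" by reshuffling before each page.
const fetchPage = (offset, limit) =>
  [...rows].sort(() => Math.random() - 0.5).slice(offset, offset + limit);

const seen = new Map();
for (let offset = 0; offset < rows.length; offset += 5) {
  for (const row of fetchPage(offset, 5)) {
    seen.set(row.id, (seen.get(row.id) || 0) + 1);
  }
}

const duplicated = [...seen.values()].filter((count) => count > 1).length;
console.log(`${seen.size} of ${rows.length} distinct ids fetched, ${duplicated} duplicated`);
// With a stable orderBy (e.g. "slug"), every id is fetched exactly once.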
Thanks for the replies, everyone. I adjusted my gatsby-config.js setup like so:
{
  resolve: "gatsby-source-craft",
  options: {
    sourcingParams: {
      reviews_reviews_Entry: {
        orderBy: '"slug"',
      },
    },
  },
},
(The quotes around the quotes caught me out at first; the value appears to be inserted verbatim into the generated query, so the inner quotes are what make it a GraphQL string literal.) The correct number of entries comes through now. Thanks for your help!
Description
From a total of 5,059 entries, only 2,993 are making it from Craft to Gatsby. I've tracked the issue down to duplicate nodes being created by
gatsby-graphql-source-toolkit. To the sum of exactly the number of missing entries, one to several remoteNodes with the same id/same fields are passed to the createNodes function, hence losing around 2,000 out of my total. The entries have unique ids/data in Craft, and I can't spot any rhyme or reason why some are overwritten by duplicates. I'm still trying to figure out exactly why this is happening, but the GraphiQL playground in the Craft admin seems to return all the correct entries, and the gatsby-source-craft plugin attempts to create 5,069 nodes when building the Gatsby GraphQL layer; it's just overwriting 2,000 of them.
Any guidance or help on my next steps to get to the root of the problem would be most appreciated. I'm currently investigating the fragments being built at the start of the process.
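For reference while digging, a hedged sketch (not part of either plugin; the "Craft_" type prefix, the remoteId field, and the use of onPostBootstrap are assumptions, so adjust the type name to whatever actually appears in the Gatsby schema) that counts how many review entries made it into the data layer:

// gatsby-node.js — count how many Craft review entries actually ended up in
// the Gatsby data layer after sourcing, to compare against the 5,059 in Craft.
exports.onPostBootstrap = ({ getNodesByType, reporter }) => {
  // Assumed type name: gatsby-source-craft prefixes remote types with "Craft_"
  // by default, so reviews_reviews_Entry would surface roughly as below.
  const nodes = getNodesByType("Craft_reviews_reviews_Entry");
  // gatsby-graphql-source-toolkit typically exposes the remote id as `remoteId`.
  const distinctRemoteIds = new Set(nodes.map((node) => node.remoteId));
  reporter.info(
    `Craft review entries sourced: ${nodes.length} nodes, ` +
      `${distinctRemoteIds.size} distinct remote ids`
  );
};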
Additional info