reaperdtme opened this issue 3 years ago
Thanks for opening this issue @aspectgfg!
Can I ask, how large are your type definitions? How long does booting take for you? This is something that hasn't come up before.
I see some complexity with sharing a schema in that the GraphQL library and the OGM interpret the typedefs differently (@auth and @private, for example).
Over 55k lines in GraphQL syntax
Total types after Neo4JSchema loaded: 95,119
Idle memory usage: 5.21 GB
// load gql files
typedef-files: 13.495ms
// create ogm
new-ogm: 1:04.274 (m:ss.mmm)
// create schema
new-neo4j-graphql: 1:20.772 (m:ss.mmm)
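(For anyone wanting to reproduce this kind of measurement: labels like the above map naturally onto console.time / console.timeEnd. The sketch below assumes a single schema file and placeholder connection details; it is not the reporter's actual setup.)

benchmark.ts
import { readFileSync } from "fs";
import neo4j from "neo4j-driver";
import { Neo4jGraphQL } from "@neo4j/graphql";
import { OGM } from "@neo4j/graphql-ogm";

async function main() {
  // Placeholder connection details
  const driver = neo4j.driver("bolt://localhost:7687", neo4j.auth.basic("neo4j", "password"));

  console.time("typedef-files");
  const typeDefs = readFileSync("./schema.graphql", "utf8"); // placeholder path
  console.timeEnd("typedef-files");

  console.time("new-ogm");
  const ogm = new OGM({ typeDefs, driver });
  await ogm.init();
  console.timeEnd("new-ogm");

  console.time("new-neo4j-graphql");
  const neoSchema = new Neo4jGraphQL({ typeDefs, driver });
  await neoSchema.getSchema();
  console.timeEnd("new-neo4j-graphql");

  await driver.close();
}

main();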
I see some complexity with sharing a schema in that the GraphQL library and the OGM interpret the typedefs differently (@auth and @private, for example).
That's a little odd. I'd expect if the directive has different behavior in each lib, they should be different directives.
@aspectgfg Thanks for the info! That seems really slow. I'll bring this to our internal tickets to see if we can speed things up and/or only parse the typedefs once.
That's a little odd. I'd expect if the directive has different behavior in each lib, they should be different directives.
The OGM ignores the @auth directive, and the GraphQL lib skips fields with @private on them.
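(To illustrate the difference with a hypothetical shared schema: the @auth rule below would be ignored by the OGM, while the @private field would be excluded from the public GraphQL schema but remain available through the OGM.)

typeDefs.ts
// Hypothetical shared type definitions, for illustration only
export const typeDefs = /* GraphQL */ `
  type User {
    id: ID!
    email: String!
    # Hidden from the public API, but still readable and writable via the OGM
    password: String! @private
  }

  extend type User @auth(rules: [{ isAuthenticated: true }])
`;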
@aspectgfg, whilst we could generate some super large type definitions to benchmark against ourselves, it would be much easier and much more realistic to use some real type definitions. If there's any way at all that you could give us your type definitions, then we could set up some benchmarking tests to help us improve performance and ensure that performance does not drop in the future. Fully understand if they are private for whatever reason!
@darrellwarde yeah here's our GQL Schema for the FHIR data model https://gist.github.com/aspectgfg/087612013e61eac4f7dffff1dc372a5e
This is FHIR: https://www.hl7.org/fhir/
@darrellwarde Being able to precompile the GraphQL schema would be phenomenal; we hit 95,000 types on the FHIR schema. It's pretty insane.
@aspectgfg @darrellwarde Precompiling the GraphQL schema could be a big improvement for serverless use, as it would greatly reduce cold start times.
With our small GQL schema of around 400 lines, it takes between 300 and 800 ms just to build the OGM and the augmented schema.
Hey @aspectgfg and @mathix420, thanks for the feedback, I understand your pain on the performance impacts of generating the schema on every server startup.
Out of interest, what would you imagine schema pre-compilation would look like, especially in the context of resolvers? I imagine you would not expect a GraphQLSchema, but rather a CLI tool which generates type definitions and resolvers as files which you can then import and pass into a server instance?
Interested to hear your ideas!
Hi @darrellwarde, nice to see your interest in this!
I was thinking basically the same as you.
Maybe a plugin for @graphql-codegen/cli, like these.
I am currently using babel-plugin-graphql-tag to compile my GraphQL schema for better performance; if you were able to do something similar with the Neo4j schema, it would be super neat!
To summarise a little: a CLI, or a plugin for Babel, Rollup, or webpack, capable of converting the Neo4j GraphQL schema into an AST. Even better if it could also generate OGM model typings for TypeScript projects (like Mongoose typings).
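(As a rough sketch of the build-time half of that idea, using only the plain graphql package and hypothetical file names: the SDL could be parsed once at build time and shipped as a pre-parsed AST, since typeDefs can be passed as a DocumentNode. Note, though, that per the timings earlier in this thread SDL parsing is only milliseconds of the startup cost, so the real win would require the augmented schema itself to be serialisable.)

build-typedefs.ts
// Build step: parse the SDL once and emit the AST as JSON
import { readFileSync, writeFileSync } from "fs";
import { parse } from "graphql";

const sdl = readFileSync("./schema.graphql", "utf8"); // placeholder path
const ast = parse(sdl, { noLocation: true });
writeFileSync("./typedefs.ast.json", JSON.stringify(ast));

server.ts
// Runtime: pass the pre-parsed AST straight to the library
import neo4j from "neo4j-driver";
import { Neo4jGraphQL } from "@neo4j/graphql";
import type { DocumentNode } from "graphql";
import ast from "./typedefs.ast.json";

const driver = neo4j.driver("bolt://localhost:7687", neo4j.auth.basic("neo4j", "password")); // placeholder

const neoSchema = new Neo4jGraphQL({
  typeDefs: ast as unknown as DocumentNode,
  driver,
});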
Have there been any more thoughts on this? I'm seeing 6 second cold starts on AWS Lambda now that I use the OGM in resolvers, about double the delay compared with using just the autogenerated resolvers.
Same here: a 6 second cold start that prevents auto-scaling (spawning a new instance adds 6 s of latency to the request that triggered it).
Neo4j is a powerful tool, but this problem, plus the fact that it can take 500 ms to a few seconds to establish a connection to Neo4j, means that we will likely move away from Neo4j.
This issue could also be addressed by fixing this one https://github.com/ardatan/graphql-tools/issues/4270
Executing makeExecutableSchema at build time would drastically reduce the start time of this library.
Bun's Macros could be the easiest solution to this problem, but unfortunately Bun does not currently support compiling circular structures (https://github.com/oven-sh/bun/issues/5271).
Once this feature is implemented, the only thing we would need to do is import our function that generates the schema and OGM using the macro syntax.
neo4j.ts
import { Neo4jGraphQL, Neo4jGraphQLConstructor } from "@neo4j/graphql";
import { OGM } from "@neo4j/graphql-ogm";
import type { GraphQLSchema } from "graphql";

// typeDefs, resolvers, scalars, and driver are assumed to be defined or imported elsewhere
const constructorArgs: Neo4jGraphQLConstructor = {
  resolvers: {
    ...resolvers,
    ...scalars,
  },
  typeDefs,
  driver,
};

const neoSchema = new Neo4jGraphQL(constructorArgs);
const ogm = new OGM({ typeDefs, driver });

export async function getSchemaOgm(): Promise<[GraphQLSchema, void]> {
  // Build the augmented schema and initialise the OGM in parallel
  return Promise.all([neoSchema.getSchema(), ogm.init()]);
}
index.ts
import { getSchemaOgm } from "./neo4j" with { type: 'macro' };
Then run bun build index.ts and that would be fixed!
On our server, we're using both the OGM and Neo4JSchema. Our GQL schema is extremely large, and Neo4JSchema/OGM are impacting boot times. If we could share a single Neo4JSchema, our boot times would be reduced drastically. Please allow the OGM constructor to accept a Neo4JSchema.
Our init: