Edsol closed this issue 1 month ago.
I'll see what I can do about that, but since I'm already caching and reusing whatever is possible, it might be the case that there's nothing left to improve.
To limit the load, could a configuration parameter that excludes relationships or limits their depth be a temporary solution?
Yes, that's possible for the runtime. No idea how to implement the types for that case, though.
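For anyone curious about the type-level side mentioned here, one common TypeScript pattern for bounding recursive types is a tuple-indexed counter that decrements on each step. This is purely an illustrative sketch, not drizzle-graphql's actual types; `WithRelations`, `UserRow`, and `related` are made-up names.

```typescript
// A lookup tuple acting as a "decrement" table: Prev[2] = 1, Prev[1] = 0,
// Prev[0] = never. Indexing with never terminates the recursion.
type Prev = [never, 0, 1, 2, 3, 4, 5];

// A row plus optional nested relations; recursion stops once the
// depth counter bottoms out at never.
type WithRelations<Row, Depth extends Prev[number]> = [Depth] extends [never]
  ? Row
  : Row & { related?: WithRelations<Row, Prev[Depth]> };

// With Depth = 2, the type only permits a bounded number of nested levels:
type UserRow = { id: number };
const u: WithRelations<UserRow, 2> = {
  id: 1,
  related: { id: 2, related: { id: 3 } },
};
```

A deeper literal than the counter allows would fail to type-check, which is the point: the generated types stay finite even for circular relations.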
Actually, I might be able to optimize something here. I just noticed an awfully large number of inputs; I should be able to reduce that. Expect a patch soon.
Can you try it with v0.7.3? That should reduce memory usage a lot. If that's not enough for it to become usable on large schemas again, then I'll add a depth limiter.
Memory goes up less quickly, but after a couple of minutes of processing it still exceeds 20 GB.
A rise in memory usage was to be expected, but that's way too much. I see. I'll keep optimizing it as much as possible.
I've added a depth limiter for now; hopefully I'll be able to do something about it in the future.
The depth limit works; it may be the right compromise for rather large databases with many relationships.
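For context, the general idea behind such a depth limiter can be sketched as follows. This is not drizzle-graphql's actual implementation; `Table` and `buildRelationFields` are illustrative names. Recursion through mutually related tables simply stops once a maximum depth is reached, which keeps the number of generated nested fields bounded.

```typescript
// Minimal model of a table with relations to other tables.
interface Table {
  name: string;
  relations: Table[];
}

// Recursively generate nested relation fields, but stop at maxDepth
// so circular relations don't blow up the number of generated types.
function buildRelationFields(
  table: Table,
  maxDepth: number,
  depth = 0
): Record<string, unknown> {
  const fields: Record<string, unknown> = { __table: table.name };
  if (depth >= maxDepth) return fields; // depth limit reached: omit nested relations
  for (const rel of table.relations) {
    fields[rel.name] = buildRelationFields(rel, maxDepth, depth + 1);
  }
  return fields;
}

// Two mutually related tables would otherwise recurse forever:
const users: Table = { name: "users", relations: [] };
const posts: Table = { name: "posts", relations: [users] };
users.relations.push(posts);

const limited = buildRelationFields(users, 2);
```

With `maxDepth = 2`, `limited` contains `users → posts → users` and nothing deeper, which is why the limiter trades query depth for bounded memory.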
Thanks
> I've added depth limiter for now, hopefully I'll be able to do something about it in the future.
Is it possible to configure the depth?
EDIT:
Sorry, I saw there is a depth config:

```ts
const { entities } = buildSchema(schema, {
  relationsDepthLimit: 5
});
```
I am using the extension on some example databases (MySQL and Postgres) to develop my extension. I have big memory problems on version 0.7.x of the extension.
Large MySQL database info (via Apollo):
Small MySQL database info (via Apollo):
Large Postgres database info (via Apollo):
Small Postgres database info (via Apollo):
version 0.6.0, memory used:
- small MySQL schema: about 120 MB RAM
- large MySQL schema: about 900 MB RAM
- small Postgres schema: about 100 MB RAM
- large Postgres schema: about 2.50 GB RAM

version 0.7.2, memory used:
- small MySQL schema: about 110 MB RAM
- large MySQL schema (without relationships): about 700 MB RAM
- large MySQL schema: over 20 GB RAM (without finishing processing)
- small Postgres schema: about 100 MB RAM
- large Postgres schema (without relationships): over 1.70 GB RAM
- large Postgres schema: over 20 GB RAM (without finishing processing)