fern-api / fern

Input OpenAPI. Output SDKs and Docs.
https://buildwithfern.com
Apache License 2.0

[Bug] CLI crashes with `RangeError: Invalid string length` when generating SDKs from large specs #4639

Closed · mstade closed this issue 2 weeks ago

mstade commented 2 weeks ago

Describe the bug

When trying to generate an SDK from a very large spec definition, the generator crashes with `RangeError: Invalid string length` when run with `--log-level=debug`.

To reproduce

Steps to reproduce the behavior:

  1. Create a very large spec definition using thousands of models
  2. Generate an SDK of any language
  3. Verify that it fails with `RangeError: Invalid string length` after the `Migrated IR` step, as reported by `--log-level=debug`

If you're not able to make it fail, your spec isn't large enough. Unfortunately I'm not able to share our spec due to confidentiality. I may be able to demonstrate the issue in a screen share, but I'll have to check with management.

Expected behavior

I expect generation to work regardless of the size of the spec.

Screenshots

n/a

CLI Version

0.41.12

Additional context

I believe this is caused by string size limitations in the Node runtime; more context is available here: https://github.com/nodejs/node/issues/35973

The error message is not very helpful: what it really means is that the runtime cannot materialize the entire serialized IR as a single string. From reverse engineering the minified bundle, I believe the root cause is here: https://github.com/fern-api/fern/blob/27fb37831c91a466bb75e952b8c3a886f2efeb64/packages/cli/generation/local-generation/local-workspace-runner/src/runGenerator.ts#L160

I was able to get past this step with a dirty workaround patched into the minified bundle: use `Object.entries` to stringify each top-level key's value separately, then assemble the JSON string from the pieces. Essentially I did this:

// Stringify each top-level value on its own, so no single
// JSON.stringify call has to serialize the whole IR at once.
const stringified = `{ ${Object.entries(e)
    .map(([key, value]) => `"${key}": ${JSON.stringify(value)}`)
    .join(', ')} }`
await writeFile(absolutePathToIr, stringified);

This does seem to have worked, but I fear it's a band-aid at best: at some point one of those root-level keys' values may itself grow too big, causing the error again. A more solid fix would be something like a streaming JSON writer, if such a thing exists.

dsinghvi commented 2 weeks ago

Should be fixed in the latest version of the CLI, thanks @mstade for the contribution