dsherret / ts-morph

TypeScript Compiler API wrapper for static analysis and programmatic code changes.
https://ts-morph.com
MIT License
5k stars 195 forks

JavaScript heap out of memory when using findReferences() on many identifiers #642

Open nchanged opened 5 years ago

nchanged commented 5 years ago

Hi! So I've tried transforming the entire lodash-es library using ts-morph.

After around 50 modules (there are ~900 in total), JavaScript runs out of memory. Not to mention that it's very, very slow on my speedy Mac; I can only imagine what would happen on a slower machine on some CI in Docker.

Do you have any plans to optimise it, or any tips on making it faster? I need to find another solution, because the current speed isn't acceptable even for production builds where speed doesn't matter; I just didn't expect it to be THAT slow.

UPD: the slowest operation, which makes it literally 50x slower, is findReferences. UPD: Found the culprit: I was calling findReferences on every single Identifier instead of only on selected nodes. That's what killed it. I still have big hopes for this library xD
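For anyone hitting the same wall, the difference between the two approaches can be sketched like this (a hypothetical transform; the file path and the choice to filter by exported declarations are assumptions for illustration, not from this thread):

```typescript
import { Project, SyntaxKind } from "ts-morph";

const project = new Project({ tsConfigFilePath: "tsconfig.json" });
const sourceFile = project.getSourceFileOrThrow("src/example.ts"); // hypothetical file

// Slow: one full language-service reference search per identifier.
// A ~900-module library easily contains tens of thousands of identifiers,
// so this triggers tens of thousands of expensive compiler-api calls.
for (const id of sourceFile.getDescendantsOfKind(SyntaxKind.Identifier)) {
  const refs = id.findReferences();
  // ...inspect refs...
}

// Faster: only search references for the handful of nodes you actually
// care about, e.g. the exported declarations of each module.
// (A type guard for nodes that support reference finding is omitted for brevity.)
for (const [name, decls] of sourceFile.getExportedDeclarations()) {
  for (const decl of decls) {
    // findReferencesAsNodes() skips wrapping the full ReferencedSymbol objects.
    const refNodes = (decl as any).findReferencesAsNodes?.() ?? [];
    // ...inspect refNodes...
  }
}
```

The key point is that the cost scales with the number of `findReferences` calls, not the number of files, so restricting the search to the nodes you actually need turns an O(identifiers) workload into an O(exports) one.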

dsherret commented 5 years ago

@nchanged yeah, there are a lot of optimizations that can be done in the library, especially when it comes to manipulations. You can read about how to improve performance here: https://dsherret.github.io/ts-morph/manipulation/performance

Regarding #findReferences(): that method calls the compiler API and then wraps the returned objects. There's probably a lot of stuff it constructs internally, and eventually it runs out of memory. I've often wondered if I could re-implement the findReferences functionality on my own, because given the nature of this library I could probably make a few more optimizations than the compiler API is able to.

It would probably be a good thing for me to eventually go through and check why #findReferences() takes so long and uses so much memory. Perhaps there's a memory leak somewhere in the compiler API (or in my code), or something I could do to clean up memory. I'm going to reopen this issue for further investigation.