Open · DavidArchibald opened this issue 9 months ago
Thanks for the report! I obviously couldn't test every single commit before publishing it 😄
Just to write down my thoughts: the unlink is a red herring; that path only deletes outputs if another error happens. I suspect the compiler error is the actual problem.
How to fix it... usually this means the code didn't build at that specific point in time, even though I make sure to use a lockfile or npm install --before. So, I'm not totally sure what the fix will be until I play with it.
> I think I have the general gist of the problem figured out but I'd be unsure how to fix it.

Just to clarify, do you have some other thoughts here? Anything helps, unless that's not what you meant.
Well I peeked into the source code:
This is the line it errored on: https://github.com/microsoft/TypeScript/blob/3d8cf62846c8bda828675239ddd9d4bd33e53d51/src/server/typingsInstaller/nodeTypingsInstaller.ts#L125
This was its @types/node version:
https://github.com/microsoft/TypeScript/blob/3d8cf62846c8bda828675239ddd9d4bd33e53d51/package.json#L47
Now maybe you have some more machinery in place than just trusting the package.json, but there's no package-lock.json AND @types/node has the pretty useless version of latest.
> Now maybe you have some more machinery in place than just trusting the package.json, but there's no package-lock.json AND @types/node has the pretty useless version of latest.
I do, in that when there's no lockfile present, I grab the commit timestamp and then run npm install --before <timestamp>, which asks npm to install only packages that would have been available at that timestamp. But broken commits do happen, so I can try to create a manual workaround.
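Roughly, the no-lockfile path boils down to something like this (a simplified sketch, not the actual every-ts code):

```bash
# Simplified sketch of the no-lockfile fallback: take the commit's committer
# date and ask npm to resolve only packages published on or before that date.
rev=3d8cf62846c8bda828675239ddd9d4bd33e53d51
timestamp="$(git show -s --format=%cI "$rev")"   # committer date, ISO 8601
git checkout "$rev"
npm install --before="$timestamp"
```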
Ahhh, I see you mentioned npm install --before already. My apologies, I hadn't looked it up before posting my reply, so the purpose of the --before flag went over my head.
Testing this specific commit isn't that important to me; you could always maintain a list of non-working commits if you want? They'd of course have to be excluded from bisects, but it'd be easier than making literally every single commit work. I only stumbled onto this commit through bad luck; it just happened to be picked during my bisect.
I would say one improvement is that I had to reinstall every-ts, because it seemed like every-ts reset and every-ts switch main weren't enough to get it working again; it kept trying to build 3d8cf62846c8bda828675239ddd9d4bd33e53d51 even when I did every-ts bisect reset. But maybe that's at least partially user error? I just hadn't expected those commands to build the current version again, so the easiest fix was to just reinstall.
Also, I'd be willing to spin up a server to test every revision if you want. It could take several days to run to completion (I'd have to start the process of testing every revision before truly knowing an estimate), but I believe all I'd have to do is git rev-list plus every-ts switch $revision, so it'd be pretty cheap to run on the cloud and get an exhaustive list.
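Something like this is what I'm picturing (the branch name, log layout, and output files are just guesses on my part):

```bash
# Rough sketch: walk every revision, try to build it with every-ts, and record
# the ones that fail. Assumes a local TypeScript checkout for the rev-list
# part; the file names here are made up for illustration.
mkdir -p logs
git rev-list main > all-revs.txt
while read -r rev; do
    every-ts switch "$rev" > "logs/$rev.log" 2>&1 || echo "$rev" >> broken-commits.txt
done < all-revs.txt
```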
> Testing this specific commit isn't that important to me; you could always maintain a list of non-working commits if you want? They'd of course have to be excluded from bisects, but it'd be easier than making literally every single commit work.
If one commit fails, it's likely that other commits within the same timespan also fail, so I would try and aim for a more general fix. But, if it comes to it, I can of course figure out which range of commits need a specific version of the node types and check that when building.
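If it comes to that, the per-range check could look roughly like this; the boundary commits and the pinned @types/node version below are placeholders, not real data:

```bash
# Hypothetical range check while building: if the checked-out commit lies
# between two boundary commits, pin @types/node instead of whatever "latest"
# resolves to. FIRST_BAD, FIRST_FIXED, and the pinned version are placeholders.
FIRST_BAD=...        # first commit that needs the pin
FIRST_FIXED=...      # first commit that no longer needs it
rev="$(git rev-parse HEAD)"
if git merge-base --is-ancestor "$FIRST_BAD" "$rev" &&
   ! git merge-base --is-ancestor "$FIRST_FIXED" "$rev"; then
    npm install --no-save @types/node@14    # exact version is an assumption
fi
```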
> Also, I'd be willing to spin up a server to test every revision if you want. It could take several days to run to completion (I'd have to start the process of testing every revision before truly knowing an estimate), but I believe all I'd have to do is git rev-list plus every-ts switch $revision, so it'd be pretty cheap to run on the cloud and get an exhaustive list.
That would be super interesting, but there are 35k+ commits in the repo... That's like 25 straight days of nonstop building, assuming each takes a minute. I don't want to impose that cost on you, of course. Then again, the builds are mostly single-threaded, so I bet one could install multiple instances of every-ts and run one per core... Sounds fun 😄
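Something like this could fan the work out across cores (sketch only; it assumes one independent every-ts / TypeScript checkout already set up per worker directory so the single-threaded builds don't collide):

```bash
# Split the revision list into one chunk per core and run the chunks in
# parallel, each inside its own worker directory. The directory layout and
# file names are assumptions for illustration.
git rev-list main > all-revs.txt
split -n l/"$(nproc)" all-revs.txt chunk-
for chunk in chunk-*; do
    (
        cd "worker-$chunk"          # assumed: separate checkout per worker
        while read -r rev; do
            every-ts switch "$rev" || echo "$rev" >> "../broken-$chunk.txt"
        done < "../$chunk"
    ) &
done
wait
```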
I've started processing the commits. This'll take several days to weeks at the current rate I'm seeing but it shouldn't cost me anything so don't worry about that.
Update: I'm on revision 20,000 out of 35,524.
Currently 872 revisions have failed to build using every-ts, 78 of which seem to hang forever.
I'm not surprised that so many don't work; the repo only started squash-merging a few years ago.
Alright! It's done processing! The tail end was certainly quicker than the beginning.
Here's some information I found interesting:
I still have the logs and timing information for every individual commit but I haven't uploaded those because they're a lot larger and mostly boring. I'd of course be willing to share them if you'd find that useful. I'm also willing to retest the 2,270 commits iteratively until you bring the number of broken commits down. HOWEVER I fully understand how much manual labor it could take to deal with these 2,270 commits.
A simple approach to making every-ts more durable against these failing commits could be to blacklist all of them: stop a user from manually switching to these commits, and then during every-ts bisect use git bisect skip with the range notation to tell it to skip the commits that don't build.
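Roughly what I'm imagining on the bisect side (the revs, list file name, and range below are all placeholders):

```bash
# Pre-skip every known-broken commit so the bisect never lands on one.
# broken-commits.txt (one sha per line) and all revisions here are placeholders.
BAD_REV=...          # revision being bisected from
GOOD_REV=...         # known-good revision
git bisect start "$BAD_REV" "$GOOD_REV"
xargs git bisect skip < broken-commits.txt        # skip individual commits
git bisect skip "$FIRST_BROKEN".."$LAST_BROKEN"   # or skip a whole range at once
```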
Awesome! I really appreciate this data. I have some hope that I can fix some of these, but I suspect a bunch are just WIP commits from long ago (which a bisect with --first-parent would skip).
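For reference, the flag I'm thinking of (it needs a reasonably recent Git, and the revs are placeholders):

```bash
# With --first-parent, the bisect only visits commits on the first-parent
# chain (the merge/squash commits on main), never the WIP commits inside
# merged branches.
git bisect start --first-parent "$BAD_REV" "$GOOD_REV"
```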
I can certainly include a list of all broken commits and error if any are checked out, though I'd like to take a crack at them regardless.
> I still have the logs and timing information for every individual commit but I haven't uploaded those because they're a lot larger and mostly boring. I'd of course be willing to share them if you'd find that useful.
I actually think they might be useful; if I'm going to fix certain classes of historical errors, it'd be nice to get them in buckets. E.g. the original report breaks due to:
src/server/typingsInstaller/nodeTypingsInstaller.ts(125,24): error TS2345: Argument of type '"message"' is not assignable to parameter of type 'Signals'.
So this is a place where the repo was actually broken, and I could apply a patch for this kind of error by any-ing something out. But I don't know how many commits have this same error, so I can't tell whether it's worth it.
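If you do end up sharing them, something as simple as this would give me the buckets (it assumes one log file per failing commit, which is just a guess about your layout):

```bash
# Count how many failing commits hit each TypeScript error code.
grep -hoE 'error TS[0-9]+' logs/*.log | sort | uniq -c | sort -rn
```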
I also checked some of the ones that hang, and the first one I checked (34b9c4dbad569a21227e8761f785630c6052a855) didn't; but it did have a compiler error showing that it was sensitive to types installed within the node_modules dir where every-ts was, which is another reason to do #1.
Hey! Thanks for the great tool! Saved me plenty of time already finding which PRs break my code.
Anyways, I was running a bisect and I got this error
Unable to build typescript at rev 3d8cf62846c8bda828675239ddd9d4bd33e53d51; please file a bug!
Just in case something funky was happening with how I'd set up my bisect, I stopped my bisect and then ran every-ts switch 3d8cf62846c8bda828675239ddd9d4bd33e53d51 and sure enough I got the error again. Here's the full logs:
I think I have the general gist of the problem figured out but I'd be unsure how to fix it. Hopefully this is enough info for you!