wycats opened this issue 8 years ago
This is great @wycats. As you pointed out, our immediate focus should be the first thing you mentioned: getting broad compatibility with as many packages as possible. Do you have a plan on how we should approach this? I was thinking we could take the most popular GitHub projects and sanity check that yarn will work with them.
You could write a script to test out all of these https://gist.github.com/anvaka/8e8fa57c7ee1350e3491#top-1000-most-depended-upon-packages
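Something like this would be a reasonable first pass. It's only a rough sketch, and it assumes the gist's list has been flattened into a `top-packages.txt` file with one package name per line:

```bash
#!/usr/bin/env bash
# Rough first-pass sweep: try `yarn add` for every package in the list and
# report which ones fail. Assumes top-packages.txt (one name per line) was
# extracted from the gist above.
set -u

while read -r pkg; do
  dir="$(mktemp -d)"
  (
    cd "$dir" &&
    yarn init --yes &&
    yarn add "$pkg"
  ) </dev/null >/dev/null 2>&1 && echo "OK   $pkg" || echo "FAIL $pkg"
  rm -rf "$dir"
done < top-packages.txt
```

Failures would still need manual triage (network flakes versus real incompatibilities), but it would give a baseline to track.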
> This is great @wycats. As you pointed out, our immediate focus should be the first thing you mentioned: getting broad compatibility with as many packages as possible. Do you have a plan on how we should approach this?
I think we probably need a multi-pronged approach, because there are a few different kinds of compatibility we're testing.
First of all, @thejameskyle, we should make sure that for the time being the website has a prominent place for people to report incompatible packages (it can just be a link to GitHub Issues) so that people know that we're serious about compatibility. We should also triage these reports quickly so people don't come to the repo, see a bunch of stale compatibility reports, and move on.
In terms of the packages themselves, I think we need a few different smoke tests:
- Can you `yarn add` the package to a new application? If not, which phase is the problem related to? (I'd wager that obscure assumptions in install scripts will ultimately make up most of these, but who knows.)
- Does the package work in `--flat` mode? If a package, on its own, doesn't work flat, it would probably be worth our time to help fix the dependency graph of that package so that it does work (but it's not necessarily an emergency; it would ultimately be nice if most people most of the time could use `--flat` or some variation of `--flat` without thinking about it).
- Do the package's `"scripts"` work after running `kpm install`? This will probably be hard to fully automate, since some scripts might have dependencies related to the workflow their developer uses (or machines their developer uses), but it shouldn't be too bad to whitelist a list of scripts per package we test (especially for the top-100). A rough sketch of what such a harness could look like is below.

I've created a tracking issue for projects that people think we should smoke test; please fill it in with projects you think we'd benefit from testing and tracking.
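To make the three checks concrete, here's a sketch of a per-package harness. The script name, arguments, and single-script whitelist are placeholders rather than an agreed interface, and I'm using the current `yarn` CLI rather than `kpm`:

```bash
#!/usr/bin/env bash
# smoke-test.sh <package-name> <path-to-local-clone-of-the-package>
# Placeholder harness for the three checks above; not an agreed interface.
set -u
pkg="$1"        # e.g. lodash
checkout="$2"   # local clone of the package's own repo, for the scripts check

# 1. Can a fresh application add the package?
app="$(mktemp -d)"
( cd "$app" && yarn init --yes && yarn add "$pkg" ) </dev/null >/dev/null 2>&1 \
  || echo "FAIL add     $pkg"

# 2. Does it also install in --flat mode?
( cd "$app" && yarn install --flat ) </dev/null >/dev/null 2>&1 \
  || echo "WARN flat    $pkg"

# 3. Do the package's own "scripts" still pass after installing with yarn?
#    (A real harness would keep a per-package whitelist; "test" is just the
#    obvious default here.)
( cd "$checkout" && yarn install && yarn run test ) </dev/null >/dev/null 2>&1 \
  || echo "WARN scripts $pkg"
```

Usage would be something like `./smoke-test.sh lodash ~/src/lodash`, with failures fed back into the tracking issue.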
Also, are there any good testing steps I missed here? I bet there are a few 😄
yarnpkg/website#16
I'm tackling this for `webpack` as we speak.
Our current local dev workflow to ensure deps are correctly installed and tests run is:

```bash
$ npm install && npm link && npm link webpack
```

I replaced this with:

```bash
$ yarn install && yarn link && yarn link webpack
```
And all of our tests have passed and install was nice and fast. 🍾
Would I then check our `yarn.lock` into source control if I am going to now check these changes in for our install scripts?
Yeah, that seems fine. And yes, please check in the `yarn.lock` file. See https://yarnpkg.com/en/docs/migrating-from-npm
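Assuming `yarn.lock` isn't matched by your `.gitignore`, that's just:

```bash
git add yarn.lock
git commit -m "Check in yarn.lock"
```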
Thank you very much!!! I've been hunting for this doc.
Re the original issue: I was thinking that overall we should let the community guide us and prioritize their issues over any theoretical 100% compatibility measure. People are filing bugs left and right, but most things are working fine out of the box.
Before we go public, I think it makes sense for us to get clear about what compatibility means to this project, so we can be clear to our users.
As a straw man, let me throw out these categories:
**Existing packages in the wild (npm, but also git repos etc.) install**
To me, this is the #1 compatibility risk. If people find that `yarn` is faster and more reliable, but there are common reports of packages not working, they'll be very unlikely to give it a try.

I've run into a number of people who say they've tried out various "faster npms" and gave up quickly because of the number of bug reports of packages not working. We'll assuredly have some of these, but I think we need to communicate that these are five-alarm fires for us, that we've already done a huge amount of compatibility work (showing examples would help here), and rapidly fix any bugs we find with in-the-wild packages.
**Running `yarn install` in the root of my app gives me a running app**

At a basic level, this needs to "just work". Most people are going to try out `yarn` by installing it and running it in their app, and then expecting the rest of their workflow to continue working as expected.

For vanilla `package.json`s with no `yarn` features, running `yarn install` should get people a directory that they can use as before, but faster and more reliably. To the extent that this is not true, I think we'll want to fix things very quickly.

However, there are likely to be minor differences in the workflow, especially once `yarn` features get into the mix. I think it's acceptable for us to nudge people some, as long as the error messages and output are very clear about what's happening.

A lot of people will end up using `yarn` because their colleague added it to a shared project, and we want to make sure that the workflow feels good to people who got `yarn` foisted on them.

I think the current philosophy of the project matches this requirement very well, and we should make sure that as bug reports roll in, that continues to be the case.
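To spell out that baseline with a made-up example: for an existing app with a vanilla `package.json`, the whole switch should amount to this:

```bash
# Hypothetical existing project; package.json has no yarn-specific features.
cd my-existing-app
yarn install    # stands in for `npm install`, should yield an equivalent node_modules
npm test        # the rest of the existing workflow keeps working unchanged
```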
**`yarn ${some npm command}` works the same as `npm ${some npm command}`**

I think this is a softer constraint; to the extent that commands overlap, we will want to help people deal with muscle-memory mistakes. We'll also want to avoid gratuitous changes to the commands (Sebastian has done a great job here).
But I don't think we need to generally promise full command compatibility, especially because some of the new features (lockfile in particular) have interactions with the commands.
So in short: we should try not to make gratuitous changes, and we should make sure that errors guide people in the right direction, but we shouldn't (and can't) promise total compatibility.
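For concreteness, the overlap people will hit most often maps roughly like this today (subject to change before release):

```bash
# npm command                      # closest yarn equivalent today
npm install                        # yarn install   (or just: yarn)
npm install --save lodash          # yarn add lodash
npm install --save-dev mocha       # yarn add --dev mocha
npm uninstall --save lodash        # yarn remove lodash
npm run test                       # yarn run test  (or: yarn test)
```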
**The workflow I was using before still works**
This one is also a soft constraint, but an important one. The reason I say it's soft is that there are many ways to address workflow breakage. For example, if we have a lot of TypeScript people complaining that some detail of the node module resolution strategy they're using doesn't work with `yarn`, we might fix it in `yarn`, or we might discover that they're doing something dubious and help them move to a more portable strategy (if this happens, it would probably help people with future npm upgrades and also alternative clients).

That said, we definitely need to avoid losing users because basic workflows they've come to expect don't work anymore, but should avoid promising perfect compatibility.
One of the reasons I'm a little circumspect here is that there's a lot of emergent behavior in npm, and while it may be tempting to say "we promise to give you a compatible node_modules layout", it's hard to be sure that there aren't weirdnesses that would make that difficult (or unacceptably slow just to satisfy a minor edge-case we could address in some other way).
That said, to a first approximation we should try to be very compatible with the `node_modules` layout for people who need that compatibility, because it'll be the most likely way to avoid long-tail compatibility problems.

The good news is that npm@3 is such a large departure in terms of file system layout from npm@2 that the npm team already did a lot of the hard work for us here.
Those are some of my initial thoughts. Very interested in hearing what other people are thinking on this topic.