Open · dlsniper opened this issue 8 years ago
I would really love this to become real. It would spare the Go community the pain of the NPM world, where it's far easier and quicker to write a simple NPM module than to find a high-quality one: The other kind of JavaScript fatigue
I foresee some problems with having Go packages reviewed by the community.
By nature, people are neither objective nor consistent when it comes to ranking.
There are over 120k packages on godoc.org; not all of them need to be reviewed, but that is still a lot of man-hours. And as the Go community flourishes, the number of packages grows with it.
Gamification and moderation are a hard nut to crack; Stack Overflow is the exception that proves the rule.
Go has excellent AST tooling, which has resulted in many helpful linters and in gometalinter. These can be used to rank Go packages alongside other metrics, and bots can be used to generate objective reviews (see the sketch at the end of this comment).
Regarding ranking metrics: I'm in favor of an objective and automated package ranking system as proposed in golang/gddo#320 (Proposal for Package Ranking).
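To make that concrete, here is a minimal sketch of the kind of objective signal such a bot could compute with nothing but the standard library's go/parser and go/ast: the share of exported declarations that carry doc comments. The metric itself is made up for illustration and is not part of godoc.org, gometalinter, or gddo#320.

```go
// docstats reports how many exported declarations in a single Go file
// carry doc comments. A toy, reproducible signal a ranking bot could use.
package main

import (
	"fmt"
	"go/ast"
	"go/parser"
	"go/token"
	"log"
	"os"
)

func main() {
	fset := token.NewFileSet()
	// Parse the file given on the command line, keeping comments.
	f, err := parser.ParseFile(fset, os.Args[1], nil, parser.ParseComments)
	if err != nil {
		log.Fatal(err)
	}

	var exported, documented int
	for _, decl := range f.Decls {
		switch d := decl.(type) {
		case *ast.FuncDecl:
			if d.Name.IsExported() {
				exported++
				if d.Doc != nil {
					documented++
				}
			}
		case *ast.GenDecl:
			for _, spec := range d.Specs {
				if ts, ok := spec.(*ast.TypeSpec); ok && ts.Name.IsExported() {
					exported++
					if d.Doc != nil || ts.Doc != nil {
						documented++
					}
				}
			}
		}
	}
	fmt.Printf("%d of %d exported declarations are documented\n", documented, exported)
}
```

gometalinter already aggregates many checks of this kind; the point is that each of them is mechanical and reproducible, which is exactly what an automated ranking needs.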
> By nature, people are neither objective nor consistent when it comes to ranking.
Thus this shouldn't be about ranking packages but rather about reviewing them. And we probably don't want a single reviewer to decide a problem exists on their own, but maybe two or three to agree on it.
> There are over 120k packages on godoc.org; not all of them need to be reviewed, but that is still a lot of man-hours. And as the Go community flourishes, the number of packages grows with it.
For those interested in doing reviews that shouldn't necessarily be a problem. The point is not to keep the reviewers a group of 20 gophers, but to start with a group of 20 gophers that grows over time to hundreds or more.
And yes, not every package will need a review. But when you get asked which package to choose for TOML, or when you search for one on godoc, it will be easier to say: that package; it has been reviewed by the community and people think it's good.
> Go has excellent AST tooling, which has resulted in many helpful linters and in gometalinter. Why have the community review packages when robots can do a better job and do it faster?
Robots cannot judge the quality of code from an idiomatic perspective. They can only judge it mechanically: does it compile, and are the obvious mistakes that can be codified avoided?
> I'm in favor of an objective and automated package ranking system as proposed in golang/gddo#320 (Proposal for Package Ranking).
While I like that proposal as a tool to help check whether a basic quality bar is met, it still won't give us a good indicator of whether the code itself is rubbish or not. A toy example of such a score follows below.
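To illustrate the trade-off, here is a hypothetical weighted score that folds a few mechanical signals into one number. The metric names and weights are invented for this sketch and are not taken from gddo#320; note what the score cannot see, which is exactly the point above.

```go
// A toy package score built only from signals a bot can collect.
package main

import "fmt"

// Metrics holds the kind of signals that can be gathered automatically.
// All fields and weights below are illustrative assumptions.
type Metrics struct {
	HasTests    bool    // does the package ship tests?
	HasLicense  bool    // is there a recognizable license file?
	VetClean    bool    // does `go vet` report no findings?
	DocCoverage float64 // fraction of exported identifiers documented (0..1)
	ImportedBy  int     // how many other packages import this one
}

// Score folds the metrics into a single number in [0, 1]. It cannot
// capture whether the code is idiomatic; that still needs humans.
func Score(m Metrics) float64 {
	s := 0.0
	if m.HasTests {
		s += 0.25
	}
	if m.HasLicense {
		s += 0.15
	}
	if m.VetClean {
		s += 0.20
	}
	s += 0.20 * m.DocCoverage
	// Cap the popularity signal so a few huge packages don't dominate.
	pop := float64(m.ImportedBy) / 100
	if pop > 1 {
		pop = 1
	}
	s += 0.20 * pop
	return s
}

func main() {
	fmt.Printf("score: %.2f\n", Score(Metrics{
		HasTests: true, HasLicense: true, VetClean: true,
		DocCoverage: 0.8, ImportedBy: 42,
	}))
}
```

A package full of well-tested, well-documented but thoroughly unidiomatic code would score highly here, which is why the automated score and the human review complement rather than replace each other.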
Hi,
As a follow-up to https://github.com/golang/go/issues/17244#issuecomment-251178782, I'd like to open this request for comments here.
Current problems
One of the current problems is finding packages which are considered stable, idiomatic, or otherwise good to use. There are plenty out there, and while common knowledge eventually builds up to that, it is hard to discover, especially for beginners. Plus, you'd have to trust a random person on the Internet telling you what's good and what's not, and I'm sure everyone has a favorite example of when that goes wrong.
GitHub stars are also largely irrelevant, because they can ultimately be faked: people who have no clue what the repository does, or what the quality of its code is, can simply star it (some use stars as bookmarks). That can have dramatic consequences when people are led to believe that a repository is of good quality when in fact it's not.
Yet another problem is that people put too much pressure on the Go team and contributors to review their code and somehow bless implementations so that users feel safe using them. This clearly doesn't scale, and I'd much rather have the Go team do what they do best: work on the language and the tools around it, and do reviews whenever they feel like it (if ever).
So, in order to fix these problems, we need a way to identify and mark Go repositories as "community reviewed", "idiomatic Go", or whatever fitting label we find for it.
The proposal
We should add a flag to projects that would mark them as such on godoc.org, and maybe have a badge for them to add to their README.
The reviewers should be part of the community, and the system should work on a trust model where only existing reviewers can add new reviewers.
For an issue to be flagged as such, at least a few reviewers would need to agree on it. Reviewers could then get some form of recognition (karma) for having the most accurate reviews: basically, gamify this, but not by the number of repositories reviewed, and not for the owners of the repositories, but for the reviewers themselves and the quality of the issues they find (a minimal sketch follows below).
We can start with a list of known gophers who can review packages, and who can add reviewers they trust for writing high-quality code or doing good reviews.
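A minimal sketch of what that trust model could look like as data structures, assuming the "two or three reviewers must agree" rule from earlier in the thread; every name and threshold here is hypothetical, not part of any existing system.

```go
// Sketch of an invite-only reviewer pool with confirmed issues.
package main

import "fmt"

// Reviewer is a trusted community member; reviewers invite reviewers.
type Reviewer struct {
	Name      string
	InvitedBy string // empty for the initial, seeded reviewers
	Karma     int    // recognition for accurate reviews
}

// Issue is a problem found during a review of a package.
type Issue struct {
	Description string
	AgreedBy    []string // reviewers who confirmed this issue
}

// Confirmed reports whether enough reviewers agree for the issue to count.
// The thread suggests "maybe two or three"; three is assumed here.
func (i Issue) Confirmed() bool { return len(i.AgreedBy) >= 3 }

// Invite lets an existing reviewer add a new one; nobody joins otherwise.
func Invite(existing Reviewer, name string) Reviewer {
	return Reviewer{Name: name, InvitedBy: existing.Name}
}

func main() {
	seed := Reviewer{Name: "alice"} // from the initial list of known gophers
	bob := Invite(seed, "bob")
	carol := Invite(bob, "carol")

	issue := Issue{
		Description: "exported API returns unexported type",
		AgreedBy:    []string{seed.Name, bob.Name, carol.Name},
	}
	fmt.Println("confirmed:", issue.Confirmed())
}
```

The invite chain also gives the karma system something to hang on to: if a reviewer's invitees consistently file inaccurate reviews, that can reflect back on whoever vouched for them.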
Why would this work?
I'll take the example of the Gophers Slack, where we have a channel dedicated to reviews, https://gophers.slack.com/messages/reviews/, along with many other channels (both in Slack and outside it, like Reddit) where people ask for reviews of their repositories. Integrating this with godoc would benefit everyone involved, and the whole ecosystem would be better for it.
Conclusion
I'm looking forward to hearing your thoughts on this.
Thank you.