Closed kweber-tcc closed 4 years ago
Good point (except less than 100 stars).
I find it relevant to this question to improve the quality of the list.
Stars on a repository do not demonstrate the quality of a project; today we review proposals by asking for the following items:
I think it's important to ask for minimum X stars, my vote is for 42 stars ("The answer to the ultimate question of life, the universe and everything is 42")
What's your opinion? @cassiobotaro @felipeweb @crgimenes @shurcooL @dukex @matrixik @joeybloggs @kirillDanshin @appleboy @PotHix @campoy @ianlancetaylor
More reviews thread: https://twitter.com/avelino0/status/918520765028491265
I do not like this measure by stars. Tests and documentation are better for this...
Maybe having a good score on goreportcard as well?
> I think it's important to ask for minimum X stars, my vote is for 42 stars ("The answer to the ultimate question of life, the universe and everything is 42")
>
> What's your opinion?
Number of stars is a measure of how interesting a library is multiplied by how many people have seen it. If a good library is created and not yet shared everywhere, it won't have many stars. That doesn't mean it's not good, just that it hasn't been shared/seen by many people yet.
For example, https://github.com/golang/arch is an official Go repository that has 17 stars at this moment.
I don't think there should be a minimum star requirement, because stars are not a measure of quality, they're a measure of interest and exposure.
Also, not all Go packages are on GitHub.
It seems very contrived to correlate the quality of a package/software with the interest in it (stars). We've seen projects on GitHub get a lot of stars, very rapidly, when they hit Hacker News... even if that project isn't the best solution. I don't think a public-relations explosion of a project that's a week old should be an indicator of its quality.
Maybe we could form a small group of individuals in the community to help curate the content? We could use the Golang Slack group, in a public channel, to make the decisions and process transparent. I suppose we could also use GitHub pull requests to accomplish the same.
If we could start with some manual curation, we might be able to find patterns in the projects we've curated to try and make it more automated.
It's tricky. Sometimes a large number of stars is just a by-product of good social engineering (if it's on the front page of Hacker News, it has to be good, right?)
That said, there is definitely a critical mass somewhere that would clearly indicate that a package is not only popular, but also useful and relatively bug-free. I have no idea what that threshold might be (probably 500+ in my opinion).
But that leaves less-widely used packages written by lesser-known developers in a bit of a bind. I have seen some amazing packages with 10 or fewer stars that definitely deserve to be on your list.
I think at the very least, the following things must be included in order for a curator to even consider a package:

* Code coverage of 85% or better
* Goreportcard score of B+ or better, with tile at top of README
* Godoc reference tile at top of README
* Travis (or other CI) build passing for current release, with tile at top of README
I think you should also add all the tiles (coverage, goreportcard, godoc) as links beside every entry on your list.
This would be a lot of copypasta work, but I'd be glad to help with it ;)
Another factor to keep in mind is that by setting criteria for inclusion, we influence the kinds of things people are more likely to do. If we make the criteria good things like having documentation, tests, no lint issues, etc., people can fix those.
If we make it a requirement to have a minimum of N stars, we'll cause people to go out of their way to seek out stars, likely by annoying others and spamming their repository everywhere. I don't want us to have that effect.
I'm with @shurcooL we should require only things that can be directly affected by repo owners.
Stars are not a good metric. Maybe we could allow exceptions to the coverage requirement when a project is hard to test, but I'm really wary of that.
stars != quality, agreed
In my experience, 99% of the time a project is hard to test it's because of the way it's composed, and it can be refactored to better support tests; having said that, there are edge cases, but those should be treated on a case-by-case basis IMO.
Number of imports is also a valid metric for library-type packages.
@dvrkps No, and it never will be. You're forgetting about the thousands of closed-source commercial projects.
@kirillDanshin Yep, you're right. I forgot about that :)
By majority vote we see that stars are not a good metric; closing the issue.
Hi, folks. I have something to add to this one. It looks like this issue is not just about the number of stars but about improving the curated list. Does the go-fann library (just one example) he mentioned meet our current standards? If it doesn't, the issue is still valid.
@avelino :point_up:
That library was added in 2014 directly into master without a PR, see 5bf6e084a46a82bcc8e8433ef6ddca14af08998c. At that time, the standard for inclusion was probably quite different.
If there are issues with individual libraries, they can be dealt with specifically.
I don't think this argument invalidates the content of the issue. The issue asks us to curate the list. If a library was valid in the past and isn't anymore, it should be removed. Isn't that the case?
Maybe this issue is too broad for that and we can split it into multiple issues or get help with it, but the content is still valid.
I'm just saying that a good reason for closing would be: "We will work (or wait for PRs) to remove the libraries that are not awesome (according to our standards) anymore".
I didn't say the content of this issue is invalid, I just dug into the history of how that package was added and shared my findings.
> Maybe this issue is too broad for that and we can split it into multiple issues or get help with it, but the content is still valid.
I think that'd be more actionable.
For what it's worth, I think it would be helpful for the maintenance of the README if we added the badges to each project, something along the lines of:
ORM
Libraries that implement Object-Relational Mapping or datamapping techniques.
| name | description | report card | go doc | coverage |
| --- | --- | --- | --- | --- |
| marlow | orm generator | | | |
I was also thinking that this badge could be placed in the README of projects that have been added to the list:
Great idea @dadleyy! It would help a lot to keep the list in a good state.
We could also schedule some monthly cleaning for those who don't comply with our standards.
README loading can be very slow, and badges are cached. I believe that people who choose a library without quality standards should report it to us. People are always better than technology.
> I believe that people who choose a library without quality standards should report it to us.
In this case, we will make people choose a library from an awesome list just to find out it's not awesome. That doesn't look like good UX to me.
> README loading can be very slow
We should try a PoC first just to understand the size of the problem.
@PotHix See #1368, #1213, #1266 for discussion/implementation of a similar idea (showing star counts) in the past.
I'll note that what I wrote in https://github.com/avelino/awesome-go/pull/1368#issuecomment-294267472 applies here too:
I'll just briefly mention that if we were to consider making this kind of change, we would need to carefully look at the performance characteristics of the implementation.
There are over 1000 repositories listed in awesome-go, which means 1000+ badges. Thousands of people view the awesome-go list daily. If the implementation is inefficient (i.e., no caching or inefficient caching, etc.), we could inadvertently DDoS GitHub's API or the badge provider.
I haven't considered the performance of the current implementation, so I can't say if this is a problem or not. I'm just saying it's something we would need to do before ever merging such a feature.
Thanks for the reference @shurcooL! It makes sense. :)
I did a simple search and replace to use just one badge and these are the results:
As stated, it's not viable. :)
We just have to think about the best way to curate the list. Maintaining an awesome list and expecting people to report that many of our libraries are not awesome is like building software and expecting the users to report all the problems.
We may create a script to check the metrics for us and generate a CSV file so we can easily clean up the projects that don't meet our standards. What do you think?
A better metric may be to show how many projects import the packages. For full applications, stars or some download count may be better.
Imports count doesn’t work because of private projects; this was already mentioned above.
I think @bketelsen might be interested on this issue.
Already discussed: number of stars and number of imports aren't good metrics. Especially since awesome-go is used to actually discover awesome Go projects, which means that new, high-quality projects could use the exposure on awesome-go to actually get imported and starred when appropriate.
I would love to help curate this list.
@issuehuntfest has funded $100.00 to this issue. See it on IssueHunt
@avelino Can I take this task?
So, what is the final metric?
@rororofff has funded $10.00 to this issue.
If nobody is on this task, I could help by starting a Go service that would quality-check all GitHub projects linked in an MD file every 24 hours.
If I'm not mistaken about any of the metrics, it would be something like:
1) Run goreportcard on the project and fail if the report card score is not an A+.
2) Run `go test -cover` through a bash script, parse the results in Go, and fail if coverage is less than 70%.
Any thoughts?
Some packages have good reason why they can't get that level of code coverage
> Some packages have good reason why they can't get that level of code coverage
@pjebs elaborate on your statement and describe the reason; only then can we update our contribution document. We analyze every PR by hand, and projects that fall outside this pattern we study case by case.
> I think at the very least, the following things must be included in order for a curator to even consider a package:
>
> * Code coverage of 85% or better
> * Goreportcard score of B+ or better, with tile at top of README
> * Godoc reference tile at top of README
> * Travis (or other CI) build passing for current release, with tile at top of README
>
> I think you should also add all the tiles (coverage, goreportcard, godoc) as links beside every entry on your list.
Hey there! I got some free time and can curate the list. I will go with these measures as of now. I will lower the test coverage requirement to 75% for now, though, and will exclude the need for badges. This should remove a huge number of libraries at first. PRs can be created to add the badges to their READMEs after that.
Are you guys OK with that, @avelino? I can write a small tool to do it automatically, so periodic updates like someone mentioned in #2649 would be possible.
What is going on with this list? The pull requests are piling up and many libraries are no longer maintained. The overall state of this list is rapidly decaying. It seems as if there are plenty of people who are willing to help - what's going to happen?
There are some comments in #2718 about the issue of maintainer activity. Basically, there are only 2 or 3 people who actually still work on this list on a regular basis. If they don't find the time to do so, we end up in this exact situation. I'd like to add some more people, but unfortunately, I do not have the permissions to do so.
See https://github.com/avelino/awesome-go/pull/2734 for an attempt to clean up some subpar packages.
I am curious about projects that haven't been updated in years. If no one is maintaining them, should they still be included on the list?
It's a delicate subject; there are projects that haven't been updated in many years and still work super well.
But it would be interesting to have software to help us identify these projects and open an issue automatically to check on the state of each project; if the issue is not answered within X time with a positive comment, the project could be removed.
@mrKappen do you have any proposal for how we can do this?
I think if a project has not received any commits for over a year, it might be a good idea to at least investigate it (perhaps new security vulnerabilities might have popped up). I like the idea of running a script which would notify you of such projects and automatically create an issue to investigate. Where might be a good place to host such a script?
@mrKappen you can host it in the awesome-go repository (at the root)
Would it make sense to add this as a test to repo_test.go? If a listed repository fails a certain condition (e.g. time since latest commit > 1 year), then fail?
> Would it make sense to add this as a test to repo_test.go? If a listed repository fails a certain condition (e.g. time since latest commit > 1 year), then fail?
I don't think it's good to put it in repo_test.go, so that it doesn't run on every PR. We'll do it in a new binary (e.g. last_commit.go, but write tests 😄), and I'll set it up to run periodically on a server.
Some packages are feature complete and considered stable, hence no new commits, such as my https://github.com/rocketlaunchr/remember-go . I won't be updating it.
Hey @avelino I added a test script (not in repo_test.go) which goes through the list of repositories and creates an issue to investigate if the repository has not had a commit in over a year. I would love your feedback
> Hey @avelino I added a test script (not in repo_test.go) which goes through the list of repositories and creates an issue to investigate if the repository has not had a commit in over a year. I would love your feedback
I've created a new issue to discuss this new feature, let's move this communication there #3211
This seems to me to be a list of any Go library that is submitted. There are MANY libraries included with fewer than 100 stars and absolutely no documentation. (Examples: https://github.com/siddontang/go-log, https://github.com/white-pony/go-fann, https://github.com/daviddengcn/go-pr, https://github.com/e-dard/godist) Many libraries have no tests. Many are not even stable yet.
What is the point of a curated list that accepts anything submitted? How is this a list of awesome Go code? Why are you not even adhering to the quality standards set out in your own README?
Please remove the less-than-awesome libraries, so this is actually an awesome list.