tbreloff opened this issue 8 years ago
@tbreloff looks good. We should also have a legend that explains these tags and the rating scale, else people other than the author won't grok them :) Feel free to ask other authors on the users/dev list to tag their packages. For those that don't want to, it's fine by me.
I'll push the legend and scale tags a bit later today. Thanks
How about adding another tag for licenses? Sometimes people simply dump code without any license, which, strictly speaking, is interpreted as non-free. Any code repository without a software license is basically not freely re-usable except in private; technically you cannot fork it, etc. GitHub had a handy link about this somewhere, but I can't seem to find it.
With regard to the tags, can we replace the term 'Robust' with 'Tests' or 'QA' (or simply 'Quality')? That conveys the message more directly, imho.
Please see this sub-section (https://github.com/svaksha/Julia.jl#status) that I updated with our discussion. I thought some more, and the Usability and Activity tags seem to overlap as far as the features go. What do you think?
Ah thanks for adding this section. It seems like you switched some of my notes among the categories, so yes they seem to overlap now.
My thinking with the 3 categories is this:
- Usable: can I practically use this package today? (I.e. I can figure out how to use the repo and it actually does something useful.)
- Robust: can I depend on this package for mission-critical projects? (I.e. it is well tested.)
- Active: can I expect this repo to improve in the future and keep up with changes in the language? (I.e. the package author(s) have time to maintain and improve it in the future, or there is a reasonable expectation that someone will continue that work and respond to issues and PRs.)
Do you like this separation?
On Wed, Feb 3, 2016 at 12:56 AM, Tom Breloff notifications@github.com wrote:
> Ah thanks for adding this section. It seems like you switched some of my notes among the categories, so yes they seem to overlap now.

On first read, some questions sounded similar, but the explanation below clarifies it.

> My thinking with the 3 categories is this:
>
> Usable: can I practically use this package today? (I.e. I can figure out how to use the repo and it actually does something useful)

Would package stability and release cycles be taken into account?

> Robust: can I depend on this package for mission critical projects? (I.e. It is well tested)

By well-tested, do you refer to:

- it being tested and maintained for different versions of Julia releases? or
- it being tested and maintained for various platforms (Linux/Win/OSX)?

> Active: can I expect this repo to improve in the future and keep up with changes in the language? (I.e. The package author(s) have time to maintain and improve in the future, or there's expectations of someone continuing that work, and there's reasonable expectation of someone responding to issues and PRs)

I suspect this will be upbeat, but the ground reality is that it's a dismal figure for non-org-maintained packages. Oh well, we can only try to get an honest shot at understanding the expectations, I suppose.

> Do you like this separation?

Much clearer, thanks. Hopefully my questions are geared towards fine-tuning the tag grading.
> Would package stability and release cycles be taken into account?
Hmm... In my head this falls more into the "Active" category, but I suppose if it falls behind too much it is eventually "unusable". Just a thought... it would be nice to store the date the tag was updated, as that's helpful to know how usable a package might be...
> - it being tested and maintained for different versions of Julia releases? or
> - it being tested and maintained for various platforms (Linux/Win/OSX)?
These are either the "Active" category, or maybe a 4th category yet-to-be-named? When I think "Robust", I think "I can trust this thing to behave as expected... no surprises". It might only work on one system with specific versions of every library, but it will work. My background is in algorithmic trading. I sometimes kept systems unchanged for many months to be absolutely sure I wasn't going to be surprised (and lose lots of money in the process).
I suppose the Active category could be split into "keep it working" vs "add features", as some projects will not improve, but it takes active maintenance to keep up with changes to Julia itself. However, I think it's asking a bit much for that much specificity. You can assume that a 5 means it'll work in the future, and a 1 means it won't.
> Oh well, we can only try to get an honest shot at understanding the expectations
Exactly... it's ok if most of these repos get a 1 in the active category. The important thing is that someone is able to look through the list and know immediately that some package probably won't suit their needs.
I've had the thought that all of this info belongs in a database (or json, csv, etc) so that one could possibly filter/sort the lists. It might be possible to programmatically scrape the existing markdown to put all this in one place. If I get a chance in the next couple of days I might take a pass at that. A bonus is that we could populate a Google doc with the table and let people fill in tags and make notes without going through the PR process.
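A minimal, untested sketch of that scraping idea, assuming the README entries use the `+ [Pkg.jl](url) :: description` bullet style (the file names and regex here are placeholders, not the actual tooling):

```julia
# Rough sketch: pull the package bullets out of README-style markdown
# into a CSV that can be sorted/filtered or pasted into a Google doc.
# Assumes entries look like "+ [Pkg.jl](https://...) :: description";
# adjust the regex if the lists use a different layout.

const ENTRY_RE = r"^\s*[+*-]\s*\[([^\]]+)\]\((https?://[^)\s]+)\)\s*(?:::\s*)?(.*)$"

function scrape(mdfiles, outpath)
    open(outpath, "w") do out
        println(out, "name,url,description")
        for file in mdfiles, line in eachline(file)
            m = match(ENTRY_RE, line)
            m === nothing && continue
            name, url, desc = m.captures
            # naive CSV escaping: double any embedded quotes
            desc = replace(desc, "\"" => "\"\"")
            println(out, "\"$name\",\"$url\",\"$desc\"")
        end
    end
end

scrape(["README.md"], "packages.csv")
```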
Hi,
Sorry, I've been busy and still am, but quickly...
On Wed, Feb 3, 2016 at 1:46 AM, Tom Breloff notifications@github.com wrote:
> Hmm... In my head this falls more into the "Active" category, but I suppose if it falls behind too much it is eventually "unusable". Just a thought... it would be nice to store the date the tag was updated, as that's helpful to know how usable a package might be...
Do we know if package authors are willing to do this in such detail?
> These are either the "Active" category, or maybe a 4th category yet-to-be-named? When I think "Robust", I think "I can trust this thing to behave as expected... no surprises". It might only work on one system with specific versions of every library, but it will work. My background is in algorithmic trading. I sometimes kept systems unchanged for many months to be absolutely sure I wasn't going to be surprised (and lose lots of money in the process).
> I suppose the Active category could be split into "keep it working" vs "add features", as some projects will not improve, but it takes active maintenance to keep up with changes to Julia itself. However, I think it's asking a bit much for that much specificity. You can assume that a 5 means it'll work in the future, and a 1 means it won't.
>> Oh well, we can only try to get an honest shot at understanding the expectations
IMO, it will never be exact. Software changes all the time (as it should, else we would be out of jobs), so it's harder to unofficially label a package with too much specific detail unless the author or a core dev is willing to provide the status.
> Exactly... it's ok if most of these repos get a 1 in the active category. The important thing is that someone is able to look through the list and know immediately that some package probably won't suit their needs.
> I've had the thought that all of this info belongs in a database (or json, csv, etc) so that one could possibly filter/sort the lists. It might be possible to programmatically scrape the existing markdown to put all this in one place. If I get a chance in the next couple of days I might take a pass at that. A bonus is that we could populate a Google doc with the table and let people fill in tags and make notes without going through the PR process.
Please feel free to go ahead with this. If you announce it on the users list, authors who wish to tag their package can provide the status data too.
Continuing the discussion from https://github.com/svaksha/Julia.jl/commit/a884fe9e921d57b87d85e970c2f57b8f21025641. Here are some proposed formats:
I think example 2 is best. The question is what to put in the brackets. I think there are maybe 3 categories that we could consider tracking: Usable, Robust, and Active.
I think that a 1-5 numeric scale (5 is better) for each category would go a long way to describe the state of the package, and help to quickly narrow down a package search when users are browsing the lists.
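For instance, an entry tagged that way might look something like the following (the package name, link, and scores are made up purely to illustrate the idea, and the exact bracket style would follow whichever of the proposed formats we pick):

```
+ [SomePackage.jl](https://github.com/someone/SomePackage.jl) :: does something useful. {Usable: 4, Robust: 3, Active: 5}
```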
Here's a sample (I'll create a PR when we decide on format):
Let me know what you think!