wbond / package_control

The Sublime Text package manager
https://packagecontrol.io

packages.json Support for Specifying Compatible ST Versions #291

Closed. wbond closed this issue 11 years ago.

wbond commented 11 years ago

With ST3 beta out now, we need to allow package authors to specify which versions of ST their packages work with.

I am inclined to say the only way to support ST3 is for plugin authors to set up a packages.json file. This will mean that packages from authors who use bare GitHub repos will only be available to ST2 users.

I don't think it will be too big of a deal for authors who have to port their plugin code to also add a JSON file to their repo. This will also encourage package authors to use semantic versioning instead of date stamps.

Does anyone have specific ideas about the JSON format for listing compatible versions?

timjrobinson commented 11 years ago

I'd say structure it as closely as possible to the npm package.json format; many are already familiar with that, and it's clear and easy to understand.

w00fz commented 11 years ago

I think a package.json is the best approach. One suggestion would be to have it handle packages that support different ST versions. For instance, if you were to follow the npm format, I'd extend the repository attribute to include the ST version dependency.

"repository": { 
    "type" : "git",
    "versions": [{
        ">= 2 < 3":{
            "url" : "http://github.com/user/package"
        },
        "> 3":{
            "url" : "http://github.com/user/package/tree/branch-st3"
        }
    }]
}

Something along those lines, where the versions refer to the Sublime versions (they could even just be 2 or 3). This way a developer could maintain packages for different versions of Sublime (usually stable and dev) from the same package.json, and it would make future ST transitions easier to handle from a Package Control standpoint.

Just an idea.

wbond commented 11 years ago

Just in case you guys have not seen it, there is already a packages.json format for Package Control. https://github.com/wbond/sublime_package_control/blob/master/example-packages.json

w00fz commented 11 years ago

Ah sorry. So what about this?

"platforms": {
    "*": [
        {
            "version": "1.1",
            "client": "st2",
            "url": "http://nodeload.github.com/user/project/archive/v1.1.zip"
        },
        {
            "version": "0.1-dev",
            "client": "st3",
            "url": "http://nodeload.github.com/user/project/archive/branch.zip"
        }
    ]
}

So for each platform type, you have multiple choices related to the ST client version, and Package Control will know which one to serve to the user. Of course it doesn't have to be st2 or st3, but you get the idea. You could also assume that if the client is missing, the entry is meant for the current stable version of ST, which should guarantee some backward compatibility with the current setup.

kblomqvist commented 11 years ago

Would it be possible to use packages.json to list additional plugins that the plugin depends on, so those get installed as well if needed?

kostajh commented 11 years ago

Does this mean plugin authors will have to tag new releases and update the packages.json file each time they make an update to a plugin? The existing system of using date stamps for showing updates for plugins is nice for its simplicity.

wbond commented 11 years ago

@kostajh That is my current thinking.

The only options I see right now are:

  1. Present packages to ST3 users even if they have never been tested. Installing a package will be a crapshoot on ST3.
  2. Try to find some way for a developer to indicate, through a GitHub repo, whether a package works with ST3. The problem here is that I am already consistently hitting the GitHub API rate limits and have been reducing the frequency of updates because of the number of API requests made. Right now each package requires at least 2. Adding a third would mean I can only update packages once an hour, and in the near future I wouldn't be able to get all packages in under the hourly rate limit.
  3. Require that developers start providing a packages.json file. We could perhaps augment Package Control to make this easier, especially for things such as bumping revisions.

kostajh commented 11 years ago

That makes sense. Given the options, the third one is the best.

FichteFoll commented 11 years ago

  1. This is a no-go. What could be done is a setting that makes unmarked packages show and install on ST3 too (like add-ons in Firefox), but of course only for users who know what they're doing.
  2. What exactly are the two API requests for each package? You must be fetching the user, the last modified date (for the version) and the description, so maybe having the author specify ST3 support in the description would solve it. Other than that, it would most probably need a rate limit raise.
  3. What I think is the most annoying thing about custom JSON is the modified date. I don't even know how it works in combination with the version. Do you even check the date if you check for the version? Or do you update if the version number did not increase but the date did, e.g. for hotfixes? It's probably in the source, but I didn't bother to dig into it. Other than that, it seems to be the "most secure" way to do things without using hacks (2.) or guesswork (1.). I mean, if you are a package developer you should be familiar with JSON. And if you update your package for ST3, it should not be that much of a problem to add an additional file with information about it. What does concern me is the number of updates the repository channel requires: every plugin that uses the GitHub or Bitbucket URL implementation has to create a packages.json file and exchange the old entry for the new one.

Regarding packages.json formats, I'd probably stick with build numbers since they are reliable and allow quick adjustment even for small API changes. So, if you specify a minimum and a maximum build number (ST3 probably starts at 3000, the first published build being 3006), Package Control can also disable packages (with a notice, of course) when a package is not compatible with a recent API change, rather than leaving the user with a broken plugin.
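
As a sketch of what that check could look like on the client, assuming packages declare a minimum and maximum build number (the parameter names and defaults below are made up for illustration):

    # Hypothetical compatibility check based on min/max build numbers.
    # 3006 was the first published ST3 build; ST2 builds are in the 2xxx range.
    def is_compatible(current_build, min_build=0, max_build=None):
        """Return True if the running ST build falls inside the declared range."""
        if current_build < min_build:
            return False
        if max_build is not None and current_build > max_build:
            return False
        return True

    print(is_compatible(3006, min_build=3000))                   # ST3-only package on ST3
    print(is_compatible(2221, min_build=3000))                   # ...but not on ST2
    print(is_compatible(2221, min_build=2181, max_build=2999))   # ST2-only package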

skuroda commented 11 years ago

Out of curiosity, how would you handle plugins without package.json files? I'm thinking in the context of repositories added by users. Some plugins may be written without the end goal of adding to package control. Now, I know I can clone these into the packages directory, and have package control do updates via git, but sometimes I don't feel like it. In those cases, I add the repository to package control and install normally.

With that being said, I don't think you can rely solely on packages.json files, though it may be as simple as always including user-added repos regardless of the presence of that file. Perhaps that's how Package Control handles it now; I haven't looked. Just thought I would bring this up on the off chance it hadn't been considered already.

Thanks for all the great work.

jbeales commented 11 years ago

Would it take another API call to see if there's a file called st3-compatible in the repo? That way plugin devs could flag their plugins as st3-compatible, but you could still update based on last-modified dates and devs wouldn't have to create a packages.json if they didn't want to.

wbond commented 11 years ago

@jbeales Yeah, that was option 2 above. We would move to about 3000 GitHub API requests per refresh, meaning the current 30-minute refresh rate would go up to every 60 minutes. Then within 6 months (maybe before ST3 stable is released), we get to 1,700 packages and we are over 5,000 API requests per hour.

By not adding a third API request we can scale up to about 2,500 packages not using a packages.json file before we hit the limit. However, by making it a requirement to have a packages.json file for ST3 compatibility, we effectively eliminate new packages that require API requests.
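
A quick back-of-the-envelope check of those numbers (the 2-vs-3 requests per package and the 5,000-per-hour authenticated limit are the figures quoted in this thread):

    # Rough scaling math based on the figures quoted above.
    RATE_LIMIT = 5000  # authenticated GitHub API requests per hour

    def max_packages(requests_per_package):
        """How many bare repos fit under the hourly limit."""
        return RATE_LIMIT // requests_per_package

    print(max_packages(2))  # 2500 packages at 2 requests each
    print(max_packages(3))  # 1666 packages at 3 requests each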

wbond commented 11 years ago

In light of the last comment and for consideration moving forward as this scales out, we might need to move to an architecture where the channel server does real git clone/git pull operations on all packages and delivers the .sublime-package files itself.

This would be a bigger undertaking and would require more server resources on my part. Even after gzipping all requests leaving my server, I am serving close to 600GB of JSON a month. If I was delivering package files, I can only imagine I would jump to 2TB+ of bandwidth a month. For that kind of bandwidth we are probably looking at a dedicated server with a $200 a month bill, minimum. The cheapest cloud/VPS bandwidth I've seen is $0.10 a GB. Moving to a dedicated server would make the server more vulnerable to hardware outages.

This architecture would have the benefit of not:

  1. Using GitHub as a fileserver

Ideally it would also have the benefit of simplifying the PC internals, since we wouldn't have to extract GitHub zips and remove the extra root folder. However, that would mean users would no longer be able to add custom repos, which is kind of a non-starter.

The downsides are:

  1. Significant ongoing cost
  2. A beefier server and more bandwidth, maybe even multiple servers

wbond commented 11 years ago

@skuroda Yeah, I think the packages.json file would just be for the channel server to present the info. I think if PC connected directly to a GitHub repo, we would list it as available.

jbeales commented 11 years ago

If you're already pulling the packages.json file (or trying to and getting an error back if there isn't one - I haven't looked in the code, and probably wouldn't understand most of the Python anyway, so I'm not sure exactly how it works), then would it be an option to include some sort of ST3-compatibility flag in packages.json, but not require anything else, so that devs could put up a packages.json that exists only to flag the plugin as ST3-compatible?

Regarding the bandwidth/cost/scalability issue, you could also go the other way and move as much as possible to GitHub. I'm not sure how it works now, but what if your server only gave the plugin a list of GitHub repos, and then it was up to the plugin to go look at the repos and figure out what, if anything, can be downloaded and/or updated (or is this what you're doing now?)? This may require that we all have a working copy of git.

You could maybe take that even further and, instead of running your own server, put the list of repos up on GitHub as a repo itself (and be pretty meta-tastic). Then you'd serve nothing.

The downside to all this is that Package Control itself will get a lot more complicated, especially if you do something like bundle git with it.

wbond commented 11 years ago

@jbeales Originally PC worked as you describe. However, with 1000 packages it takes 15 minutes to crawl the GitHub API to get info on them all. Plus, as of about 6 months ago, the only way to get 5,000 API requests an hour is to be authenticated. Non-auth'ed users get 60 an hour. So that would mean users would have to sign up for a GitHub account and wait 15 minutes before they could pick a package to install. :-)

So right, the initial discussion here is/was about adding something to the packages.json file to specify compatibility. The issue is that probably at most 10% of packages use the packages.json method, and there are some annoyances with it, such as having to tag the repo, update a timestamp, update a version number and update a URL for every release.

The other thing is, I don't look for a packages.json file in each repo. Developers either submit their repo URL, or the URL to a packages.json file. So, again, developers would have to change their workflow to handle that. Currently most developers just make a fix and push it. Then the channel server generates a version number based on the last commit timestamp. Effectively the developers get "free" deployment of all of their packages that way.
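
For illustration, generating a version from the last commit timestamp could be as simple as the sketch below; the "YYYY.MM.DD.HH.MM.SS" shape is an assumption for the example, not necessarily the exact format the channel server emits:

    from datetime import datetime

    def version_from_commit_timestamp(ts):
        """Turn a last-commit timestamp into a date-based version string."""
        dt = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
        return dt.strftime("%Y.%m.%d.%H.%M.%S")

    print(version_from_commit_timestamp("2013-02-09 13:02:21"))  # 2013.02.09.13.02.21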

jbeales commented 11 years ago

Aahh, I see the problems. What are you grabbing from GitHub for each plugin? Is there something that could be inspected, somehow, for some sort of flag or compatibility number (like an svn property - does git have those)? If not, I think we're back to square one.

wbond commented 11 years ago

I don't believe there is any way to stick it in here: https://api.github.com/repos/wbond/sublime_package_control

FichteFoll commented 11 years ago

I can imagine letting the plugin dev omit some keys from packages.json. For example, if you want to proceed with the current workflow and enable ST3 support, you would only have to create a file with the first 4 lines of the following example (minus the trailing comma, plus the closing braces):

{
    "schema_version": "1.3",
    "repositories": {
        "GitHub Example": { "url": "https://github.com/john_smith/github_example", "st_min_version": 3000 },

        "BitBucket Example": {
           "url": "https://bitbucket.org/john_smith/bitbucket_example",
           "st_min_version": 2117, "st_max_version": 2999,
           "platforms": ["windows", "linux"]
        }
    },
    "packages": [
        {
            "name": "Tortoise",
            "description": "Keyboard shortcuts and menu entries to execute TortoiseSVN, TortoiseHg and TortoiseGit commands",
            "author": "Will Bond",
            "homepage": "http://sublime.wbond.net",
            "last_modified": "2011-11-30 22:55:52",
            "platforms": {
                "windows": [
                    {
                        "version": "1.0",
                        "url": "http://sublime.wbond.net/Tortoise.sublime-package"
                    }
                ]   
            }
        }
    ],
    "renamed_packages": {
        "github_example": "Github Example"
    }
}

I think this example pretty much covers all the cases I can imagine (and by the way, I also removed the "package_name_map"). You could even make the package_control_channel JSON file accept this format (after quickly converting it) and, e.g., let users insert their packages.json files into a "channels": [] list. By the way, the default value for "st_max_version" would be 2999 if neither of the two was defined. And the "platforms" thing in repositories just came to my mind to reduce the need to create your own JSON if that was the only thing you'd need it for.

Let me know what you think.


Edit 13-02-07: Updated the "repositories" so it does not restrict package naming (and looks clearer).

jbeales commented 11 years ago

I don't see any way to get it into that API call. There would have to be a second call to see if there was a branch called st3 or something.

However, I just discovered (and this could be a total hack) that if people want to submit ST3-compatible repos, they could change their repo URL to include a querystring, and the API seems to return the same info, just in a slightly different order. See: https://api.github.com/repos/wbond/sublime_package_control?stv=3 This could be a way for devs to manually tag their plugins as ST3-compatible.

Slightly OT again, but about the rate limits: it seems like someone should write a GitHub service (https://github.com/github/github-services#readme) that lets your server know when there's an update. Then when devs set up their repos they would turn on that service, and you wouldn't have to crawl GitHub all the time.
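
For illustration, the receiving end of such a service could be tiny. This sketch assumes the hook is configured to deliver a JSON push payload; the port, the payload fields read, and the pending_refresh queue are all hypothetical:

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Hypothetical queue of repos the channel server should re-crawl.
    pending_refresh = set()

    class HookHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            # Read the JSON body posted on each push.
            length = int(self.headers.get('Content-Length', 0))
            payload = json.loads(self.rfile.read(length) or b'{}')
            repo_url = payload.get('repository', {}).get('url')
            if repo_url:
                pending_refresh.add(repo_url)  # crawl just this repo on the next pass
            self.send_response(204)
            self.end_headers()

    if __name__ == '__main__':
        HTTPServer(('', 8080), HookHandler).serve_forever()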

nirix commented 11 years ago

A GitHub service like @jbeales suggested is a great idea and might be the way to go.

quarnster commented 11 years ago

If I was delivering package files, I can only imagine I would jump to 2TB+ of bandwidth a month. For that kind of bandwidth we are probably looking at a dedicated server with a $200 a month bill, minimum. The cheapest cloud/VPS bandwidth I've seen is $0.10 a GB. Moving to a dedicated server would make the server more vulnerable to hardware outages.

Here's a crazy thought: some sort of p2p architecture. I'm on a 100/100Mb line with no data-use limitations and would happily donate bandwidth and disk space where appropriate. I'm sure there are many others who wouldn't mind participating if it were possible to turn off and/or limit the bandwidth/disk space used.

jswartwood commented 11 years ago

@wbond do you have any data about how people are using the package definitions currently? Do authors often have a different url per platform?

Here is a draft of a possible new schema that puts an equal focus on Sublime Text compatibility and platform compatibility. A new releases array could contain an arbitrary set of releases for given compatibility sets. Package Control could take the release with the highest version number that matches the system requirements.

{
  "schema_version": "2.0",
  "packages": [
    {
      "name": "SublimeLinter",
      "description": "Inline lint highlighting for the Sublime Text 2 editor",
      "author": "Kronuz, Aparajita Fishman, Jake Swartwood",
      "homepage": "http://github.com/SublimeLinter/SublimeLinter",
      "releases": [
        {
          "platforms": "*",
          "sublime_text": "*",
          "version": "1.6.12",
          "url": "https://nodeload.github.com/SublimeLinter/SublimeLinter/zip/v1.6.12"
        },
        {
          "platforms": [ "windows", "linux" ],
          "sublime_text": ">= 3",
          "version": "2.0.0",
          "url": "https://nodeload.github.com/SublimeLinter/SublimeLinter/zip/v2.0.0"
        }
      ]
    },
    {
      "name": "SublimeLinter Beta",
      "description": "This version is considered unstable; only install this package if you plan on testing",
      "author": "Kronuz, Aparajita Fishman, Jake Swartwood",
      "homepage": "https://github.com/SublimeLinter/SublimeLinter/tree/beta",
      "releases": [
        {
          "version": "1.7.0-beta.1",
          "url": "https://nodeload.github.com/SublimeLinter/SublimeLinter/zip/beta"
        }
      ]
    }
  ]
}

I would also recommend officially deprecating last_modified in favor of versioning; it always seemed a bit redundant to me.

The sublime_text version number checking should also follow semver comparison rules. node-semver has one of the most robust parsers I've seen (e.g. it can handle "3.2.x"), but python-semver could be leveraged directly and might be good enough.

jlegewie commented 11 years ago

Even if you manage to make it work with the current traffic, it should be future-proof. Did you try to request a higher rate limit for GitHub's API? If that creates additional monthly costs, I am all for getting it financed by the community (maybe a way to make a monthly $2 contribution or something), or maybe Jon is willing to jump in.

If deployment is getting more complicated for package developers, I would also suggest a Submit Plugin to Package Control command in ST3 that takes care of a lot of the work (e.g. generating a packages.json, etc.).

zchrykng commented 11 years ago

I was wondering how homebrew manages to handle their load. I believe everything is run through GitHub, but the only data retrieved before the actual install would be an equivalent to the packages.json file. I think it would be possible for PC to run the same way.

wbond commented 11 years ago

@zchrykng

Well, homebrew is quite different:

  1. It only runs on OS X
  2. It uses Git for everything
  3. All package metadata is centralized in a git repo
  4. Someone has to write an install formula for every package that is included

Currently, the only thing that is served from my server is the repositories.json file. GitHub isn't really in the business of acting as a CDN to serve a static file 10M+ times a month.

The real solution to the problem is to set up a github service. That, however, probably takes at least as much work for each package developer as creating a packages.json file. I've created #329 to continue any discussion related to that.

wbond commented 11 years ago

@jswartwood

last_modified is used for the community package list webpage, so I think I'll keep it for now. Although I don't have any analytics about how many people view that sorting on the page.

Thanks for putting that proposal together. I am going to think about that some, but I like the idea of having a list of compatible platforms. The other concern is the update to the 2.0 schema and old versions of package control. Unfortunately it seems that a decent number of people run old versions. Perhaps we can update the packages.json to 2.0, but have the channel server translate it into a 1.3 version that would be backwards compatible with older versions of PC.

semver.py is currently bundled, but I need to fix a bug with comparing semantic versions with date-based versions.

schlamar commented 11 years ago

@zchrykng @wbond

Actually, I would say that the homebrew architecture could very easily be adapted to PC:

And an important comment, which is not directly related to the above proposal: If I understand the previous discussion correctly, you regard every commit as a new release and upgrade to it? I really discourage this update strategy, as in a DVCS world a commit doesn't imply a tested (and working) version. You should definitely require plugin developers to mark their (stable) releases.

wbond commented 11 years ago

@schlamar The load is WAY too high to centralize. We can't even keep up with brand new releases, let alone new versions of 1000+ packages. Personally, I can't keep up with all of the development work, support requests, infrastructure as it is right now. And that is even with @sentience basically doing all of the work reviewing packages for inclusion. A few people have expressed interest in helping with pull requests, but no one but @sentience has followed through.

I agree maintainers should want to use semantic versioning, and it is an option, but most developers of smaller packages choose not to. The bigger/more serious packages tend to head in the direction of explicit versioning, with tagged releases.

schlamar commented 11 years ago

Well, if you enforce explicit tagging of new versions, I wouldn't say the load is that high (what numbers are we talking about?). I would even go so far as to make one "main" repo for serious/bigger packages, which is actively controlled by a few maintainers, and a "sandbox" (or "staging") repository which can be written to by every package contributor.

wbond commented 11 years ago

At this point, any additional load is too much since I can't keep up with the primary development and infrastructure elements. In terms of pull requests, check out https://github.com/wbond/package_control_channel/pulls. There are 60 sitting in the queue right now.

I probably have about 30+ hours of work in front of me to finish the ST3 port and work with everyone here to figure out a plan for marking packages as ST3 compatible.

The biggest help at this point is if we could iterate on the proposals from @FichteFoll and @jswartwood for improving the packages.json and repositories.json files to handle platform definitions better, and allow specifying the compatible versions of Sublime Text. I like how @FichteFoll allows some basic metadata for bare repos and I like how @jswartwood handles the ST version and compatible platforms.

I'm going to enable the Wiki so it can be used for a more central place for proposals.

FichteFoll commented 11 years ago

I think using something like "st_version": ">3000" or "st_version": ">=3" works okay, but it gets complicated with a maximum version (e.g. ST2-only plugins). Values below, say, 1000 would be counted as a version number (a hard-coded build lookup table would be required).

I can only think of "st_version": "2200 < _ < 2220" or "st_version": ">2200 && <2220". The second allows even more complex version selectors for cases when a single build is not supported, but it's also more difficult to parse.
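
For what it's worth, the second form is not that hard to parse; a minimal sketch, assuming the "&&"-separated selector syntax proposed above:

    import operator, re

    OPS = {'>': operator.gt, '>=': operator.ge, '<': operator.lt,
           '<=': operator.le, '==': operator.eq}

    def matches(selector, build):
        """Check a build number against a selector like '>2200 && <2220' or '*'."""
        if selector.strip() == '*':
            return True
        for part in selector.split('&&'):
            op, number = re.match(r'\s*(>=|<=|==|>|<)\s*(\d+)\s*$', part).groups()
            if not OPS[op](build, int(number)):
                return False
        return True

    print(matches('>2200 && <2220', 2210))  # True
    print(matches('>3000', 2221))           # False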

zchrykng commented 11 years ago

What about something like:

{
    "st_versions": [
        [3000, null],
        [2200, 2220]
    ]
}

I am really rusty on JSON, but I would think you could make 'pairs' of values and treat them as compatible ranges; it seems easier to deal with than parsing logical and comparison operators out of a text field.

You could use 'null' or some other value to stand for current.

Please let me know if I am off base.
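
A hypothetical check against that pairs-of-ranges format, with null (None in Python) standing in for an open upper bound:

    # Ranges taken from the example above: ST3 builds 3000 and up, or ST2 builds 2200-2220.
    ranges = [(3000, None), (2200, 2220)]

    def is_compatible(build, ranges):
        return any(low <= build and (high is None or build <= high)
                   for low, high in ranges)

    print(is_compatible(3021, ranges))  # True  (falls in the open-ended 3000+ range)
    print(is_compatible(2500, ranges))  # False (between the two ranges)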

jswartwood commented 11 years ago

@zchrykng comparison strings are common enough and parsed reasonably easily. They are also (typically) human-readable.

@FichteFoll the range problem is not a new one. Node-semver handles it fairly elegantly; it is my favorite version comparison parser (but would obviously need to be ported to Python). Pip dependency requirements allow slightly more complicated selections. Unfortunately python-semver seems a bit too simple, but perhaps the author would be open to a pull request to support ranges.

Although my first instinct was to allow both ST's release version numbers and build numbers (similar to what @FichteFoll mentioned), this might lead to confusion and complication. Since the Sublime Text API returns the build number via sublime.version(), we may want to stick with only build numbers in the version comparison strings.
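
For reference, the runtime check is trivial once only build numbers are involved; sublime.version() is the real API call here, the rest is just illustrative:

    import sublime  # only available inside Sublime Text's plugin host

    # sublime.version() returns the build number as a string, e.g. "3012",
    # so a selector like ">3000" reduces to an integer comparison.
    build = int(sublime.version())
    if build >= 3000:
        print("Running on Sublime Text 3")
    else:
        print("Running on Sublime Text 2")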

jswartwood commented 11 years ago

@wbond, you seem to want to keep the barrier to entry rather low for adding a package, but have you thought about creating a basic API for "publishing" to PC? It may potentially slow and complicate the release process, but it would likely avoid your GitHub limit issues.

As a rough draft flow: A developer would create a package_control.json file and register the plugin once initially. For each release, they would update the package_control.json and run a "publish" command. Publish would validate their json and push the changes up to your server. Ideally, one might be able to do this within ST, but I'm not sure how that could work (perhaps it would have to be within the ST console).

With something like that, would you even have to hit GH at all?

nirix commented 11 years ago

With a service hook, people could just create a tag for each version and push it to their repository; GitHub would then send that information to the PC server, which would process it.

jswartwood commented 11 years ago

@nirix clever. It does appear that setting up a webhook is a pretty simple process. @wbond, it might be even easier (single click) for a developer if you could find a way to register to be on the GH service hooks list.

@nirix, I'm not sure that all tags would necessarily be safe to pull down into PC (even if you did do a version comparison, etc). I still like the package_control.json. It also keeps standardization outside of GitHub.

You could always test the body of the webhook post for changes to package_control.json and only then query GH for that package's updates. It would change the model from "all packages every X minutes" to "any package at any time" (likely very rarely, but still worth throttling individual repos).

@wbond, I just want to go on record: As a package dev, I don't mind putting in a little bit of work for PC. I can't imagine you would lose lazy devs; I would almost certainly guess that PC is an essential crutch for most lazy ST plugin devs. I don't think creating a more structured "process" is a bad thing especially if it saves you some headache.

zchrykng commented 11 years ago

@wbond I completely agree with @jswartwood's comment above. I have not developed any packages yet, but I would be totally willing to jump through a few extra hoops to make your life easier. Between Package Control and hosting this service, you have given the community so much; the least we can do is make it as easy as possible for you.

sublimator commented 11 years ago

I guess all you really need to reduce the polling is for authors to ping the PC server when there's an update to a package? The client wouldn't deliver the information, just notify that there's new info.

All the same trust is in place then, as opposed to having authors authenticate themselves beyond getting onto the package_control_channel, which seems to have a backlog in need of a pop() or two.

Lazy devs could use master all the time and just ping when it's ready for the public. And pranksters could release dev versions :)

kizu commented 11 years ago

@wbond :+1: to pinging. There should also be a regular lookup for updates, but it could be once a day or even every two days; if there were a "ping" option (and if it could be triggered using hooks), then it wouldn't need to be done every hour.

schlamar commented 11 years ago

Depends on @wbond's crawler infrastructure, but a separate channel for ST3 might be the simplest solution.

zchrykng commented 11 years ago

@jswartwood What features would be useful in python-semver? And would python-semanticversion work? I am looking at trying to add some features to python-semver, but if the other works, I will stop.

wbond commented 11 years ago

@FichteFoll

I can imagine letting the plugin dev omit some keys from packages.json. For example, if you want to proceed with the current workflow and enable ST3 support, you would only have to create a file with the first 4 lines of the following example (minus the trailing comma, plus the closing braces):

{
    "schema_version": "1.3",
    "repositories": {
        "GitHub Example": { "url": "https://github.com/john_smith/github_example", "st_min_version": 3000 },
        "BitBucket Example": {
           "url": "https://bitbucket.org/john_smith/bitbucket_example",
           "st_min_version": 2117, "st_max_version": 2999,
           "platforms": ["windows", "linux"]
        }
    },
    "packages": [
        {
            "name": "Tortoise",
            "description": "Keyboard shortcuts and menu entries to execute TortoiseSVN, TortoiseHg and TortoiseGit commands",
            "author": "Will Bond",
            "homepage": "http://sublime.wbond.net",
            "last_modified": "2011-11-30 22:55:52",
            "platforms": {
                "windows": [
                    {
                        "version": "1.0",
                        "url": "http://sublime.wbond.net/Tortoise.sublime-package"
                    }
                ]   
            }
        }
    ],
    "renamed_packages": {
        "github_example": "Github Example"
    }
}

I originally liked the idea of the cleaned-up repositories, but that totally breaks for real (PC) repositories when they contain more than one package.

Here is my proposal based on the discussion so far:

  1. Create a new schema_version 2.0 for the repository JSON (packages.json)
  2. Convert all of the individual GitHub and BitBucket repositories entries in the channel JSON (repositories.json) into a single new repository JSON file
  3. Use the new repository JSON to perform the package_name_map functionality
  4. Remove package_name_map since it is confusing

Here is what I think the new repository JSON should look like:

{
  "schema_version": "2.0",
  "packages": [
    {
      "name": "SublimeLinter",
      "description": "Inline lint highlighting for the Sublime Text 2 editor",
      "author": "Kronuz, Aparajita Fishman, Jake Swartwood",
      "homepage": "http://github.com/SublimeLinter/SublimeLinter",
      "releases": [
        {
          "platforms": "*",
          "sublime_text": "*",
          "version": "1.6.12",
          "url": "https://nodeload.github.com/SublimeLinter/SublimeLinter/zip/v1.6.12",
          "date": "2013-01-09 21:58:08",
          "dependencies": [
            "PyV8",
            "AAAPackageDevelopment",
            "Other Package"
          ]
        },
        {
          "platforms": [ "windows", "linux" ],
          "sublime_text": ">3000",
          "version": "2.0.0",
          "url": "https://nodeload.github.com/SublimeLinter/SublimeLinter/zip/v2.0.0",
          "date": "2013-02-09 13:02:21"
        }
      ]
    },
    {
      "name": "Other Package By Tags",
      "details": "https://github.com/wbond/other_package",
      "releases": [
        {
          "sublime_text": "*",
          "details": "https://github.com/wbond/other_package/tags"
        }
      ]
    },
    {
      "name": "Other Package By Master",
      "details": "https://github.com/wbond/other_package",
      "releases": [
        {
          "sublime_text": "*",
          "details": "https://github.com/wbond/other_package/branches/master"
        }
      ]
    },
    {
      "name": "Other Package ST2 Only by Master",
      "details": "https://github.com/wbond/other_package"
    },
    {
      "name": "Other Package ST3 Min Info",
      "description": "Package Description",
      "author": "Will Bond",
      "homepage": "http://github.com/wbond/other_package",
      "releases": [
        {
          "sublime_text": "*",
          "version": "1.5.1",
          "url": "https://nodeload.github.com/wbond/other_package/zip/v1.5.1",
          "date": "2013-01-09 21:58:08"
        }
      ]
    },
    {
      "name": "Other Package ST2 Min Info",
      "description": "Package Description",
      "author": "Will Bond",
      "homepage": "http://github.com/wbond/other_package",
      "releases": [
        {
          "version": "1.5.1",
          "url": "https://nodeload.github.com/wbond/other_package/zip/v1.5.1",
          "date": "2013-01-09 21:58:08"
        }
      ]
    }
  ]
}

So, the idea here is to allow package developers to utilize info from GitHub/BitBucket if they want to, while also allowing non-GitHub/BitBucket users. This is done via the details entries, which are URLs for GitHub/BitBucket that will cause PC to hit the API to gather what data it can.

The absolute minimum for a GitHub/BitBucket package would be:

    {
      "name": "Other Package ST2 Only by Master",
      "details": "https://github.com/wbond/other_package"
    }

This would function the exact same way as an entry in repositories with a package_name_map entry. It would create a release based on the timestamp of master and would only be shown to ST2 users.

The next step up allows pulling some repo info, but specifying the Sublime Text versions it is compatible with:

    {
      "name": "Other Package By Master",
      "details": "https://github.com/wbond/other_package",
      "releases": [
        {
          "sublime_text": "*",
          "details": "https://github.com/wbond/other_package/branches/master"
        }
      ]
    }

To easily provide good semantic versioning, you could base your releases on tags. PC would use the tag name as the version, stripping a v off the front (if present). This could also use the platforms key, but it would not be required. (A rough sketch of the tag handling follows the example below.)

    {
      "name": "Other Package By Tags",
      "details": "https://github.com/wbond/other_package",
      "releases": [
        {
          "sublime_text": "*",
          "details": "https://github.com/wbond/other_package/tags"
        }
      ]
    }
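
A minimal sketch of that tag handling, assuming simple dotted numeric versions (the ad-hoc sort below stands in for whatever semver handling PC actually bundles):

    import re

    def tag_to_version(tag):
        """Strip a leading "v" from the tag name, as described above."""
        return tag[1:] if tag.startswith('v') else tag

    def newest_tag(tags):
        """Pick the highest-versioned tag by comparing its numeric components."""
        key = lambda tag: tuple(int(p) for p in re.findall(r'\d+', tag_to_version(tag)))
        return max(tags, key=key)

    print(newest_tag(['v1.6.12', 'v2.0.0', 'v1.9.3']))  # v2.0.0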

If the package is not hosted on either GitHub or BitBucket (and thus the APIs can not be used to fetch repository info), the following would be the minimum information needed to list a package for ST3:

    {
      "name": "Other Package ST3 Min Info",
      "description": "Package Description",
      "author": "Will Bond",
      "homepage": "http://github.com/wbond/other_package",
      "releases": [
        {
          "sublime_text": "*",
          "version": "1.5.1",
          "url": "https://nodeload.github.com/wbond/other_package/zip/v1.5.1",
          "date": "2013-01-09 21:58:08"
        }
      ]
    }

If the package was just for ST2, the sublime_text key could be omitted.

Finally, there is the dependencies key I stuck in there. I had originally included versions, but I am going to punt on that for now. In ST we don't have the ability to run plugins in different sandboxes, so whatever version is available will just have to work.


Since the JSON format will be changing with PC 2.0, I am planning on creating a new URL for the JSON, which will likely be:

https://sublime.wbond.net/default_channel

The new repository holding the package info from all of the GitHub/BitBucket entries from repositories would be served from:

https://sublime.wbond.net/community_repository

By adding these URLs while leaving the existing ones in place, we can launch PC 2.0 while keeping PC 1.x running so that users can automatically upgrade.

I know this is a lot to process, but I'd like to get moving on this now that PC seems stable on ST3 and ST2.

schlamar commented 11 years ago

Looks good to me. However, you should at least support a minimum version in dependencies; otherwise upgrades could fail (e.g. plugin X v1.0 requires plugin Y v0.2 and X v2.0 requires Y v0.3, so updating X will require updating Y as well). It could still fail (e.g. if plugin Z relies on Y v0.2 and Y is not backwards compatible), but this is the same approach pip takes to handle the version dilemma.
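
To spell that scenario out with a hypothetical checker (the package names, versions and requirements structure are made up for the example):

    # Following the example above: X 2.0 needs Y >= 0.3, but only Y 0.2 is installed.
    installed = {"Y": "0.2"}
    requirements = {("X", "2.0"): {"Y": "0.3"}}  # (package, version) -> minimum dep versions

    def to_tuple(version):
        return tuple(int(p) for p in version.split('.'))

    def missing_deps(package, version):
        """Return dependencies that are absent or older than the required minimum."""
        needed = requirements.get((package, version), {})
        return {dep: minimum for dep, minimum in needed.items()
                if dep not in installed or to_tuple(installed[dep]) < to_tuple(minimum)}

    print(missing_deps("X", "2.0"))  # {'Y': '0.3'} -> Y must be upgraded along with X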

schlamar commented 11 years ago

And if you are going to support dependencies, you might want to consider supporting packages from PyPI, too. (Not sure how this could be done consistently, though)

sublimator commented 11 years ago

Yeah, it would be nice to be able to detect dependency version conflicts, even if nothing can realistically be done about it beyond reporting to the user and/or auto disabling certain packages.

FichteFoll commented 11 years ago

I don't even know of a package that depends on another one, though I'd definitely like being able to add and specify dependencies (see sublime_lib). This has several versioning problems and should be discussed beforehand. I think PC could handle cases where an update would "break" because of a dependency, but when could this possibly happen? I mean, if you require another package you should inform the user in the first place. And PC will be used to download dependencies, so they should be up to date anyway, and there is no point in pushing a new version that requires a yet-to-be-released dependency. Still, a dependency could be updated and break the package the other way around.

However, I think this discussion belongs to #166.


Regarding the other things: Looks good to me. I initially wanted to separate the simple and the complex package constructs with "repositories" and "packages", but having everything in one struct works quite well. And I certainly like the tags feature, because that wouldn't require me to update my JSON repo file for every release. The "details" key was irritating at first; I expected it to be something like "repo", but that terminology is a) not commonly used (it is just wrong) and b) "details" offers much more flexibility in case other details-defining things I can't even think of right now get added later.

For cross-references to other channels, a "channels" key should be added and used in package_control_channel so that all the JSON files aren't lumped together with the repos. Furthermore, you could include other channels from a channel, and so on. And finally, the "renamed_packages" key must be taken care of: either keep it as it is or add a "previous_names": [] key to the packages to make stuff even more centralized.

schlamar commented 11 years ago

add a "previous_names": [] key to the packages to make stuff even more centralized.

:+1: This would finally result in a clean channel structure that is easy to maintain (just a collection of *.json locations).

schlamar commented 11 years ago

@wbond What would be the terminology to specify an ST3-only package (or different versions for ST2 and ST3)?

FichteFoll commented 11 years ago

@schlamar The "sublime_text" key can be omitted for ST2-only, but I guess "sublime_text": "<3000" would also be correct here.

{
  "schema_version": "2.0",
  "packages": [
    {
      "name": "SublimeLinter",
      "description": "Inline lint highlighting for the Sublime Text 2 editor",
      "author": "Kronuz, Aparajita Fishman, Jake Swartwood",
      "homepage": "http://github.com/SublimeLinter/SublimeLinter",
      "releases": [
        {
          "version": "1.6.12",
          "url": "https://nodeload.github.com/SublimeLinter/SublimeLinter/zip/v1.6.12",
          "date": "2013-01-09 21:58:08"
        },
        {
          "sublime_text": ">3000",
          "version": "2.0.0",
          "url": "https://nodeload.github.com/SublimeLinter/SublimeLinter/zip/v2.0.0",
          "date": "2013-02-09 13:02:21"
        }
      ]
    }
  ]
}
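
To make the selection logic concrete, here is a rough sketch of how a client could pick the right release from an entry like that. The field names follow the proposed 2.0 schema, the selector handling is deliberately simplified, and treating a missing sublime_text key as ST2-only follows the comment above:

    def selector_matches(selector, build):
        """Simplified matching for '*', '>NNNN' and '<NNNN' selectors."""
        if selector is None:
            selector = '<3000'  # a missing key means ST2-only, per the comment above
        if selector == '*':
            return True
        op, number = selector[0], int(selector[1:])
        return build > number if op == '>' else build < number

    def pick_release(package, build):
        """Return the newest matching release (date-based tie-breaking for brevity)."""
        candidates = [r for r in package['releases']
                      if selector_matches(r.get('sublime_text'), build)]
        return max(candidates, key=lambda r: r['date']) if candidates else None

    sublimelinter = {
        'releases': [
            {'version': '1.6.12', 'date': '2013-01-09 21:58:08'},
            {'sublime_text': '>3000', 'version': '2.0.0', 'date': '2013-02-09 13:02:21'},
        ]
    }
    print(pick_release(sublimelinter, 3012)['version'])  # 2.0.0 on an ST3 build
    print(pick_release(sublimelinter, 2221)['version'])  # 1.6.12 on an ST2 build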