deizel opened this issue 9 years ago
There is in fact a plan to automate this using my team's test server, but it's currently held up by #3. If we make changes to the scraping process and push our changes to this repo, then any place where that scraping process is already automated (say via cron) will essentially break, since its upstream remote will have changed: it will be trying to push new releases without having first pulled the upstream changes we added. Think of this like an "automatic update" function that is currently missing.
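The failure mode described above (a long-lived cron clone pushing without first pulling) can be demonstrated end-to-end with throwaway repos. This is only an illustrative sketch, not this project's actual tooling; the repo names, file names, and the fetch-then-rebase ordering are my own:

```shell
#!/bin/sh
# Demo: a cron clone must sync with upstream before pushing a new release.
set -e
export GIT_AUTHOR_NAME=demo GIT_AUTHOR_EMAIL=demo@example.com \
       GIT_COMMITTER_NAME=demo GIT_COMMITTER_EMAIL=demo@example.com

tmp=$(mktemp -d)

# Seed an "upstream" repo with one commit, then make it the shared remote.
git init -q -b master "$tmp/seed"
( cd "$tmp/seed" && echo v1 > release.txt && git add . && git commit -qm "release 1" )
git clone -q --bare "$tmp/seed" "$tmp/upstream.git"

# The cron box's long-lived clone.
git clone -q "$tmp/upstream.git" "$tmp/cron"

# Meanwhile, a developer pushes a scraping-process change upstream.
git clone -q "$tmp/upstream.git" "$tmp/dev"
( cd "$tmp/dev" && echo tweak > scraper.cfg && git add . \
  && git commit -qm "change scraper" && git push -q origin master )

# The cron job: pull (rebase) first, then commit and push the new release.
cd "$tmp/cron"
git fetch -q origin
git rebase -q origin/master   # without this step, the push below is rejected
echo v2 > release.txt && git add . && git commit -qm "release 2"
git push -q origin master
echo "cron push succeeded"
```

Without the `git fetch` / `git rebase` pair, the final `git push` is rejected as a non-fast-forward, which is exactly how an automated scraper would "break" once this repo's history moves underneath it.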
(One alternative would be a `loadsys/puphpet-release-composer-installer` that does most of `bin/scrape-release.sh` "on the fly" directly when you call `composer require`. That would be extremely un-composer-like, of course.)

The puphpet/puphpet project changes so rapidly that this is bound to be an ongoing issue. That's why this project is designed to tag a new minor version for every scraping run: the idea is that a consuming project must manually verify that `1.57.0` still works for them when they've been stable on `1.23.0`. (We're using minor semver versions to represent the fact that upstream PuPHPet changes are frequently and unpredictably backwards-incompatible.)
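Under that scheme, a consuming project that wants only safe updates would pin the minor and accept patch releases via a tilde constraint. A minimal `composer.json` fragment (the version number is just the one from this discussion):

```json
{
    "require-dev": {
        "loadsys/puphpet-release": "~1.23.0"
    }
}
```

`~1.23.0` resolves to `>=1.23.0 <1.24.0`, so Composer will pick up patch releases but never cross a (potentially breaking) minor boundary without a manual constraint bump.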
I apologize for forcing you to resort to using a fork, but until #3 is implemented and tested, I'm afraid that's probably going to be your best bet. :sweat:
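For anyone else stuck on a fork in the meantime, the usual Composer workaround is a VCS repository entry pointing at the fork. A sketch (the fork URL and branch are placeholders, and this assumes the fork keeps the same `name` in its `composer.json`):

```json
{
    "repositories": [
        { "type": "vcs", "url": "https://github.com/YOUR_USER/puphpet-release" }
    ],
    "require-dev": {
        "loadsys/puphpet-release": "dev-master"
    }
}
```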
As suggested in #9: when automatic updates are eventually online, we should update the README to not target a specific version (since Composer will apparently "do the right thing" with the versions for us).
README.md:
-$ composer require --dev loadsys/puphpet-release:~0.0.4
+$ composer require --dev loadsys/puphpet-release
If you use a fresh `config.yaml` from the PuPHPet website, the VM doesn't start on `vagrant up`. Seems this repo has fallen out of sync with the main website and it's probably time to update it again.
I've created a new release on my fork so that I can use that for the time being:
Is there a plan to automate this, or do you think #8 is a better option?