gordonwoodhull opened this issue 9 years ago
(motivation is that merging or overwriting artifacts is extremely annoying to developers)
+1
Proposed solution. @s-u, @prateek05, is this acceptable?
- make `npm` another prerequisite (currently we only require it if you're "hacking on the code", but since we require R and a few other things, it seems reasonable enough to require `node`)
- take `package.json`, move it to the root, and make it somewhat descriptive of RCloud (even though we still won't actually publish RCloud to npm, of course)
- run `npm install` in the root directory when `bootstrapR.sh` is run (sketched below)
- add `node_modules` to `.gitignore`
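A minimal sketch of the `bootstrapR.sh` change, assuming `npm` is already on the `PATH` and `package.json` sits at the repo root (both are assumptions based on the list above, not the actual script):

```sh
#!/bin/sh
# Run from the RCloud root: fetch the JS build tooling declared in package.json.
if ! command -v npm >/dev/null 2>&1; then
    echo "npm is required to build RCloud's JavaScript assets" >&2
    exit 1
fi
npm install   # populates node_modules/, which stays out of git
```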
I'm somewhat hesitant to make `nodejs` + `npm` a requirement for everyone since it's not directly available, e.g., on OS X. I think we should really keep the distinction between hacking on it - where it's ok to require a lot more and work off the bare git checkout - and installing it, which doesn't need either. So if we just provide automated tarballs (like we do for packages on RForge.net) which contain the artifacts, then I think it would be ok.
Ah - so you're suggesting distributing through RForge rather than GitHub?
I don't think GH has a way to build distribution tarballs (since you don't have a VM to install things on). I didn't check, though, whether there is a way to "push" distribution tarballs that have been built elsewhere - if there is, then we could do that; otherwise either our RCloud website or RForge.net would be ok ...
Okay, now I've lost interest because this is stuff I don't know how to do.
I think we already have what we need - just pack up the result after `build.sh` ...
GitHub does have a way to push distribution tarballs - it's called GitHub Releases.
https://help.github.com/articles/creating-releases/
Basically each tag is automatically a Release and then you have the option to annotate it further with release notes and binaries.
So I think all we would need is a script that runs `build.sh` and then packs what we want to ship into a tarball.
I guess what I was looking for was
https://developer.github.com/v3/repos/releases/#upload-a-release-asset
so we need a script that packs up the release and then pushes the asset for it. That makes sense - I may tackle this anyway for RForge.net to push package tarballs, since the GH automated way is only wreaking havoc.
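For illustration only, a hedged sketch of such a script, using the upload-a-release-asset endpoint linked above. It assumes `build.sh` leaves the deployable tree under `htdocs/`; the packed directories, `OWNER`, `REPO`, `RELEASE_ID`, and `GITHUB_TOKEN` are placeholders, not the actual layout:

```sh
#!/bin/sh
set -e
./build.sh                                   # rebuild the minified artifacts
VERSION=$(git describe --tags)
TARBALL="rcloud-${VERSION}.tar.gz"
tar czf "$TARBALL" htdocs conf scripts       # directories to ship are a guess
# Push the tarball as an asset of an existing release (release id looked up separately).
curl -sS -H "Authorization: token $GITHUB_TOKEN" \
     -H "Content-Type: application/gzip" \
     --data-binary @"$TARBALL" \
     "https://uploads.github.com/repos/OWNER/REPO/releases/RELEASE_ID/assets?name=$TARBALL"
```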
Ah yes, even better if it's automated!
Assigning to @s-u for creating a script to pack up the right resources, delaying to the next release so that we have a release cycle to deal with the consequences, if any.
@s-u, is this the same thing we need in order to host rcloud.support on RForge, so that packages that depend on it, like rcleaflet, can be built?
Maybe in 2.0 :laughing:
Hilarious. I am creating milestone 2.2 to put this in. Again, we need to think about this at the beginning of a milestone, not the end.
Trouble is, at the beginning of each milestone we're backlogged on everything else we weren't doing because of the last one.
a.k.a. "RCloud as a package"
I'm making this my priority for the next release. This opens up using ocaps in command-line R and RStudio, and conceivably Shiny and Jupyter too — if we can run the Rserve message loop inside of their web servers the way we are running the Shiny message loop in Rserve's idle callback.
Can you clarify? The original post was about the repo, but there is also the thing we used in the Acumos examples, which is an R package that allows you to run deployed RCloud (rcloud.runtime was it?) - but those are separate in a way: you can have a clean release without it being a package. Creating the package is easier, though, if we have a clean separation of "sources" and deployed artefacts...
Yes, you’re right, this issue is a prereq for “RCloud as a package” but we don’t have an issue for that despite plenty of discussion.
I want RCloud as a package because it makes it possible to use ocaps in command-line R and RStudio. So I’m going to work on this issue.
I never saw the Acumos package but I think we are talking about the same thing.
I’d also like to do the other stuff marked for 2.3 (reconnect and compute nodes) but I’m not sure how much time I will have, so I’m making this my priority.
There is a lot of build that has to happen when someone installs RCloud from source.
The only thing folks are avoiding by having the artifacts checked in is installing npm in two directories.
Couldn't we automate the npm install and avoid this? Or, since we're only using uglify in order to validate, use some other linting tool that doesn't require npm?
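One reading of "automate the install" is simply scripting the `npm install` runs; a rough sketch, assuming the two directories are the repo root and `htdocs` (the latter is a guess):

```sh
#!/bin/sh
# Run npm install wherever a package.json is present instead of checking in artifacts.
for d in . htdocs; do
    if [ -f "$d/package.json" ]; then
        (cd "$d" && npm install)
    fi
done
```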