att / rcloud

Collaborative data analysis and visualization
http://rcloud.social
MIT License

build release tarballs with artifacts, delete them from repo #1728

Open · gordonwoodhull opened this issue 9 years ago

gordonwoodhull commented 9 years ago

there is a lot of building that has to happen when someone installs RCloud from source.

the only thing folks are avoiding by having the artifacts checked in is running npm install in two directories.

couldn't we automate the npm install and avoid this? or, since we're only using uglify in order to validate, use some other linting tool that doesn't require npm?

gordonwoodhull commented 9 years ago

(motivation is that merging or overwriting artifacts is extremely annoying to developers)

s-u commented 9 years ago

+1

gordonwoodhull commented 8 years ago

Proposed solution. @s-u, @prateek05, is this acceptable?

  1. make npm another prerequisite (currently we only require it if you're "hacking on the code", but since we require R and a few other things, it seems reasonable enough to require node).
  2. de-duplicate package.json, move it to the root, and make it somewhat descriptive of RCloud (even though we still won't actually publish RCloud to npm, of course)
  3. run npm install in the root directory when bootstrapR.sh is run
  4. add a dependency on a plain JavaScript parser like acorn or esprima, for validation with decent error messages
  5. validate before uglifying, pointing to the root directory's node_modules (see the sketch after this list)
  6. consider disabling uglify.
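
A rough sketch of what steps 3-5 could look like; the paths and the minified output name are assumptions, and node's built-in --check stands in here for a parser like acorn or esprima:

    # hypothetical sketch, not the actual build scripts; run from the repo root
    npm install                      # step 3: dev dependencies land in ./node_modules

    # steps 4-5: syntax-check every source file before minifying
    for f in htdocs/js/*.js; do
      node --check "$f" || exit 1
    done

    # skipped entirely if we disable uglify per step 6; output name is hypothetical
    node_modules/.bin/uglifyjs htdocs/js/*.js -o htdocs/js/rcloud.min.js
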
s-u commented 8 years ago

I'm somewhat hesitant to make nodejs + npm a requirement for everyone since they're not directly available, e.g. on OS X. I think we should really keep the distinction between hacking on it - where it's ok to require a lot more and to work off the bare git checkout - and installing it, which doesn't need either. So if we just provide automated tarballs (like we do for packages on RForge.net) which contain the artifacts, then I think it would be ok.

gordonwoodhull commented 8 years ago

Ah - so you're suggesting distributing through RForge rather than GitHub?

s-u commented 8 years ago

I don't think GH has a way to build distribution tarballs (since you don't have a VM to install things on). I didn't check, though, whether there is a way to "push" distribution tarballs that have been built elsewhere - if there is, then we could do that; otherwise either our RCloud website or RForge.net would be ok ...

gordonwoodhull commented 8 years ago

Okay, now I've lost interest because this is stuff I don't know how to do.

s-u commented 8 years ago

I think we already have what we need - just pack up the result after build.sh ...

gordonwoodhull commented 8 years ago

GitHub does have a way to push distribution tarballs - it's called GitHub Releases.

https://help.github.com/articles/creating-releases/

Basically each tag is automatically a Release and then you have the option to annotate it further with release notes and binaries.

So I think all we would need is a script that runs build.sh and then packs what we want to include into a tarball.
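
For concreteness, the packing script could be tiny; this is only a sketch, and the tarball name and exclusion are assumptions:

    # hypothetical release-packing sketch: build, then tar up the tree minus git metadata
    sh build.sh
    VERSION=$(git describe --tags)
    tar --exclude='.git' -czf "../rcloud-${VERSION}.tar.gz" .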

s-u commented 8 years ago

I guess what I was looking for was

https://developer.github.com/v3/repos/releases/#upload-a-release-asset

so we need a script that packs up the release and then pushes the asset for it. That makes sense - I may tackle this anyway for RForge.net, to push package tarballs, since the GH automated way is only wreaking havoc.
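
A sketch of the push side with curl, assuming a Release already exists for the tag and $GITHUB_TOKEN has access to the repo; RELEASE_ID would come from GET /repos/att/rcloud/releases/tags/<tag>, and the file name is an assumption:

    # hypothetical upload of the built tarball as a release asset
    curl -H "Authorization: token $GITHUB_TOKEN" \
         -H "Content-Type: application/gzip" \
         --data-binary @"rcloud-${VERSION}.tar.gz" \
         "https://uploads.github.com/repos/att/rcloud/releases/${RELEASE_ID}/assets?name=rcloud-${VERSION}.tar.gz"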

gordonwoodhull commented 8 years ago

Ah yes, even better if it's automated!

gordonwoodhull commented 8 years ago

Assigning to @s-u for creating a script to pack up the right resources, and delaying to the next release so that we have a full release cycle to deal with any consequences.

gordonwoodhull commented 8 years ago

@s-u, is this the same thing we need in order to host rcloud.support on RForge, so that packages that depend on it, like rcleaflet, can be built?

gordonwoodhull commented 6 years ago

Maybe in 2.0 :laughing:

gordonwoodhull commented 5 years ago

Hilarious. I am creating milestone 2.2 to put this in. Again, we need to think about this at the beginning of a milestone, not the end.

Trouble is, at the beginning of each milestone we're backlogged on everything else we weren't doing because of the last one.

gordonwoodhull commented 3 years ago

a.k.a. "RCloud as a package"

I'm making this my priority for the next release. This opens up using ocaps in command-line R and RStudio, and conceivably Shiny and Jupyter too, if we can run the Rserve message loop inside their web servers the way we run the Shiny message loop in Rserve's idle callback.

s-u commented 3 years ago

Can you clarify? The original post was about the repo, but there is also the thing we used in the Acumos examples, which is an R package that allows you to run deployed RCloud (rcloud.runtime, was it?), but those are separate in a way - you can have a clean release without it being a package. But creating the package is easier if we have a clean separation of "sources" and deployed artefacts...

gordonwoodhull commented 3 years ago

Yes, you’re right, this issue is a prereq for “RCloud as a package” but we don’t have an issue for that despite plenty of discussion.

I want RCloud as a package because it makes it possible to use ocaps in command-line R and RStudio. So I’m going to work on this issue.

I never saw the Acumos package but I think we are talking about the same thing.

I’d also like to do the other stuff marked for 2.3 (reconnect and compute nodes) but I’m not sure how much time I will have, so I’m making this my priority.