grpc / grpc

The C-based gRPC (C++, Python, Ruby, Objective-C, PHP, C#)
https://grpc.io
Apache License 2.0

Consider a "node-grpc" Ubuntu package? #4128

Closed: jgeewax closed this issue 8 years ago

jgeewax commented 8 years ago

Related to #1165

This would make the Node installation process look like:

$ sudo apt-add-repository <some ppa>  # If we make this line go away that's a bonus.
$ sudo apt-get install node-grpc
$ node
> var grpc = require('grpc');
murgatroid99 commented 8 years ago

This wouldn't really make sense. Node packages are locally installed relative to the package using them, and there's no real gain to pre-installing it globally. Plus, our Node module is self-contained: it has no pre-install dependencies, unlike all of the other C-based languages.

Installing the Node module is already simpler than the steps you gave:

$ npm install grpc
$ node
> var grpc = require('grpc');
jgeewax commented 8 years ago

our Node module is self-contained

$ npm install grpc
$ node
var grpc = require('grpc');

I just tried this on a fresh VM running Ubuntu 15.10 and it failed, so I assume some packages are missing:

$ npm install -g grpc
> grpc@0.11.1 install /usr/local/lib/node_modules/grpc
> node-gyp rebuild

make: Entering directory '/usr/local/lib/node_modules/grpc/build'
  CXX(target) Release/obj.target/grpc/ext/byte_buffer.o
  CXX(target) Release/obj.target/grpc/ext/call.o
../ext/call.cc: In static member function ‘static Nan::NAN_METHOD_RETURN_TYPE grpc::node::Call::New(Nan::NAN_METHOD_ARGS_TYPE)’:
../ext/call.cc:562:36: error: ‘GRPC_PROPAGATE_DEFAULTS’ was not declared in this scope
       gpr_uint32 propagate_flags = GRPC_PROPAGATE_DEFAULTS;
                                    ^
../ext/call.cc:583:67: error: cannot convert ‘grpc_call*’ to ‘grpc_completion_queue*’ for argument ‘2’ to ‘grpc_call* grpc_channel_create_call(grpc_channel*, grpc_completion_queue*, const char*, const char*, gpr_timespec)’
             *host_override, MillisecondsToTimespec(deadline), NULL);
                                                                   ^
In file included from ../ext/call.cc:41:0:
/usr/include/grpc/grpc.h:61:16: note: class type ‘grpc_call’ is incomplete
 typedef struct grpc_call grpc_call;
                ^
../ext/call.cc:588:57: error: cannot convert ‘grpc_call*’ to ‘grpc_completion_queue*’ for argument ‘2’ to ‘grpc_call* grpc_channel_create_call(grpc_channel*, grpc_completion_queue*, const char*, const char*, gpr_timespec)’
             NULL, MillisecondsToTimespec(deadline), NULL);
                                                         ^
In file included from ../ext/call.cc:41:0:
/usr/include/grpc/grpc.h:61:16: note: class type ‘grpc_call’ is incomplete
 typedef struct grpc_call grpc_call;
                ^
../ext/call.cc: In static member function ‘static Nan::NAN_METHOD_RETURN_TYPE grpc::node::Call::StartBatch(Nan::NAN_METHOD_ARGS_TYPE)’:
../ext/call.cc:640:12: error: ‘__gnu_cxx::__alloc_traits<std::allocator<grpc_op> >::value_type {aka struct grpc_op}’ has no member named ‘reserved’
     ops[i].reserved = NULL;
            ^
../ext/call.cc:677:58: error: too many arguments to function ‘grpc_call_error grpc_call_start_batch(grpc_call*, const grpc_op*, size_t, void*)’
           callback, op_vector.release(), resources), NULL);
                                                          ^
In file included from ../ext/call.cc:41:0:
/usr/include/grpc/grpc.h:420:17: note: declared here
 grpc_call_error grpc_call_start_batch(grpc_call *call, const grpc_op *ops,
                 ^
../ext/call.cc: In static member function ‘static Nan::NAN_METHOD_RETURN_TYPE grpc::node::Call::Cancel(Nan::NAN_METHOD_ARGS_TYPE)’:
../ext/call.cc:689:68: error: too many arguments to function ‘grpc_call_error grpc_call_cancel(grpc_call*)’
   grpc_call_error error = grpc_call_cancel(call->wrapped_call, NULL);
                                                                    ^
In file included from ../ext/call.cc:41:0:
/usr/include/grpc/grpc.h:448:17: note: declared here
 grpc_call_error grpc_call_cancel(grpc_call *call);
                 ^
../ext/call.cc: In static member function ‘static Nan::NAN_METHOD_RETURN_TYPE grpc::node::Call::CancelWithStatus(Nan::NAN_METHOD_ARGS_TYPE)’:
../ext/call.cc:712:72: error: too many arguments to function ‘grpc_call_error grpc_call_cancel_with_status(grpc_call*, grpc_status_code, const char*)’
   grpc_call_cancel_with_status(call->wrapped_call, code, *details, NULL);
                                                                        ^
In file included from ../ext/call.cc:41:0:
/usr/include/grpc/grpc.h:456:17: note: declared here
 grpc_call_error grpc_call_cancel_with_status(grpc_call *call,
                 ^
../ext/call.cc: In static member function ‘static Nan::NAN_METHOD_RETURN_TYPE grpc::node::Call::GetPeer(Nan::NAN_METHOD_ARGS_TYPE)’:
../ext/call.cc:721:53: error: ‘grpc_call_get_peer’ was not declared in this scope
   char *peer = grpc_call_get_peer(call->wrapped_call);
                                                     ^
grpc.target.mk:106: recipe for target 'Release/obj.target/grpc/ext/call.o' failed
make: *** [Release/obj.target/grpc/ext/call.o] Error 1
make: Leaving directory '/usr/local/lib/node_modules/grpc/build'

(I'll open a separate issue for that).

This wouldn't really make sense.

If I just want to use a library that happens to use gRPC via Node, it'd be nice to have a single-line install (even if that means it's global for the system) -- so if that single line is npm install -g grpc, then I'm happy. My attempt at that just failed, though...

If it doesn't make sense, can you help me understand why all these other packages exist? I thought the idea was that packages sometimes depend on others, which would explain why these exist:

$ apt-cache search node | grep node
... snip ...
node-sqlite3 - asynchronous, non-blocking SQLite3 bindings for Node.js
node-srs - spatial reference library for Node.js
node-static - RFC2616 compliant HTTP static-file server module with caching
node-step - simple control-flow library for Node
node-strip-json-comments - Node.js module to strip comments from JSON
node-stylus - Robust, expressive, and feature-rich CSS superset - Node.js module
node-superagent - HTTP client request with chainable API - Node.js module
node-supertest - superagent driven library for testing HTTP servers
node-tap - Test-Anything-Protocol module for Node.js
node-tar - read and write portable tar archives module for Node.js
node-temp - Temporary files, directories, and streams for Node.js
node-through2 - Make a stream.Transform out of a function - Node.js module
node-tilejson - tile source backend for online tile sources
node-tilelive - Interface for tile backends modules for Node.js
node-tinycolor - No-fuzz, barebone, zero muppetry color module for Node.js
node-tmp - Temporary file and directory creator for Node.js
node-topcube - spawn a child webkit window from Node.js
node-transformers - String and data transformations using templates and compilers
node-traverse - recursively traverse objects in Node.js
node-tunnel-agent - HTTP proxy tunneling agent module for Node.js
node-type-is - infer the content type from request
node-typedarray-to-buffer - JavaScript utility converting TypedArray to buffer without copy
node-typescript - superset of JavaScript that compiles to clean JavaScript output
node-uglify - JavaScript parser, mangler/compressor and beautifier toolkit
node-underscore - JavaScript's functional programming helper library - NodeJS
node-underscore.logger - cross-browser and Node empowered logging - Node module
node-unorm - Common JS Unicode Normalizer (Node.js)
node-util - NodeJS/JavaScript util module
node-utilities - classic collection of JavaScript utilities
node-utils-merge - provides a merge utility function
node-validator - Javascript string validation and sanitization for Node.js
node-vary - manage the Vary header of a HTTP response - Node.js module
node-vhost - connect middleware for domain request matching - Node.js module
node-vows - asynchronous BDD & continuous integration for Node
node-webfinger - Client library for Host Meta (RFC 6415) and Webfinger
node-websocket - WebSocket implementation for NodeJS
node-websocket-driver - WebSocket protocol handler with pluggable I/O for Node.js
node-which - Cross-platform 'which' module for Node.js
node-with - compile-time `with` statement - Node.js module
node-wordwrap - word wrapping library for NodeJS
node-ws - RFC-6455 WebSocket implementation module for Node.js
node-xml2js - simple XML to JavaScript object converter - Node.js module
node-xmlhttprequest - XMLHttpRequest for Node
node-yajsml - Yet another (Common)JS module loader
node-yamlish - Parser/encoder for the YAMLish format for Node.js
... snip ...
murgatroid99 commented 8 years ago

I just realized, our Node module does in fact currently depend on the gRPC C core being installed. What I was describing is actually in the next release. At that point, npm install grpc will be the single-line install.

I don't actually know why those other packages exist. In Node, package dependencies are stored in the package.json file, and are resolved at installation time. I see no benefit in also having Ubuntu packages.
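(For reference, the package.json mechanism mentioned here looks roughly like the fragment below; the grpc version is taken from the build log above, and the package name is illustrative. Running `npm install` in this directory downloads grpc, and its own dependencies recursively, into ./node_modules.)

```json
{
  "name": "my-app",
  "dependencies": {
    "grpc": "^0.11.1"
  }
}
```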

jgeewax commented 8 years ago

I think maybe we should figure this one out before deciding not to join the club...

/cc @stephenplusplus @callmehiphop -- any idea why Ubuntu+Node packages exist? Is it the same reason python-crypto exists (pip install PyCrypto fails without lots of -dev packages installed)?

stephenplusplus commented 8 years ago

I'm not too familiar with that pattern, so a quick ping to @passy for any insight on package manager crossover he might have.

passy commented 8 years ago

So there's definitely some version conflict between the wrapper and the underlying library in your output:

../ext/call.cc:689:68: error: too many arguments to function ‘grpc_call_error grpc_call_cancel(grpc_call*)’

But to the actual question: those packages exist because, from the perspective of a system administrator or ops person, there shouldn't be any difference between something like tmux and yo. Whether the tool is written in C or in Node really shouldn't matter.

An Ubuntu LTS release comes with the guarantee that all packages included in it are stable and tested to work well together (at least to some degree), whereas npm gives you the guarantee that you get the bleeding edge. There's no right or wrong choice; it's just a question of requirements.

I hope this helps. :)

murgatroid99 commented 8 years ago

I don't think it's quite accurate to say that "npm gives you the guarantee that you get the bleeding edge." npm gives you the version that you ask for. Node packages typically depend on package versions compatible with a given version, and npm guarantees that you get the newest version in that range.

And the point about tools makes sense, but that only really applies to command-line tools. Several of those packages look like libraries, where a global install is considerably less useful. It could be because they have build-time dependencies on non-npm packages, but some of them, like underscore, are JavaScript-only.
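(The version-range behavior described in this comment can be made concrete. Under npm's semver rules, the ranges below behave as annotated; the grpc version is the one from the build log, and for 0.x versions the caret and tilde happen to coincide.)

```
"grpc": "^0.11.1"   matches >=0.11.1 <0.12.0   (caret: for 0.x, stays within the minor)
"grpc": "~0.11.1"   matches >=0.11.1 <0.12.0   (tilde: stays within the minor)
"grpc": "0.11.1"    matches exactly 0.11.1
```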

passy commented 8 years ago

I don't think it's quite accurate to say that "npm gives you the guarantee that you get the bleeding edge." npm gives you the version that you ask for. Node packages typically depend on package versions compatible with a given version, and npm guarantees that you get the newest version in that range.

Yes, that's right. If everyone adheres to semver, that's correct. With an LTS release, however, you don't need to trust the individual authors to follow the rules, because once the deps are frozen they stay that way. Like a shrinkwrap for everything.

I agree that stand-alone tools are one of the most obvious places where this is useful, but if you wanted to run something like Ghost on an Ubuntu box this can be just as helpful - and I've used the same paradigm for deploying Python applications, where all dependencies came through dpkg and just the app itself did not.

murgatroid99 commented 8 years ago

You may be interested to know that npm has a shrinkwrap command that freezes dependencies exactly as you describe.

I may be missing something here, because I still don't understand what the benefit is of pre-installing pure JS dependencies globally. When you run npm install, it recursively downloads dependencies into relative directories. If you have globally installed libraries that you want to use, you have to instead run npm install --link to explicitly indicate that any compatible globally installed packages should be used.

passy commented 8 years ago

You may be interested to know that npm has a shrinkwrap command that freezes dependencies exactly as you describe.

Check my last sentence of the first paragraph. :)

The packages, at least on Ubuntu, install into a global flattened namespace at /usr/lib/nodejs/. The main reason here is really to avoid having another package manager interfere with your packages if you want to install them globally. On a managed Linux box, nothing should ever touch anything outside of /home that isn't the One Package Manager. This won't work for everyone, and that's totally fine. npm works well for most end users and in a bunch of deployment scenarios. Some people prefer managing their dependencies through the system package manager, and that's okay, too. Having those packages only helps the adoption of Node in those environments.

murgatroid99 commented 8 years ago

The main reason here is really to avoid having another package manager interfere with your packages if you want to install them globally

This is the part I'm still not quite getting: why would you want to install an npm library globally in the first place?

There's also another more technical problem with doing this for gRPC specifically: as mentioned above, we plan to have a single install step (npm install) for the Node package. This requires us to distribute the C code with the Node code. But we already have a separate package with just the C code. In Debian, at least, we are apparently not allowed to distribute the same code in two different packages. I expect Ubuntu has the same rule, so we will have to choose between this package, and a simple install with npm.
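(For context on "distribute the C code with the Node code": with node-gyp this is typically done by vendoring the core sources and pointing binding.gyp at them, roughly as below. The file names and layout here are hypothetical illustrations, not the actual grpc repository layout.)

```
{
  "targets": [
    {
      "target_name": "grpc_node",
      # Vendored gRPC C core compiled together with the Node extension,
      # so `npm install` needs no pre-installed system library.
      "sources": [
        "ext/call.cc",
        "deps/grpc/src/core/surface/call.c"
      ],
      "include_dirs": [ "deps/grpc/include" ]
    }
  ]
}
```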

passy commented 8 years ago

The global part doesn't matter if you have a container, or even a full server with a single app on it. There's effectively no difference between a local and a global install at that point.

You have some valid arguments against distributing it that way, and I don't think it's a must. People who deploy software this way are probably used to building their own packages for this purpose, as the distribution will never have all the required packages. It's a matter of taste and practices that are already in place. If you work in a shop that has deployed software like this for ages, it's difficult to make a case for something like npm.

ctiller commented 8 years ago

Agreed, we should leave this possibility open. I don't think it'll be the common or recommended path, but it does allow folks to keep their existing practices in some cases - and it opens the door for other packages written in Node and packaged for Debian to depend on us.

murgatroid99 commented 8 years ago

I don't think this needs to be a GA requirement. I have seen nobody else asking for this.

jgeewax commented 8 years ago

I agree. npm install grpc now "just works" on most standard systems (i.e., Ubuntu), so I'm unsure whether we need this at all.

jtattermusch commented 8 years ago

We certainly don't have any plans to maintain such a package in the near future. Closing.