brandonros closed this issue 6 years ago
Hi, the idea of this library is to bridge V8 calls for deno. Currently the prototype is written in Go; this repository is a personal experiment to make it compatible with Rust and to do some experiments and benchmarks.
What real world use does it have? Why would you ever want to execute snippets of Javascript in another language? This isn’t bindings to call Rust from Javascript, is it?
Check the project description in the deno repo; some users think it might work as a NodeJS alternative. The JS runs on V8 while Go (or Rust) handles the internal system/network/OS calls. This binding won't let you call Rust directly, only send specific Protocol Buffers messages.
I checked out the deno repo and it looks promising, but I'm still struggling to understand the concept for this package.
I understand that without the node.js counterpart, the v8 engine is just... a giant REPL for Javascript? No filesystem access, no networking, etc.
With these Rust bindings, what's the goal? To be able to make network/filesystem calls by passing Protocol Buffer messages? Could you show an example?
That's right, V8 is just a giant REPL in that case. For example, I implemented a "net" module for Deno using Go here: https://github.com/ry/deno/pull/229 It would be possible to keep all the Typescript code and switch to another backend language, like Rust. This repository is a very minimal one that wraps access to V8 and exposes only two methods (recv and worker_send). I'm working on a separate repo that's an experimental rewrite of Deno itself; I will share it in the next few days. The entrypoint file looks like this:
```rust
fn main() {
    let mut handler = v8worker2::new_handler();
    let receiver = handler.receiver.clone();
    handler.init();
    let worker = handler.add_worker(0);
    thread::spawn(move || {
        for msg in receiver {
            cb(&worker, msg);
        }
    });
    start_deno(worker);
}
```
`start_deno` will load all the JS/TS files; there's a process in the original deno repository that merges everything into a single file:
```rust
fn start(_worker: &mut worker::Worker) {
    // Load main.js:
    let main_js_filename = String::from("main.js");
    let mut main_js = File::open(main_js_filename).expect("file not found");
    let mut main_js_contents = String::new();
    main_js.read_to_string(&mut main_js_contents).expect("i/o error");
    _worker.load("/main.js".to_string(), main_js_contents.clone());

    // Call denoMain
    _worker.load("/deno_main.js".to_string(), "denoMain();".to_string());

    // Load main.map:
    let main_map_filename = String::from("main.map");
    let mut main_map = File::open(main_map_filename).expect("file not found");
    let mut main_map_contents = String::new();
    main_map.read_to_string(&mut main_map_contents).expect("i/o error");

    // Get current dir
    let cwd = env::current_dir().unwrap();
    let cwd_str = cwd.into_os_string().into_string().unwrap();

    // Prepare start message:
    let mut _message = msg::Msg::new();
    _message.command = msg::Msg_Command::START;
    _message.start_cwd = cwd_str;
    _message.start_debug_flag = true;
    _message.start_main_js = main_js_contents;
    _message.start_main_map = main_map_contents;
    _message.start_argv.push("readfile.ts".to_string());
    pub_msg(_worker, "start", &_message);
}
```
`pub_msg` is responsible for marshaling and sending these messages to the V8 worker (using the rust-v8worker2 implementation):
```rust
pub fn pub_msg(_worker: &mut v8worker2::worker::Worker, _channel: &str, _message: &msg::Msg) {
    let _serialized = _message.write_to_bytes().unwrap();
    let mut _base_msg = msg::BaseMsg::new();
    _base_msg.channel = _channel.to_string();
    _base_msg.payload = _serialized;
    let _serialized_base_msg = _base_msg.write_to_bytes().unwrap();
    _worker.send_bytes(&_serialized_base_msg);
}
```
At the same time, in `main` we spawn a thread that listens for messages. I'm currently refactoring this to use channels, but the way messages land in the program is through a Rust callback that's exposed to C and called by the V8 code; this callback uses a channel sender. From this thread we keep reading from the channel receiver and route those messages to the `sub` function:
```rust
// TODO: reference worker somehow
pub fn sub(_message: &msg::BaseMsg) {
    let _decoded = protobuf::parse_from_bytes::<msg::Msg>(&_message.payload);
    let _inner_msg = _decoded.unwrap();
    match _inner_msg.command {
        msg::Msg_Command::CODE_FETCH => {
            // Implement CODE_FETCH
        },
        _ => (),
    };
}
```
So, this is really cool, but I'm wondering what the benefit is?
If all of the JS has to be parsed by the V8 engine, isn't it safe to assume that approximately 95% of all performance/code execution is there, and things like the `net` library are just thin wrappers?
The most common complaint I've heard about node.js is that... it's Javascript. Not that it's a C++ library on top of V8 and people wish it were Go or Rust. So I'm trying to quantify how it will be "better" than Node.js — or am I missing the point, and it's just to be different/prove it's possible? Still super cool though.
I did some benchmarks with the Go implementation and most of the CPU time is actually spent passing data structures between V8 and Go; bridging these two worlds has a significant overhead, especially in Go (cgo is very slow and kills the idea of lightweight goroutines). This might become a reason to look for alternative languages.
The benefit might be a faster and safer alternative (not necessarily 100% compatible!) to NodeJS or similar technologies. The "safer" keyword is very interesting too: from my personal experience, I tried to embed NodeJS once and it was really hard. I also worked on a project that integrates CPython and found out that it doesn't really provide sandboxes for running code (V8 is much better in this sense, with a strong focus on embedding capabilities); I think PyPy is working on this, but last time I checked it wasn't ready yet. So I'm pretty sure deno will be interesting for similar use cases.
Also V8 might be replaced at some point, perhaps it ends up like this?
You may find more reasons here.
Note that I'm not a core developer or anything, I'm just experimenting with these ideas because I enjoy working on interoperability and researching how these technologies work.
What can be built/exists today where there is a need to call just Javascript computation (without any of the supporting node.js system-interacting libraries) from another language like Rust/Go?