Closed — facekapow closed this pull request 8 years ago.
I was able to create a working TCP echo server using this branch:
```js
'use strict';
const runtime = require('runtimejs');
const net = require('net');

const server = net.createServer((socket) => {
  console.log('client...');
  socket.on('end', () => console.log('client end...'));
  socket.write('hello...\r\n');
  socket.pipe(socket);
});

server.listen(9000, () => console.log('echo server online...'));
```
This is awesome! I'm looking through the code and going to leave my comments.
1, 2. There is a readable-stream package (https://www.npmjs.com/package/readable-stream) that's basically Node streams in userland; stream-browserify uses it internally (https://github.com/substack/stream-browserify/blob/master/index.js#L28-L32). Maybe we could use that instead? I think the implementation of streams is OK, but it's going to be much harder to make sure it works the same way as in Node.
3. Re: the `Duplex` copy-paste, instead of the `extends` construct, maybe something like `Object.assign` for the prototype?

I tried replacing `./modules/stream.js` with `readable-stream` (and of course, I installed the package), which gave me another error:
```
Uncaught exception: /node_modules/runtimejs/modules/inherits.js:24: TypeError: Object prototype may only be an Object or null: undefined
TypeError: Object prototype may only be an Object or null: undefined
    at Object.inherits (/node_modules/runtimejs/modules/inherits.js:24:27)
    at /node_modules/runtimejs/node_modules/readable-stream/lib/_stream_readable.js:62:6
    at /node_modules/runtimejs/node_modules/readable-stream/lib/_stream_readable.js:976:3
    at Module.require (__loader:73:9)
    at /node_modules/runtimejs/node_modules/readable-stream/readable.js:13:48
    at /node_modules/runtimejs/node_modules/readable-stream/readable.js:6:28
    at /node_modules/runtimejs/node_modules/readable-stream/readable.js:13:3
    at Module.require (__loader:73:9)
    at Loader.require (__loader:247:25)
    at __loader:290:25
```
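For reference, that inherits error is what V8 throws whenever the superclass argument resolves to `undefined`; my guess (an assumption, not verified against the loader) is that one of readable-stream's dependencies resolved to `undefined` under the runtimejs loader. A minimal repro of the message:

```javascript
'use strict';
// V8 raises this TypeError whenever Object.create (or Object.setPrototypeOf)
// receives undefined, which is what inherits() ends up doing when the
// super-constructor it was handed failed to load.
try {
  Object.create(undefined);
} catch (err) {
  console.log(err instanceof TypeError); // true
  console.log(err.message);              // mentions "Object prototype may only be an Object or null"
}
```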
So, for now, I'm going to stick with my implementation and just work on it to support the Node API. I've removed the Promises from the API and left just the callbacks.
@iefserge Thanks for `readable-stream`; the problem I was having was that directly replacing it in the `__loader` was causing errors.
@facekapow ah, you have already seen that ;)
My PR might be broken. I managed to get readable-stream kinda working, but there is an issue with the echo TCP server: it prints `hello...` successfully but doesn't echo anything after that. At the same time, it works perfectly with your implementation. Do you think it might be a net-related issue?
I just feel like Node streams are so complex, and there are so many versions of them, that it's going to be a nightmare to support our own implementation. wdyt?
Yeah, I just noticed the problem when I tested it after committing... :{ I'm going to have to roll back that commit for now. Honestly, I think our implementation would be fine; any issues that pop up would just be fixed. I don't think it'd be such a nightmare...
It's possible it might be net-related (since `_read` does nothing, that might be it)...
@facekapow one thing I noticed: console.log prints to the tty instead of the console. Default stdio should probably print to the console too.
@facekapow I think it would be fine to split http into a separate PR, so we can merge this one faster.
Sorry it took me so long to update, but I finally removed `http.js`.
I'm currently working on implementing the VM module.
@iefserge I keep getting the error `'v8::Local<T>::Local(S*) [with S = void; T = v8::Context]' is private` when trying to compile my code in `native-object.cc`. My code:
```cpp
v8::Local<v8::Value> ctx_tmp;
auto maybe_val = arg1->ToObject()->GetPrivate(context, kCtx);
maybe_val.ToLocal(&ctx_tmp);
v8::Local<v8::External> ext = ctx_tmp.As<v8::External>();
v8::Local<v8::Context> ctx = static_cast<v8::Local<v8::Context>>(ext->Value()); // I keep getting the error here
// ext->Value() by itself is okay, so it must be something to do with the v8::Local<v8::Context>
```
Any idea why this is happening? I've tried everything.
@facekapow hmm, let's chat on gitter
OK
I think these are all the modules I'm going to add for now. @iefserge If you want to add any more, let me know. Otherwise, can this be merged?
Apart from making `process.stdout` and `process.stderr` output to the QEMU console, I also added `process.termout` and `process.termerr` as overrides that go straight to the graphical console.
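So, as I understand the split, the difference looks like this (a sketch; `termout`/`termerr` are the overrides added in this PR, only meaningful inside a runtime.js VM, so the sketch guards them to also run under plain Node):

```javascript
'use strict';
// stdout/stderr: QEMU serial console, i.e. the host terminal.
process.stdout.write('visible in the host terminal\n');

// termout/termerr: the graphical console in the QEMU window (PR-specific
// overrides; undefined under plain Node, hence the guard).
if (process.termout) {
  process.termout.write('visible in the QEMU window\n');
}
```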
> Apart from making process.stdout and process.stderr output to the QEMU console, I also added process.termout and process.termerr as overrides that go straight to the graphical console.

Could you explain this a bit more?
@facekapow looks like console.log outputs an extra `\n`
@piranna there are two consoles
@iefserge About the `eshttp` module you mentioned earlier, I think a wrapper around it would make a pretty good `http` module :+1:.
How is it that they are not the same?

On 25/3/2016 22:54, "Serge" notifications@github.com wrote:
> @piranna https://github.com/piranna there are two consoles
>
> - the console that QEMU outputs to the terminal (works through the serial port); console.log prints here by default
> - the graphical console in the QEMU window
@piranna The QEMU serial port console is the one that outputs to your system terminal. The graphical console is the one created by runtime.js in the QEMU window.
I understand the difference, but I don't understand why they are different... Maybe I'm too used to dealing with QEMU and Linux, where you get one or the other...
The graphical console has a command prompt, useful for debugging.
There's only one little thing: should I add a `runtime.console`? An instance of `Console` with its output hooked up to the graphical console? Or just leave everything the way it is?
Two consoles would be pretty confusing; I think the graphical one is more of a debug tool. Should be fine as it is.
Ok.
Any last minute bugs or anything?
@facekapow this looks great, very exciting! Test coverage would be cool, but there are still a few issues with the test runner, so I'd say it's fine to merge now :+1:
I'll try to figure out tests (and linting) later at some point.
Bump kernel and JS versions?
@facekapow not yet, I'm looking into Travis CI based automatic deployment, because prebuilt binaries need to be released too
Deleted my last comment, it was completely wrong, sorry.
@facekapow yeah, I think using releases would be fine.
Oh, ok. Guess I wasn't completely wrong. The comment I deleted:

> If you deployed through GitHub Releases, you could just use Travis CI's GitHub Releases provider. All you'd have to do to support GitHub Releases is add the extra path components to the download URL in runtime-cli.
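The deploy setup being discussed could look something like this in `.travis.yml` (a sketch of Travis CI's GitHub Releases provider; the file name and token variable are placeholders, not taken from this repo's actual config):

```yaml
# Hypothetical Travis CI GitHub Releases deploy: upload a build artifact
# whenever a git tag is pushed.
deploy:
  provider: releases
  api_key: $GITHUB_OAUTH_TOKEN   # placeholder secret, set in Travis settings
  file: runtime.tar.gz           # placeholder artifact name
  skip_cleanup: true             # keep build output around for the upload
  on:
    tags: true                   # only deploy tagged builds
```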
@facekapow this has been released :+1:
Also configured Travis CI to build releases and publish tags to npm automatically. A `runtime-cli` major update is required, though.
@iefserge Great! When I get home I'll patch up runtime-cli.
edit: nevermind, you already did.
Adds a lot more support for Node. Specifically: `dns.js` (basic, only A records), `console.js`, `process.stdout`, `process.stderr`, and `stream.js`.

Couple of notes:

1) I had to implement `stream.js` myself because `stream-browserify` kept crapping out and throwing `ReferenceError`s everywhere.
2) Also about `stream.js`: I have to implement `stream.Transform`.
3) Again, `stream.js`: I had to copy-paste for `stream.Duplex`, since it inherits both `stream.Readable` and `stream.Writable` (if anyone knows how to do multiple inheritance in ES6, please let me know how).
4) I threw in Promises for some of the functions that take callbacks (they still accept callbacks, though). I figure it's not going to be long until Node uses them; however, should they just be left as plain old callbacks? If I leave them, should I put Promises on EventEmitter, too?