I tried this approach of wrapping everything for the proxy server, and eventually abandoned it and wrote it just using the util functions like `resumeOnEvent`. I found this approach both easier and cleaner. Here are a few issues that I've come up against with the wrapping approach:

- The proxy takes the `req` object and does some stuff with it. If the `req` object is wrapped, then it doesn't work, so we need access to the raw/inner object.
- When listening to the `proxyRes` event, this event receives a `proxyReq` which is itself an event emitter that I want to listen to. I couldn't come up with any sensible strategy for wrapping the values produced by an event.

In summary, while this approach is interesting, I have a feeling that writing code like this would necessitate a whole new ecosystem which basically reimplements all async APIs, and that just doesn't seem feasible. I think we're probably better off at the moment building small helper functions which make wrapping common patterns easy.
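To make the contrast concrete, here is a minimal sketch of that "small helper" style, using Node's built-in `events.once` as a stand-in for a `resumeOnEvent`-style util (the `readBody` function is illustrative, not code from this project): the raw `req` stays unwrapped, so anything that expects the real object still works, and the helper is only involved where we wait on an event.

```ts
import { once } from "events";
import type { IncomingMessage } from "http";

// Keep the raw IncomingMessage; only the "wait for an event" part goes
// through a helper. events.once resolves when the emitter fires "end".
async function readBody(req: IncomingMessage): Promise<Buffer> {
  const chunks: Buffer[] = [];
  req.on("data", (chunk: Buffer) => chunks.push(chunk));
  await once(req, "end");
  return Buffer.concat(chunks);
}
```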
This makes a lot of sense. As a result, I've shifted focus to a much less ambitious target: change `resumeOnEvent` -> `on`, since it seems to be a primitive of using Effection in Node.js, and make that the only change.
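For a sense of what an `on`-style primitive looks like at the call site, Node's own `events.on` (not Effection's `on`, but the same kind of inversion) turns an emitter's callback API into a structured, `for await`-able stream of events. This is purely an illustration of the shape, not the Effection implementation:

```ts
import { on } from "events";
import type { Server, IncomingMessage } from "http";

// events.on inverts the callback API into an async iterable: each
// iteration yields the argument list of one "request" event.
async function logRequests(server: Server): Promise<void> {
  for await (const [req] of on(server, "request")) {
    const request = req as IncomingMessage;
    console.log(request.method, request.url);
  }
}
```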
One thing that is becoming clear is that there is a lot of value in "inverting" native Node APIs from a callback-based API to a structured Effection API. It seems like, for every single API that we use, we end up wrapping it with either static functions or objects that present a structured API.
This formalizes that approach and wraps Node and npm APIs in a set of "virtual packages" that mirror the set of Node APIs.
So the `events` package is wrapped by `@effection/events`, and Node APIs are wrapped in `@effection/node`:

- `fs` -> `@effection/node/fs`
- `child_process` -> `@effection/node/child_process`
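As a sketch of what one of these virtual packages might expose (a hypothetical shape; the actual `@effection/node/child_process` surface may differ), the idea is to invert the event-based `child_process` API into a single structured operation whose lifetime is bounded by its caller, approximated here with an `AbortSignal`:

```ts
import { spawn } from "child_process";

// Hypothetical wrapper: run a command as one awaitable operation.
// Cancelling the caller's AbortSignal also tears down the child process,
// which is the "structured" part of the inversion.
export function exec(
  command: string,
  args: string[],
  signal?: AbortSignal
): Promise<number> {
  return new Promise((resolve, reject) => {
    const child = spawn(command, args, { signal, stdio: "inherit" });
    child.on("error", reject);
    child.on("exit", code => resolve(code ?? -1));
  });
}

// Usage: await exec("npm", ["test"], controller.signal);
```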
This follows the technique outlined here https://dev.to/larswaechter/path-aliases-with-typescript-in-nodejs-4353 to implement the virtual package path aliases, with the idea that these will eventually be extracted into their own npm packages.
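For reference, that technique boils down to a `paths` entry in `tsconfig.json` along these lines (the directory names below are placeholders, not this repository's actual layout); note that `paths` only affects TypeScript's module resolution, so the aliases still need to be resolved at runtime, which the linked article covers as well:

```jsonc
{
  "compilerOptions": {
    "baseUrl": ".",
    // Map the "virtual package" names onto local source directories until
    // they are extracted into standalone npm packages. Paths are placeholders.
    "paths": {
      "@effection/events": ["src/events"],
      "@effection/node/*": ["src/node/*"]
    }
  }
}
```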