Closed crubier closed 3 years ago
Running Three on Node is a massive pain.
Can you elaborate?
- Create node compatible versions of at least:
  - `src/loaders/ImageLoader.js` => `src/loaders/ImageLoader.node.js`

Can you show how `ImageLoader.node.js` would be different from `ImageLoader.js`?
Can you elaborate?
Actually it is more using three to build "agnostic" packages that are meant to run on node, react native, and web which is a massive pain.
We built a lot of relatively complex 3D geometry/Mesh processing and similar packages on top of three r109, and we import these packages in apps that run on node, web or react native, and having a version of three that does not crash on any of the three platform was a very delicate process, needing very specific tweaks in the code AND in the bundlers.
Now we want to update three and all that goes out the window, so we would like to have a cleaner and more reproducible approach.
This might be an instance of the XY problem, so let me know if there turns out to be an easy way to do this; I'd be very happy to hear about it. From my perception, it seemed to be common wisdom that three is hard to use on non-web platforms, since even the docs say that using three on node can be complicated.
Can you show how `ImageLoader.node.js` would be different from `ImageLoader.js`?
In our current crappy version, it involves conditionally using jsdom to create a makeshift DOM to pass to the image loader on node. It also involves importing `fs` to load the image data from local relative paths, injecting the data into the image element, and other dirty tricks to deal with the metro bundler, webpack, rollup and other bundlers. It has reached a point of no return where the code is bonkers, but it works (surprisingly) for all our use cases.
In this project, I'd like to take a much more sane approach, in a file that is guaranteed to only run on node.js, and no other platform.
For reference, here is our current crazy implementation of ImageLoader for three r109 (pretty bad, I know):
```js
import { Cache, DefaultLoadingManager } from "three";
import { document, window, navigator } from "../dom";
import { startsWith, isNil } from "lodash/fp";
import pathModule from "path";

let fs = null;

function ImageLoader( manager ) {
	this.manager = manager !== undefined ? manager : DefaultLoadingManager;
}

Object.assign( ImageLoader.prototype, {

	crossOrigin: "anonymous",

	load: async function ( url, onLoad, onProgress, onError ) {
		if ( url === undefined ) url = "";
		if ( this.path !== undefined ) url = this.path + url;

		// START: The modification compared to the normal image loader
		if ( startsWith( "data:", url ) ) {
			console.log( "IS datauri" );
		} else {
			console.log( "IS not datauri" );
			if ( isNil( fs ) ) {
				let fsModule = await import( "fs-extra" ); // Done once only
				// Trick to solve problems caused by various packagers adding layers
				// of "default" nesting on exports... It is needed for our combination
				// of rollup for packages and metro for the react native app...
				// eslint-disable-next-line no-prototype-builtins
				while ( fsModule.hasOwnProperty( "default" ) ) {
					fsModule = fsModule.default;
				}
				fs = fsModule;
			}
			// const realPath = pathModule.resolve(url);
			const imageData = await fs.readFile( url );
			url = `data:image/jpeg;base64,${imageData.toString( "base64" )}`;
		}
		// END: The modification compared to the normal image loader

		url = this.manager.resolveURL( url );
		const scope = this;
		const cached = Cache.get( url );
		if ( cached !== undefined ) {
			scope.manager.itemStart( url );
			setTimeout( function () {
				if ( onLoad ) onLoad( cached );
				scope.manager.itemEnd( url );
			}, 0 );
			return cached;
		}

		const image = document.createElementNS(
			"http://www.w3.org/1999/xhtml",
			"img"
		);

		function onImageLoad() {
			image.removeEventListener( "load", onImageLoad, false );
			image.removeEventListener( "error", onImageError, false );
			Cache.add( url, this );
			if ( onLoad ) onLoad( this );
			scope.manager.itemEnd( url );
		}

		function onImageError( event ) {
			image.removeEventListener( "load", onImageLoad, false );
			image.removeEventListener( "error", onImageError, false );
			if ( onError ) onError( event );
			scope.manager.itemError( url );
			scope.manager.itemEnd( url );
		}

		image.addEventListener( "load", onImageLoad, false );
		image.addEventListener( "error", onImageError, false );

		if ( url.substr( 0, 5 ) !== "data:" ) {
			if ( this.crossOrigin !== undefined ) image.crossOrigin = this.crossOrigin;
		}

		scope.manager.itemStart( url );
		image.src = url;
		return image;
	},

	setCrossOrigin: function ( value ) {
		this.crossOrigin = value;
		return this;
	},

	setPath: function ( value ) {
		this.path = value;
		return this;
	}

} );

export { ImageLoader };
```
And here is the `dom` file which is imported above (probably one of the craziest files in our codebase, which is otherwise mostly pretty sane, haha):
```js
import { isNil } from "lodash/fp";

let realWindow = null;
let realDocument = null;
let realNavigator = null;
let isBrowserReal = false;
let isNodeReal = false;

const jsdomOptions = {
	pretendToBeVisual: true,
	storageQuota: 1e9,
	resources: "usable",
	runScripts: "dangerously"
};

// Trick packagers to NOT find the jsdom import, in order to not resolve it at
// build time and let the try/catch handle errors at runtime.
// 🤡 Yes, it sounds like a joke, but this is the only solution we found that
// works on all 3 platforms with all bundlers...
const toto = {
	coco: require
};
const generateJsdomModuleName = () => "js" + "dom";
const jsDomName = generateJsdomModuleName();
// const jsDomName = "js" + "dom";

let jsdomValue = null;

try {
	try {
		//@ts-ignore
		if ( isNil( window ) ) {
			if ( isNil( jsdomValue ) ) {
				// 🤡 The second part of that sad clownesque joke...
				jsdomValue = toto.coco( jsDomName );
			}
			const { JSDOM } = jsdomValue;
			realWindow = new JSDOM(
				`<!DOCTYPE html><body id="main"><p>Hello world</p></body>`,
				jsdomOptions
			).window;
			realDocument = realWindow.document;
			realNavigator = realWindow.navigator;
			isNodeReal = true;
		} else {
			isBrowserReal = true;
			//@ts-ignore
			realWindow = window;
			realDocument = realWindow.document;
			realNavigator = realWindow.navigator;
		}
	} catch ( e ) {
		if ( isNil( jsdomValue ) ) {
			jsdomValue = toto.coco( jsDomName );
		}
		const { JSDOM } = jsdomValue;
		realWindow = new JSDOM(
			`<!DOCTYPE html><body id="main"><p>Hello world</p></body>`,
			jsdomOptions
		).window;
		realDocument = realWindow.document;
		realNavigator = realWindow.navigator;
		isNodeReal = true;
	}
} catch ( e2 ) {
	realWindow = {};
	realDocument = {};
	realNavigator = {};
}

export const window = realWindow;
export const document = realDocument;
export const navigator = realNavigator;
export const isBrowser = isBrowserReal;
export const isNode = isNodeReal;
```
As you see our main problem originates from trying to use ONE version of three for 3 platforms.
This is why we'd like to take a different approach, starting with having one sane version of three per platform.
I like the goal of taking a systematic approach to node.js support. I've been hoping (but am not sure) that the changes can be mostly focused into key files like FileLoader and ImageLoader, and that most files, like GLTFLoader, would not need separate compilation outputs. I realize this would require a few changes to GLTFLoader itself. It would also be really valuable to have unit tests running in node.js, in addition to in a browser.
Some of the work might be simplified by importing browser globals from a common file. For example:
```js
// src/.../platform.js
export let _document = document;
export let _window = window;

if ( IS_NODEJS ) {
	_document = { ... };
	_window = { ... };
}

// examples/jsm/loaders/GLTFLoader.js
import { _document, _window, ... } from 'three';
```
@donmccurdy Yes this is the direction we took in our current version. However I am wondering if we really need to keep this emulation of DOM.
Ideally I'd like to avoid having to deal with the bulkiness of JSDOM, even if this means losing access to some parts of Three at first. But it's probably way harder without it, and not worth it. Happy to hear opinions here.
I didn't mean to suggest emulation of the DOM -- just consolidating references to globals that might differ across environments, so that Node.js support doesn't need to touch so many files.
Yes I agree that this aspect is useful.
In practice, as you wrote, it is mostly about some form of emulation of the DOM though, but I agree it's probably the best option to have good coverage of Three in Node.
Sorry I'm unclear — I do not think we should emulate DOM APIs at all. By DOM APIs I mean the browser's interface to HTML. This is mostly what JSDom does, and shouldn't be necessary for the parts of the library most users should need in Node.js.
Providing other APIs that the browser would normally provide — like TextEncoder/TextDecoder, XmlHttpRequest, atob/btoa, and so on, is probably more critical (and hopefully easier). That could be done either by patching the global namespace in Node.js, or by ensuring the library imports from a common place, which can be "swapped" when running in Node.js, similar to the import example above.
Yes indeed this is a good idea. The only place where it is still problematic is the fact that Images (for textures and others) are moved around in Three JS as instances of the DOM's img element. So that img element (at least, as far as I know) needs to be emulated. Obviously canvas too but this is a different problem, only for rendering.
But overall yes, absolutely: architecturally, I would much prefer importing the useful elements from a single place rather than using globals. But that would impact a lot of files in the codebase. Would @mrdoob agree with such a large change (importing window, document, navigator explicitly) in many files just to accommodate one use case?
Updated plan:

- Create `dom-globals.js`, which exports the DOM globals used in the three.js codebase
- Create `dom-globals.node.js`, which polyfills and exports DOM globals for node.js
- Modify `src/Three.js` to export everything from `dom-globals.js` (to make them available to examples files, see later)
- Modify `utils/build/rollup.config.js`: add an `output: "three.node.js"` build including the alias plugin, aliasing `dom-globals.js` to `dom-globals.node.js` when building for node
- Create node versions of the loaders:
  - `src/loaders/ImageLoader.js` => `src/loaders/ImageLoader.node.js`
  - `src/loaders/FileLoader.js` => `src/loaders/FileLoader.node.js`
- Modify `utils/build/modularize.js`:
  - output a node `dst` folder additionally to the jsm one
  - add a `nodePath` in `files` items pointing to the node version of the file if relevant
  - modify the `convert` function to output something similar to the jsm version with: `if ( keys ) imports.push( 'import {${keys}\n} from "${pathPrefix}../../build/three.node.js";' );`
- `examples/js/loaders/GltfLoader.js` => `examples/js/loaders/GltfLoader.node.js`
- `examples/js/loaders/GltfExporter.js` => `examples/js/loaders/GltfExporter.node.js`
- Duplicate `test/unit/three.source.unit.js` to `test/unit/three.source.node.unit.js`, comment out all tests first, then add them back progressively, fixing problems as they appear
- Modify `test/rollup.unit.config.js` to add a configuration similar to the one with `input: 'unit/three.source.unit.js'`, but with `input: 'unit/three.source.node.unit.js'` and the alias plugin with the same settings as above. In this file, add some node-specific tests (`FileLoader` tests for node, for example)
- Modify `three.js/test/package.json` to add a new test script using QUnit in node (https://qunitjs.com/intro/), testing the new `'unit/build/three.source.node.unit.js'`
- Modify `three.js/package.json` to add a `"test-e2e-node": "node test/e2e/node.js"` script, which includes an e2e test scenario like: load a local glTF, transform it a bit, apply a texture to it, and write it locally.

Hey folks,
I just submitted a PR implementing the points @crubier mentioned above. It's all work in progress since I did not add tests, but at least there can now be a separate bundle of three for node along with associated examples. I'd like to have your opinion on the DOM elements that I could have missed. So far I checked for the following globals:
```js
export const Blob = window.Blob;
export const atob = window.atob;
export const btoa = window.btoa;
export const DOMParser = window.DOMParser;
export const document = window.document;
export const XMLHttpRequest = window.XMLHttpRequest;
export const TextEncoder = window.TextEncoder;
export const TextDecoder = window.TextDecoder;
export const decodeURIComponent = window.decodeURIComponent;
export const CustomEvent = window.CustomEvent;
```
The current test suite is passing with the newly created node version (relying on JSDOM), which is not very surprising; but since I now explicitly export the DOM elements from `src/dom-globals.js`, the tests are failing on the regular version when run through `npm test`. Now I plan to add some non-trivial tests on highly DOM-dependent objects of the node version (like `FileLoader`) and see how it goes.
What's wrong with emulating/polyfilling browser APIs in node?
Nothing wrong really.
Using JSDOM to polyfill browser APIs has the advantage of being simple and easy to maintain. This is the way we are going right now, and probably the easiest one. Basically slapping JSDOM on top of Three and be happy.
My initial idea was to have some deeper integration with NodeJS in some places of the Three codebase (for example loading files using the Node FS api directly instead of using JSDOM + "file:///" URIs). But I think we are not going to go this way in the end.
One problem though with the JSDOM solution is that it needs JSDOM to be registered as a (Peer? Optional?) dependency of Three in the package.json. What's your opinion on this?
Using JSDOM to polyfill browser APIs has the advantage of being simple and easy to maintain. This is the way we are going right now, and probably the easiest one. Basically slapping JSDOM on top of Three and be happy.
I'm confused. Didn't you say it was a massive pain?
Running Three on Node is a massive pain
Ah, I can see you retracted that now.
I think it would be better to invest these efforts in improving the documentation for people that want to run the library on node.
I think it would be better to invest these efforts in improving the documentation for people that want to run the library on node.
The problem is there is no way to do it cleanly today, the only way is to stuff jsdom globals into node globals, which is clearly advised against by JSDOM: https://github.com/jsdom/jsdom/wiki/Don't-stuff-jsdom-globals-onto-the-Node-global . It feels like and is a "dirty hack".
The changes in this PR ( #20924 ) provide a clean and official way to run three.js in Node. For example, as of today there are no tests of FileLoader in the three codebase, because it cannot be tested automatically. This PR allows running three cleanly in Node, testing it properly, and so on. It only adds a small maintenance cost, IMO.
From: https://github.com/jsdom/jsdom/wiki/Don't-stuff-jsdom-globals-onto-the-Node-global
A common antipattern we see when people use jsdom is copying globals from a jsdom window onto the Node.js global, and then trying to run the code---intended for a browser---inside Node.js. This is very bad and you should not do it.
I'm confused. Wasn't the whole point of node.js to allow developers to use the same code and libraries in the front-end and back-end? Why aren't they providing the same APIs browsers provide? How's deno dealing with this?
Wasn't the whole point of node.js to allow developers use the same code and libraries in front-end and back-end?
Yes, mostly: it allows creating libraries in the same language for both frontend and backend. But that doesn't mean there is 100% overlap in their capabilities:
Node has its own APIs that browsers don't have: https://nodejs.org/docs/latest/api/
I made a quick diagram to explain this:
The problem today is that ThreeJS uses some (not a lot) functions that are only available in the browsers.
The good thing is that JSDOM is a library for node which emulates most browser APIs.
Why aren't they providing the same APIs browsers provide?
Mostly because they are different systems with different needs:
How's deno dealing with this?
Same as Node. v8 Engine + additional APIs, but critically, does not include most Web APIs.
Thanks for the clarification 🙏
So then my question is... Wasn't JSDOM created to solve this?
So then my question is... Wasn't JSDOM created to solve this?
Yes, it was, but the current state of the three.js codebase does not allow using it correctly, as stated above, because the codebase expects Web APIs to be available on the global window object.
From there, we have the choice between three options:
@mrdoob we have a new minimalistic version of the PR coming with:

- Changes to `package.json`: for people who want to run three in node, they need `jsdom` + `canvas`
- A `src/window.js` with polyfills for node.js using `jsdom` + `canvas`
- Changes to `utils/build/rollup.config`
- Changes to `utils/modularize.js` to generate examples versions compatible with node.js, just like there is the jsm version today

So one question: should we:
Could this be done in a separate repo/package? `three-node` or something?
@mrdoob of course it is feasible, as a fork of your repo. Since our goal is to have several equivalent builds of three for different platforms, it seems more natural and less prone to bugs (by version mismatch, for instance) to keep all builds in the same place. Moreover, as @crubier was saying, the node version would allow completing most "todo" tests without having them fail on node. @donmccurdy was also suggesting that it would be valuable to have unit tests running on node.js.
Example of one of the unit tests I added in `test/unit/src/loaders/FileLoader.tests.js`, which passes on both the node.js and browser versions:
```js
QUnit.test( "load regular url => json", ( assert ) => {

	const done = assert.async();
	const fileLoader = new FileLoader();
	fileLoader.responseType = "json";
	fileLoader.load(
		"https://raw.githubusercontent.com/mrdoob/three.js/dev/examples/models/json/suzanne_buffergeometry.json",
		// onLoad callback
		function ( data ) {
			// output the text to the console
			// console.log( data );
			assert.ok( data.metadata.position === 505, "Passed!" );
			done();
		},
		// onProgress callback
		function ( xhr ) {
			console.log( ( xhr.loaded / xhr.total * 100 ) + '% loaded' );
		},
		// onError callback
		function ( err ) {
			console.error( 'An error happened' );
			console.error( err );
			assert.ok( false, "Failed with error" );
			done();
		}
	);

} );
```
Example of one of the end-to-end node.js tests I was able to add:

```js
import { GLTFExporter } from "../../../examples/node/exporters/GLTFExporter";
import { BoxBufferGeometry, MeshBasicMaterial, Mesh, Scene, TextureLoader } from "../../../build/three.module.node";
import { writeJson } from "fs-extra";

export default function main() {

	const exporter = new GLTFExporter();
	const textureLoader = new TextureLoader();

	textureLoader.load( `file://${__dirname}/../../data/image.jpg`, function ( texture ) {

		const scene = new Scene();
		const box = new Mesh(
			new BoxBufferGeometry(),
			new MeshBasicMaterial( { map: texture } )
		);
		box.name = "box-test";
		scene.add( box );

		exporter.parse( scene, function ( gltf ) {
			const path = `${__dirname}/outputs/scene-exporter.gltf`;
			writeJson( path, gltf, {}, function ( data ) {
				console.log( "Scene written" );
			} );
		} );

	}, undefined, function ( error ) {
		throw error;
	} );

}
```
@mrdoob of course it is feasible, as a fork of your repo.
No no, I didn't mean a fork.
I meant a different repo which adds this one as a dependency and has the additional scripts and generated files.
For every release, the maintainer of `three-node` will have to run the script to generate the node.js-friendly files.
Just to be clear, the approach you propose has tradeoffs: for example, we will be able to keep using `react-three-fiber` in NodeJS, but we will have to find a hack so that react-three-fiber imports three-node on node...

We'd like your confirmation before we go in that direction.
Yep, lets go in that direction.
Once the repo is set up and we have a clear picture of what's needed and the side-effects, we can reconsider the setup.
Thanks!
It seems that we can't publish under the names `three-node` nor `three-node-js`; we'll try to find another name...
@mrdoob here's the repo implemented: https://github.com/sterblue/three-universal-deprecated-fork
And it is published at https://www.npmjs.com/package/three-universal. Should we add something to the three.js docs in the part about running three on node?
Yes please! 👍
@mrdoob FYI, we are reverting back to a proper fork, for the following reasons:

- The build relies on scripts in the ThreeJS repo (like `utils/modularize.js`), which we would have had to copy-paste into our package on each release of ThreeJS
- TypeScript does not allow re-exporting the types of a third-party package (which ThreeJS is, from three-universal's point of view), so the non-fork solution has no good option for TypeScript types, apart from, again, copy-pasting all the `.d.ts` files from ThreeJS into our package on each release of ThreeJS

At that point, each release of ThreeJS requires copy-pasting a lot of files into our repo.

So a proper fork, with just a couple of commits on our side that we rebase on top of ThreeJS's main branch on each release, is way more maintainable and less "divergent", allowing future merging back into the ThreeJS codebase.
This is not a fork as in "let's reinvent the wheel on our side"; it's more a fork as in "this is an enhanced long-lived PR that we still hope to merge at some point in the future". Our fork will always be just a couple of commits and a few additions ahead of ThreeJS's main branch.
I will do a PR with Docs enhancement once our package is stable and works perfectly
For future Googlers who find this issue: I'm currently using THREE.js in a couple of different server-side contexts. Previously we used the approach with JSDOM, `node-canvas`, and the now-gone `CanvasRenderer`. That was painful for a couple of reasons, but it works. (We had a lot of trouble with the `node-canvas` module for some reason, and persistent memory leaks that we haven't been able to invest the time in solving.)

We are now taking an approach with `headless-gl`, `WebGLRenderer`, and no JSDOM. Our approach is to use only parts of THREE that don't require any DOM APIs. This involves rewriting some loaders to use either node-specific approaches (e.g. the `sharp` library) or ones that can be polyfilled (e.g. `fetch`). We haven't found many other places that require DOM APIs just yet. I'd recommend this approach; it seems light, and depending on which parts of THREE you use it could be very adequate.

I'll evaluate the linked repo here and see if it's appropriate for us, or if there's anything we can contribute in terms of code that doesn't depend on JSDOM, e.g. loaders that could be more useful on the server.
Also, since I have a passing interest in Deno, I wanted to clarify: Deno does attempt to implement a lot of the "web APIs" you find in browsers - more than Node did/does. More details here: https://deno.land/manual@v1.7.4/runtime/web_platform_apis For example, `atob` and `btoa` were added to Deno, whereas Node does not provide them out of the box. Deno even supports `alert` 😂

But that said, Deno still doesn't support DOM elements like you'd need for, e.g., `ImageLoader`. I'm not sure it ever will; time will tell, I guess!
@crabmusket we have published our solution according to @mrdoob 's directive.
Thank you 🎉
@crubier how does your package handle WebGL in node? I can't work out from the dependencies what your approach is. Like I said, I've previously used node's `canvas` module and the `CanvasRenderer`. Does the `WebGLRenderer` have a fallback?
Yes @crabmusket, here's how we typically create a renderer in node using our package and `headless-gl`. We simply shim a canvas with the gl context in the right place. Most hardcoded features below can be customised.
```ts
import { getOr } from "lodash/fp";
import gl from "gl";
import {
	WebGLRenderer,
	PCFSoftShadowMap,
	WebGLRenderTarget,
	LinearFilter,
	NearestFilter,
	RGBAFormat,
	UnsignedByteType
} from "three-universal/build/three.node";

/**
 * Create a renderer
 */
export const getRenderer = ({
	height,
	width
}: {
	height: number;
	width: number;
}): WebGLRenderer => {
	// @ts-ignore
	const canvas = {
		width: width,
		height: height,
		addEventListener: event => {},
		removeEventListener: event => {},
		getContext: (contextType, attributes) => {
			return getOr(null, contextType, {
				webgl: gl(width, height, {
					...attributes,
					preserveDrawingBuffer: true
				})
			});
		}
	} as HTMLCanvasElement;

	// Create the renderer
	const renderer = new WebGLRenderer({
		antialias: false,
		canvas: canvas,
		powerPreference: "high-performance"
	});
	renderer.shadowMap.enabled = true;
	renderer.shadowMap.type = PCFSoftShadowMap; // default is PCFShadowMap

	// Create a render target object where we'll be rendering
	const renderTarget = new WebGLRenderTarget(width, height, {
		minFilter: LinearFilter,
		magFilter: NearestFilter,
		format: RGBAFormat,
		type: UnsignedByteType
	});
	renderer.setRenderTarget(renderTarget);

	return renderer;
};
```
That's a super helpful example, thanks! I'll see how I go with it. It'd be really good to have that in the package readme ;)
Yes definitely, we'll add that in the docs!
Here's a full render using mainline THREE and headless-gl, based on your example above: https://gist.github.com/crabmusket/b164c9b9d3c43db9bddbfb83afde0319
Here's a full render using mainline THREE and headless-gl, based on your example above: https://gist.github.com/crabmusket/b164c9b9d3c43db9bddbfb83afde0319
Can it support GLTFLoader and load textures?
@crabmusket we have published our solution according to @mrdoob 's directive.
The easy solution to running Three in Node, Browser or React native is to use https://www.npmjs.com/package/three-universal
Thank you 🎉
I've run into this need recently. Can three.js be used on node.js just like in the browser?
@JSeasy my gist doesn't use the universal library, therefore if you want to use loaders you might have to do some workarounds. For example, the default TextureLoader uses the DOM, which is not available in Node by default. Other loaders probably use browser-specific APIs too. In my use-case, I'm simply creating DataTextures by reading image files off the disk.
The package referenced by @crubier allows you to use the default loaders in Nodejs, by including JSDOM which provides DOM APIs in Nodejs.
Let's goo ! Nice to see ThreeJS moving forward !
Is your feature request related to a problem? Please describe.
~~Running Three on Node is a massive pain.~~ Making packages that are based on Three and meant to run on Web, NodeJS, and React Native is a massive pain.
With a colleague, we plan to make it easier by providing a node-specific build, just like there is a jsm-specific build.
Describe the solution you'd like
The plan is to do this:

- Create node-compatible versions of at least:
  - `src/loaders/ImageLoader.js` => `src/loaders/ImageLoader.node.js`
  - `src/loaders/FileLoader.js` => `src/loaders/FileLoader.node.js`
  - `examples/js/loaders/GltfLoader.js` => `examples/js/loaders/GltfLoader.node.js`
  - `examples/js/loaders/GltfExporter.js` => `examples/js/loaders/GltfExporter.node.js`
- Modify `utils/build/rollup.config.js`: add an `output: "three.node.js"` build including the alias plugin
- Modify `utils/build/modularize.js`:
  - output a node `dst` folder additionally to the jsm one
  - add a `nodePath` in `files` items pointing to the node version of the file if relevant
  - modify the `convert` function to output something similar to the jsm version with: `if ( keys ) imports.push( 'import {${keys}\n} from "${pathPrefix}../../build/three.node.js";' );`
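The alias step could look roughly like the following in `utils/build/rollup.config.js`, using `@rollup/plugin-alias` to swap the DOM-globals module for a node polyfill. This is a hypothetical sketch of the wiring (file names `dom-globals.js` / `dom-globals.node.js` follow the plan above), not an actual config from the repo:

```js
// Hypothetical sketch: an extra rollup build that emits three.node.js and
// swaps dom-globals.js for its node polyfill via @rollup/plugin-alias.
import alias from '@rollup/plugin-alias';
import path from 'path';

export default {
	input: 'src/Three.js',
	plugins: [
		alias( {
			entries: [
				{
					find: path.resolve( 'src/dom-globals.js' ),
					replacement: path.resolve( 'src/dom-globals.node.js' )
				}
			]
		} )
	],
	output: {
		file: 'build/three.node.js',
		format: 'esm'
	}
};
```

The browser builds would simply omit the alias plugin and keep resolving `dom-globals.js` as-is.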
Would you accept such a PR back into the main three.js repo / package ?
Describe alternatives you've considered
We use a custom sh***y fork of an old version of three in our repo and it is messy; we'd like to go for a cleaner version, which is now possible thanks to the cleanup of the three codebase using ES modules.
Additional context