nglviewer / ngl

WebGL protein viewer
http://nglviewer.org/ngl/
MIT License

#865 Upgrade three.js #1002

Closed. ppillot closed this pull request 1 week ago

ppillot commented 7 months ago

#865: Upgrade three.js to the latest released version.

A legacy flag is used for the lighting mode. Ideally this should be fixed by using the latest properties required for lighting (such as distance and decay) and by migrating the code to use a linear colorspace internally.
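To make the `distance` and `decay` parameters concrete, here is an illustrative sketch (plain JS, not NGL or three.js source) of the kind of punctual-light attenuation three.js derives from them: `decay = 2` gives physically based inverse-square falloff, while `distance` acts as a smooth range cutoff.

```javascript
// Sketch only: inverse-power falloff controlled by `decay`,
// smoothly windowed to zero at `cutoffDistance` when it is set.
function distanceAttenuation(lightDistance, cutoffDistance, decay) {
  let falloff = 1.0 / Math.max(Math.pow(lightDistance, decay), 0.01);
  if (cutoffDistance > 0) {
    // smooth window that reaches zero at the cutoff distance
    const w = Math.min(Math.max(1 - Math.pow(lightDistance / cutoffDistance, 4), 0), 1);
    falloff *= w * w;
  }
  return falloff;
}
```

With `decay = 2` and no cutoff, doubling the distance quarters the intensity; beyond the cutoff the contribution is zero.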

This PR requires testing (e.g. checking that all the examples pass when compared against the previous version).

Note that the .ply parser has been removed for convenience, as it uses many deprecated three.js features (such as Face). I am not sure it's worth the effort to reimplement it. It is a common 3D format, so maybe it can be useful for some debugging? Note that alternative implementations of such a parser exist for three.js.

fredludlow commented 7 months ago

Thanks - a long overdue chore, and much appreciated!

Some initial test results (Windows 10 Firefox 119.0.1)

That looks like a long list but those are all the problems I could find looking through different representation types, and I think most boil down to either parameterizing lights (as per the discussion you linked) or the impostor code.

Thanks again for tackling this!

ppillot commented 7 months ago

Thanks @fredludlow for the testing! I've fixed the issues you noticed wrt fragments (slice representation and cylinders+hyperballs impostors). The point representation example works on the first try, but not after running the showcase/electrostatic-apbs example. That one uses the dot representation, and there is an issue with the BufferGeometry (the boundingSphere has its radius set to NaN). So... more investigation required!

ppillot commented 7 months ago

An update on the upgrade: I've been through every example in the application. The remaining issues that I can see are:

In the following screenshot, an illustration of the dot issue (left is latest code, right is the current version online)

screen capture
fredludlow commented 7 months ago

Excellent:

dot representation: the forceTransparent option must be enabled, otherwise small artifacts appear at "the corners of the dots"

In the showcase/electrostatic-apbs example the dot representation looks fine to me with/without forceTransparent? Do you mean the point repr? (I mentioned it previously as I don't use it and haven't taken the time to understand its options in detail, but I can't reproduce the issue from your image - could you post the representation parameters that give that?)

the lighting is not exactly the same, resulting in less vivid colours. We can probably change the defaults, but the issue is that any user who has customised settings will get something slightly different. If we are going for a breaking change, it might be worth going the whole way and implementing the linear colour space?

I'm a bit out of my depth to comment on this. There's all the work @garyo did in #808 to enable viewer.setColorWorkflow("linear"|"sRGB") (honestly, re-reading my nitpicks on that thread, I must apologise to @garyo for being so demanding!) but I'd be happy with a breaking change (make it v3?) where the defaults look good... Happy to schedule a call sometime if it would be easier to discuss live?
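For readers following along, the "linear colour space" discussion boils down to the standard sRGB transfer functions. This is the textbook piecewise sRGB definition, shown here only to make the terminology concrete; it is not NGL's implementation.

```javascript
// Standard sRGB <-> linear transfer functions (per-channel, inputs in [0, 1]).
function srgbToLinear(c) {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}
function linearToSrgb(c) {
  return c <= 0.0031308 ? c * 12.92 : 1.055 * Math.pow(c, 1 / 2.4) - 0.055;
}
```

A "linear workflow" does all lighting and blending math on `srgbToLinear`-decoded values and applies `linearToSrgb` exactly once, at output time.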

ply parser not reimplemented. Either we fix the existing one, use another one from a third party, or deprecate it.

I'd vote for making a v3, dropping it, and putting it back later if anyone misses it.

garyo commented 7 months ago

I'm still around; sorry, I've moved on to other things and haven't kept up with three.js much recently -- but I'd be happy to review color workflows with you.

ppillot commented 7 months ago

Thanks @fredludlow and @garyo ! Sounds like we have a plan!

Here is a detailed view of the issue with the points representation. We can see that every dot is drawn as an opaque white square.

image
fredludlow commented 7 months ago

Sorry @ppillot - I couldn't reproduce those white squares on my machine, but it's probably me not doing the right thing, is that from one of the examples, or have you got some specific settings for the representation that do this? (Or does it always happen for you with point representation?)

ppillot commented 7 months ago

@fredludlow this was observed with the representations/lines example. In this example there are point representations complementing the lines at the spots where the atoms lie.

image

I think the issue stems from the alphaTest property. When it is set, an ALPHATEST value is defined for use in the shader code, but this value is no longer used anywhere. The alphatest_fragment chunk from three.js was changed 2 years ago: ALPHATEST has been replaced with USE_ALPHATEST, and its value is now set through a uniform: https://github.com/mrdoob/three.js/pull/22409/files#diff-530c2fc746a48d6b0fe6971851f6f1992c0cb5d3e11aab626d89d93ef9df8debL2

The point representation uses a texture: a square whose pixels have an alpha value decreasing from the square's centre. The alpha test discards the pixels whose alpha value is below 1.0 (when the slider is set to the maximum value), which results in a disc of radius 1. I can reproduce the same image with the previous version of NGL by setting the alphaTest value to 0, which disables the test completely.
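A toy model of the behaviour described above (names are illustrative, not NGL's actual texture or shader code): a sprite whose alpha fades from the centre, combined with an alpha test. With a threshold set, only a central disc of fragments survives; with alphaTest = 0 the test is disabled and the whole square is rasterised, which matches the opaque white squares in the screenshot.

```javascript
// Alpha of a square point sprite, fading from the centre outwards.
function spriteAlpha(x, y) {              // x, y in [-1, 1] across the sprite
  const d = Math.sqrt(x * x + y * y);     // distance from the sprite centre
  return Math.max(0, 1 - d);              // alpha decreases away from the centre
}
// The alpha test: keep a fragment only if its alpha reaches the threshold.
function fragmentKept(alpha, alphaTest) {
  if (alphaTest === 0) return true;       // test disabled: keep everything
  return alpha >= alphaTest;              // discard low-alpha fragments
}
```

So a corner fragment is discarded under any positive threshold, but drawn (as part of the square) when the test is silently disabled by the broken ALPHATEST define.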

ppillot commented 6 months ago

Did some work to implement the linear workflow. The issue with the lighting has been addressed by using a DirectionalLight instead of the previous spotlight. Most of the changes regarding the color spaces consisted in using only Three.js defaults. Here is the current result ("old" code on the left, "new" code on the right):

image

Note that the antialiasing pass results in an image with an additional fog effect. Disabling the fog seems to give a result closer to the original one, so some tinkering may be needed on the fog as well.

Next steps would be to look at:

ppillot commented 6 months ago

I've found a way to make the conversion to linear colour space without resorting to the decorator and double conversion.

The "only" remaining issue is the antialias processing, which tends to bleach the colours. I have also noticed some banding appearing:

image

(Edited: the banding was due to a loss of precision, fixed by changing the RenderTargets' type to HalfFloat:

image

)
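As an aside, the precision issue behind the banding can be sketched in a few lines (illustrative only): storing an intermediate pass in an 8-bit render target collapses neighbouring shades of a subtle gradient onto the same one of 256 levels, while a HalfFloat target keeps them distinct.

```javascript
// Round-trip a value through an 8-bit colour channel (256 levels).
function storeAs8Bit(v) {
  return Math.round(v * 255) / 255;
}
const shadeA = 0.5000;
const shadeB = 0.5010;   // a slightly lighter neighbouring shade in a gradient
// Both shades land on the same 8-bit level, so the gradient step
// between them disappears and a visible band forms.
const collapsed = storeAs8Bit(shadeA) === storeAs8Bit(shadeB);
```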

The colour bleaching does not happen when the antialiasing is disabled (sampleLevel = -1).

From what I have read, this is due to the conversion from linear space to sRGB colour space happening before the antialiasing is computed; something like a double conversion is taking place. The documented Three.js way is to use a postprocessing stage performed in linear colour space, with a final output pass that does the conversion to the screen colour space (linear --> sRGB) only once.
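The order-of-operations problem can be shown numerically (a sketch using the gamma-2.2 approximation, not NGL code): resolving antialiasing samples after the linear-to-sRGB conversion gives a different edge colour than resolving in linear space and converting once at the end.

```javascript
// Approximate linear -> sRGB encoding (gamma 2.2).
const toSrgb = (c) => Math.pow(c, 1 / 2.2);

const s1 = 0.0, s2 = 1.0;  // two linear-space coverage samples at an edge

// Wrong: each sample is converted first, then the AA resolve averages them.
const resolvedInSrgb = (toSrgb(s1) + toSrgb(s2)) / 2;      // 0.5

// Right: average in linear space, convert to sRGB exactly once.
const resolvedInLinear = toSrgb((s1 + s2) / 2);            // ~0.73
```

The two pipelines disagree on every partially covered pixel, which is consistent with edges and blended regions shifting in brightness when the supersampling kicks in.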

Using such an output pass might also allow implementing other post-processing steps, like ambient occlusion (AO) lighting.

ppillot commented 2 months ago

Continued the upgrade and the investigation. A major bug remains in relation to the supersampling (used for antialiasing) when using semi-transparent materials: the materials appear very shiny. Here is the result without supersampling:

image

Here is the result after supersampling:

image

The user experience is quite bad, as the antialiasing is applied only when the mouse is not moving, so it really stands out as a glitch.

For a before/after comparison: on the left is the current production code, on the right the result with the latest changes in this branch:

image

The cause is probably an issue with colour space conversions. I suspect that during composition the blended scenes are rendered with upscaled luminosity (there is a π factor between light intensities in the legacy colorspace and the more recent LinearSRGB), which results in this when the scene is finally rendered.
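The π factor mentioned above, as a sketch (not NGL code): when three.js dropped its legacy lighting mode, punctual light intensities effectively had to be multiplied by π to keep the same apparent brightness, so render passes that disagree on the convention come out roughly π times too bright relative to each other.

```javascript
// Hypothetical intensity conversion between the two conventions.
const legacyIntensity = 1.0;                      // a light set up under legacy lighting
const physicalIntensity = legacyIntensity * Math.PI;  // equivalent physically based value

// A pass rendered with one convention and composited against the other
// is off by this factor, which would read as a brightness glitch.
const overbrightFactor = physicalIntensity / legacyIntensity;
```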

Note that I also tried Three.js's postprocessing pipeline, which is designed to address this kind of issue. The general idea is to implement a composer where the internal rendering is done through configurable passes. One pass is the antialiasing, and the last pass is the output pass, in which colours are converted to the output colour space (the screen is in the sRGB colour space). This did not solve the issue with the semi-transparent surfaces, and it made the whole rendering slower (antialiasing is applied on every frame). Interestingly, I also tried the ambient-occlusion passes: 3 different techniques are available, but only one (SAO) gave results, and it seemed inaccurate most of the time (it created dark patches when the zoom level did not match the size of the cavity).

I feel that I have already spent too much time on this and that, given my lack of knowledge, there is little chance of a positive outcome. I plan to take a step back, close this PR within a few days, and open a new one with an upgraded version of Three.js, but not the latest, still using the legacy light model.

Note that the latest version of Three.js (0.163) ships only the compiled ESM library (ECMAScript modules). I think this would make it impossible for NGL users to use their own Three.js version.

ppillot commented 1 week ago

Closed this branch as the issue with antialiasing could not be fixed. Superseded by PR #1037