KhronosGroup / glTF

glTF – Runtime 3D Asset Delivery

[PBR extension] Parameter Set #696

Closed mlimper closed 7 years ago

mlimper commented 8 years ago

Initially, we proposed two parameter sets: "Specular-Glossiness" (short: "glossiness") and "Metallic-Roughness" (short: "roughness"). On the one hand, this allows for more freedom, as assets may be given in one or the other format. On the other hand, it might be better to have only one parameter set, for increased simplicity, ease of implementation, and portability.

If we agree to use only one parameter set, we will have to decide which one it should be. The roughness model is popular and easy to use with several existing systems. The glossiness model allows a bit more freedom, as the proposed specular component has three channels.
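For concreteness, here is the channel layout of the two proposed sets as a hypothetical GLSL sketch (the struct and field names are illustrative, not from the draft):

```glsl
// Hypothetical summary of the two proposed parameter sets.
struct MetallicRoughness {
    vec3  baseColor;  // RGB
    float metallic;   // 0 = dielectric, 1 = metal
    float roughness;  // microfacet roughness
};                    // 5 scalar channels in total

struct SpecularGlossiness {
    vec3  diffuse;    // RGB
    vec3  specular;   // RGB reflectance at normal incidence: the "more freedom"
    float glossiness; // inverse of roughness
};                    // 7 scalar channels in total
```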

pjcozzi commented 8 years ago

I don't have the PBR experience to say which to go with, but I agree with only having one model in the extension; if it is needed, another extension could do another model. This is in line with the spirit of glTF, which is to keep the client as simple as possible and not provide too many different ways to do the "same" thing.

lexaknyazev commented 8 years ago

if it is needed, another extension could do another model

In that case, all PBR materials extensions should agree on a light object model, and probably keep lights in yet another extension.

Also, since material.extensions can contain many objects, there should be some restrictions on co-existence of two PBR extensions in one material.

mlimper commented 8 years ago

Yes, lighting is another topic that we need to discuss at some point. Will this be part of the extension, or yet another extension?

Adding @vorg @jeffdr @selim-bekkar-sb @tparisi to the conversation.

@all: What is your opinion about the number of parameter sets we should support? Would you vote for a single parameter set (and if so, which one), or for supporting both?

Let's keep in mind that glTF is a format for efficient asset delivery, not for editing. Furthermore, consider that, on the one hand, we certainly want to have an expressive model, but, on the other hand, we are still fine when we can serve 95% of the use cases (not necessarily 100%).

Please add anyone you like to this vote - and thanks in advance for your votes :-)

jeffdr commented 8 years ago

I would vote 'glossiness', but I don't have a strong opinion. Not much need to support both IMO since one is pretty easily derived from the other for conversions.

vorg commented 8 years ago

I would go for Metallic-Roughness, with potentially changing Roughness to Smoothness = 1 - Roughness (as per Unity and ThreeJS)

mlimper commented 8 years ago

OK, it seems a lot of frameworks / implementations support both models, which could be a reason to go for both.

tsturm commented 8 years ago

@mlimper - That is right, a lot of frameworks support both material workflows including Marmoset Toolbag, Sketchfab, Unity, Substance Designer etc.

cedricpinson commented 8 years ago

Yeah, a lot of Sketchfab users use the metal-roughness workflow, but some use specular-glossiness. I don't have a strong opinion on that, but both are widely supported in software, so it's maybe better to have both in the spec.

cedricpinson commented 8 years ago

How is the F0 texture handled? Do you plan to support it?

tsturm commented 8 years ago

Yes, I think it makes total sense to support F0 textures in the metal-roughness workflow, to change the constant F0 value for dielectrics.

pjcozzi commented 8 years ago

glTF is designed for runtime; often there may be many workflows that generate the same runtime data, so I would only support the two different approaches if one cannot be faithfully converted to the other and both are widely used; otherwise, supporting both adds spec and client complexity that glTF aims to avoid.

RemiArnaud commented 8 years ago

Yes, glTF is for runtime, but I would be careful with your statement above. glTF's intent is not to dictate how a runtime works, or to unify everything into one single runtime implementation. However, I do agree that client complexity should be avoided, but not by imposing a rendering model. glTF is a common language and a mechanism for runtimes to get what they need, and only what they need, through a parametrised offline or online pipeline conversion process.

Going back to PBR: there is no clear winner, and several rendering approaches should be available with glTF. It won't make clients more complex, as each client should get only the representation it needs.

mlimper commented 8 years ago

It won't make clients more complex, as each client should get only the representation it needs.

I am not sure that should be the way to go, as we will want clients to support either the full extension or nothing. Otherwise, we could end up with fragmentation, where each client / writer combination supports only one model, which would lead to glTF PBR assets not being exchangeable across implementations.

To judge whether we should support both, I guess it makes sense to have a simple example implementation first, so one could clearly see what would happen in a client that aims to support both.

cedricpinson commented 8 years ago

I have an old example here https://cedricpinson.github.io/osgjs-website/examples/pbr/ and the code for using the different workflows (metalness-roughness / specular-glossiness) is here: https://github.com/cedricpinson/osgjs/blob/master/examples/pbr/shaders/pbrReferenceFragment.glsl#L248-L255 It could be simplified.
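For readers who don't follow the link: the switch is roughly of the following shape (a paraphrase, not the actual osgjs code; the texture channel layout here is an assumption):

```glsl
// Paraphrased sketch of a compile-time workflow switch. Inputs are the
// texel values sampled from the material's two textures; the channel
// layout (glossiness in alpha, metallic in blue, roughness in green)
// is assumed for illustration.
void decodeMaterial(vec4 texelA, vec4 texelB,
                    out vec3 albedo, out vec3 f0, out float roughness) {
#ifdef SPECULAR_GLOSSINESS
    albedo    = texelA.rgb;        // diffuse color
    f0        = texelB.rgb;        // three-channel specular color
    roughness = 1.0 - texelB.a;    // glossiness stored in alpha
#else
    float metallic = texelB.b;
    albedo    = texelA.rgb * (1.0 - metallic);
    f0        = mix(vec3(0.04), texelA.rgb, metallic);
    roughness = texelB.g;
#endif
}
```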

mlimper commented 8 years ago

Thanks, very cool! Looks very straightforward.

tparisi commented 8 years ago

@mlimper I agree 100% we should not be working on client-specific PBR here... defeats the purpose of glTF and could lead to fragmentation.

RemiArnaud commented 8 years ago

@tparisi @mlimper

It's never been a glTF goal that runtimes must be able to load all glTF content. It's nice when they can, but the main goal is to make it easy for a runtime to use the language, not to unify and limit usage.

It's more important for glTF to satisfy the needs of direct consumption of 3D content, meaning the ability to represent what is needed, and only what is needed, by a given runtime, than to ensure that it is possible to write a universal player.

emackey commented 8 years ago

only what is needed by a given run-time, than to see that it is possible to write a universal player.

I'm perhaps biased coming from Cesium, but certainly I think Cesium wants to be able to ingest as many glTFs as possible. I would imagine ThreeJS has the same goal.

If we end up in a state where the typical glTF model is highly tuned for a particular target runtime, and can't be loaded by a generic ThreeJS glTF loader or a generic Cesium glTF loader, then glTF is not going to lay claim to being the "JPEG of 3D" as was mentioned at the Khronos BoF this year.

There are already 3D formats that target specific runtimes. There are already formats like OBJ that can deliver simple geometry to almost any 3D application, so long as you don't need PBR shading on such objects. I hope I'm not alone in expecting glTF to fill a long-standing gap in standards, delivering models with high-quality shading and animations to a range of generic 3D model loaders on multiple platforms.

mlimper commented 8 years ago

I hope I'm not alone in expecting glTF to fill a long-standing gap in standards, delivering models with high-quality shading and animations to a range of generic 3D model loaders on multiple platforms.

Agreed, 100%!

glTF will be the "JPEG for 3D" - or maybe the "PDF for 3D" (although PDF already has 3D... but that's a different topic ;-) ). That is, it will be a standard 3D delivery format, where assets should be exchangeable across different rendering platforms and look as similar as possible.

emackey commented 8 years ago

So to bring this back on topic, let's consider the question of which PBR parameter set to use. I'm not an expert in these parameter sets, but in light of the above discussion, it seems clear that splitting them into two separate PBR extensions would introduce too much fragmentation. This fragmentation becomes a show-stopper issue if core GLSL is replaced by core PBR as proposed in #733. The remaining options then, as best I understand them, are: (A) ask readers to support both parameter sets, or (B) settle on a single set and have writers convert to it.

Is there a simple conversion from one parameter set to the other, without loss of visual fidelity? If there's a straightforward conversion, then it makes sense to take option B: include that conversion in the glTF writers, making the job of the readers that much simpler, which is a key goal.

But if there's no deterministic conversion that preserves the artist's intent for the model (that is, if converting one to the other causes loss of perceived quality), then unfortunately we must consider asking readers to support both parameter sets. We would run the risk that some readers may choose to only implement one of the two, causing fragmentation.

jeffdr commented 8 years ago

It sounds like, given the goals of this format, option B is preferable. If that's the case, then I would feel fairly strongly that the metalness parameterization should not be used there (use specular/gloss instead). The reason being that it's fairly easy to convert metalness/roughness to specular/gloss, but the reverse is not always possible. The specular setup is the most general purpose; so if you're going with only one in the official specification, that should be it.
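A sketch of why the conversion only runs one way (a hypothetical helper, assuming the common ~4% dielectric F0 convention):

```glsl
// Metal/rough -> spec/gloss always works: every metal/rough material
// has an exact spec/gloss equivalent.
void metalRoughToSpecGloss(vec3 baseColor, float metallic, float roughness,
                           out vec3 diffuse, out vec3 specular, out float gloss) {
    diffuse  = baseColor * (1.0 - metallic);
    specular = mix(vec3(0.04), baseColor, metallic);
    gloss    = 1.0 - roughness;
}
// The reverse has no general solution: an arbitrary RGB specular color
// need not lie on the line between vec3(0.04) and any baseColor, so no
// (baseColor, metallic) pair reproduces it exactly.
```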

significant-bit commented 8 years ago

From a toolsmith's perspective: if I have a Blender --> Unreal pipeline, and both are using the same parameters / material model, the asset delivery format should support that. Converting to other parameters on save and then reverse-converting them on load doesn't help me at all. Would this use case be better served with custom shaders instead of a "standard PBR" extension?

Insisting glTF should support only one seems like insisting images should be either color or grayscale. The JPEG of 2D does both. QED

jeffdr commented 8 years ago

The idea, I suppose, is that you wouldn't be converting them on load - everyone writing apps conforming to the glTF spec would just use the specular model for their shaders, which would always match the content.

That's a problem for people who might have their own reasons to use metalness shaders at runtime (though off the top of my head I can't think of any), and so I'll have to defer to the group to decide whether supporting only one workflow is wise for glTF. But if it is going to be only one, it should be "specular".

emackey commented 8 years ago

Thanks @jeffdr for the conversion clarification.

@significant-bit I hadn't considered the case where a runtime actually wants the metal/rough parameters that are hard (impossible?) to convert to in a lossless way. Does the Unreal pipeline really require metal/roughness parameters, or enable some rendering with those parameters that can't be accomplished with spec/gloss? What would this pipeline do with glTF models that did use the spec/gloss parameters?

In general, it sounds to me like the primary PBR transmission format should be spec/gloss, with glTF authoring toolchains strongly encouraged to convert any metal/rough to spec/gloss parameters on the way through. I would imagine that most rendering engines, certainly the WebGL-based ones, would find the spec/gloss model more similar to classic rendering and have an easier time supporting that (remembering that we're trying to limit complexity of the loaders, since they may be on mobile devices etc).

If there are real, concrete advantages for some engines in rendering directly with metal/rough, then perhaps a separate metal/rough PBR extension is called for after all. This would enable glTF to target these runtimes directly, but again, it would lead to fragmentation. In this case I would think that #733 would promote specifically the spec/gloss PBR to core glTF, and leave metal/rough as just an extension. The extension may not gain much traction unless it is trivial for the runtime to convert the metal/rough textures to spec/gloss on the fly. From WebGL's point of view, it's always better to preprocess what you can during authoring, rather than make the client do it at runtime.

What do folks think of this interpretation?

vorg commented 8 years ago

Hi guys, I've been following the spec evolution since the beginning and have always been on the metallic side, so I feel like it's the last chance to sum up what we are giving up here if we go with specular:

So: personally, I prefer metalness, as I think it is conceptually more elegant (color + roughness + mask) versus the specular workflow (diffuse colors + specular colors + inverted roughness), even if I'm giving up flexibility. That said, I'll adapt, as I want to build my tools around glTF, not the other way around.

emackey commented 8 years ago

Interesting point that the metallic workflow saves two channels of texture data; that could make a compelling argument to support both.

Earlier in this thread, @cedricpinson posted sample code that shows the choice of metal/rough vs. spec/gloss as being just an extra 4 lines of fragment shader code inside a compiler directive. Is that really all it takes to support both? Not counting glTF parse code, of course. So adding a couple of lines of GLSL and using the metallic workflow frees up two channels of texture memory?

stephomi commented 8 years ago

I would say I'm in favor of 2 workflows (in one single extension). But if I'd have to choose for one workflow, I'd go for the metalness.

The metalness workflow saves one texture channel (and most of the time two channels, since the F0 channel is seldom used).

The specular workflow takes more memory and allows more freedom, but is more prone to error (notably, breaking energy conservation). In some cases, though, it has better filtering (you can find a comparison in the awesome Allegorithmic PBR guide that everyone should read), so rendering-wise it's sometimes better.

would find the spec/gloss model more similar to classic rendering and have an easier time supporting that

It's similar only in name; if you naively translate old classic-rendering textures to a PBR specular workflow, you'll end up with a model that is way too bright.

Also, about glossiness/roughness: as @vorg pasted in his survey, they are completely independent from the metalness/specular workflow. UE4 uses metalness and roughness, while other engines use metal + glossiness. Maybe the two channels should be present regardless of the workflow too (although mutually exclusive).

pjcozzi commented 8 years ago

To clarify glTF's general position on one parameter set vs. two, the spirit of glTF is to make the client as simple and efficient to implement as possible by moving complexity to the content pipeline, just like proprietary runtime formats.

Very loosely speaking, one PBR parameter set would be preferred, but if the implementation burden of having both is not significant and it allows a much wider array of use cases and/or performance tradeoffs, then it would still be within the spirit of glTF. So, @emackey's question is important to help decide this:

Earlier in this thread, @cedricpinson posted sample code that shows the choice of metal/rough vs. spec/gloss as being just an extra 4 lines of fragment shader code inside a compiler directive. Is that really all it takes to support both? Not counting glTF parse code, of course. So adding a couple of lines of GLSL and using the metallic workflow frees up two channels of texture memory?

stephomi commented 8 years ago

Yes, that's all it takes.

Same for the roughness/glossiness conversion (except in the annoying case where some PBR engines do a remapping, e.g. Unity, but that's not really relevant to the discussion, as the issue exists regardless of the roughness/glossiness choice).

RemiArnaud commented 8 years ago

"As we focus on PBR (issues), it may very well become core spec, and this is the direction the industry should move" says pjcozzi in https://github.com/KhronosGroup/glTF/issues/733

Which is a different goal than the subject of this thread first suggests ("PBR extension"). I personally welcome this objective of having a PBR-based core generic material / lighting model.

Having two very well specified PBR modes in core is not a problem IMHO, especially if the application can ask the server for what it wants and receive only what it needs. We are in the era of on-demand streamable content, not pre-packaged immutable files (or game cartridges).

emackey commented 8 years ago

It sounds like supporting both is a valid strategy. Remi, I don't think the typical REST pattern (of which I'm a huge fan) is the right approach here. The client doesn't have to ask the server for a more preferable parameter set, since "preferable" depends on the model itself. If the model was authored with the metal/rough parameter set, then that's the preferable one. Such models are more tightly constrained to the intentions of PBR, and take less texture memory. But, some models will have been authored without metal/rough, using the more traditional spec/gloss model. Those models typically go outside the metal/rough constraints, and take more memory as a result, and there's no way to automatically convert them over to metal/rough models. So, the server should always return metal/rough if the model was authored with that style. But, lots of older models won't have that mode to offer, and will use the spec/gloss instead, with no conversion path available. (Someone chime in if I got any of this wrong :)

jeffdr commented 8 years ago

Seconding what Ed wrote here. The reason we're considering the two formats is that they are not always interchangeable, so it won't solve the problem to have clients ask servers for whichever they'd prefer.

Myself I'm coming around to the idea of just supporting both. It's a small burden on the client, but it grants so much more compatibility that it's probably worth it.

RemiArnaud commented 8 years ago

Those last comments are identical to what I wrote much earlier in this thread: glTF should be able to store either PBR model, and runtimes are free to implement one or both, depending on their goals.

It's nice to know that the difference in code is minimal, which makes it easy for a runtime to implement both. But is it really possible? What happens when you mix and match models created for different PBR lighting models?

As Ed mentioned, it may or may not be possible for a server to send the representation requested (preferred) by the client, although it is generally preferable to put that burden on the server rather than the client. What the client does when this happens is not specified by the glTF spec, AFAIK.

jeffdr commented 8 years ago

I'm not quite sure what you're asking Remi - what happens when a client has to use both metalness and specular materials? The client would have to build multiple shaders or be willing to use dynamic branching to do both at once. But the two models coexist in scenes together just fine all the time.

RemiArnaud commented 8 years ago

@jeffdr - yes, I am wondering if one runtime can mix and match models using different PBR light models in one single render. I never studied that; I've only used one or the other model.

mlimper commented 8 years ago

Wow, great discussion!

Looks like there are reasons to go for both (such as choosing between freedom of expression vs. memory efficiency), if they are not too complicated to handle on the client / runtime side.

@tsturm and I are currently trying to set up a simple WebGL example application that illustrates how a glTF asset, using the current draft of the extension, would be rendered (including comments, shader code, etc.). We'll provide the example for both material models for now, so we can see what a runtime implementation for the current draft extension would look like.

stephomi commented 8 years ago

yes, I am wondering if one runtime can mix and match models using different PBR light models in one single render. I never studied that; I've only used one or the other model.

Metal vs. Specular are not "different PBR light models"; they are just two different ways to represent the same material (using the same PBR light model), the differences being the number of channels used, possible rendering artefacts due to texture filtering (transitions between metal/non-metal), and more flexibility in the specular workflow (with the risk of screwing up energy conservation).

I don't think it's common practice to mix the two workflows in a game engine, but don't take my word for it :). If you do deferred shading, you'll probably have to stick with the metalness workflow, and if you use a specular texture, you'll lose the somewhat controversial flexibility advantage (and I'm not sure the filtering advantage is worth it).

A special shader for SSS is a different PBR light model (one that also requires different maps). Unfortunately, PBR SSS is not as standard as the metal/spec workflow, so it's probably too early to introduce a glTF SSS extension.

erich666 commented 8 years ago

Wow, good discussion. Some comments:

@vorg writes:

I would go for Metallic-Roughness, with potentially changing Roughness to Smoothness = 1 - Roughness (as per Unity and ThreeJS)

I'm not sure where you're finding Smoothness for Three.js. For Three.js's MeshStandardMaterial (which is meant to be PBR-ish and their new "standard"), they use metalness and roughness. Maybe you were looking at some proposal or old code? Anyway, they use Roughness, FWIW (and what it's worth is a lot: three.js will probably be the largest consumer of glTF, as a guess). The logic for three.js's material is presented here.

Myself I'm coming around to the idea of just supporting both. It's a small burden on the client, but it grants so much more compatibility that it's probably worth it.

I would rather have just one, for simplicity's sake. "I want to use PBR." "Well, you have two choices..." Not helpful to the naive user, and I appreciate that most vendors themselves pick just one (unfortunately, two different ones). The global illumination people in our little cloud/web group at Autodesk use Roughness/Metal, since they like the more principled approach. If I read the chart correctly, it's only CryEngine that's not using this model, right? Why do we care, then? The fact is, any serious game engine that deeply cares about the look will be providing their own custom shader to glTF anyway, they won't use this proposal.

Brian Karis (PBR shader guy for Unreal Engine, this paper of his is the basis for some interactive PBR shaders) wrote to me about glTF:

Epic wouldn't really be interested in this file format due to its very limited fixed function. We have some interest in an open format for sharing material node graphs like what Lucasfilm's MaterialX is trying to do.

So this PBR proposal is not going to satisfy the game engine types. I see the PBR proposal as a good first step in getting naive and even advanced users a baseline material that will respond nicely to lights, both point lights and image-based (and that's enough challenge right there).

@mlimper wrote:

Let's keep in mind that glTF is a format for efficient asset delivery, not for editing. Furthermore, consider that, on the one hand, we certainly want to have an expressive model, but, on the other hand, we are still fine when we can serve 95% of the use cases (not necessarily 100%).

I fully agree on all counts. I'd frankly probably just use three.js's MeshStandardMaterial as the PBR implementation and call it a day, with the one caveat that I'd require per-surface gamma to be on, as a minimum. See issue https://github.com/KhronosGroup/glTF/issues/700

My very long comment here summarizes some of the differences of the three.js model from others out there. The devil's in the details, and we need to fully specify these details so that there's consistency of display among applications.

pjcozzi commented 8 years ago

@erich666 makes a compelling point that this would be the base PBR standard and highly specialized folks like Epic would do something more involved or specific. @stephomi made a similar point above about subsurface scattering (SSS).

@tsturm and me currently try to setup a simple WebGL example application that illustrates how a glTF asset, using the current draft of the extension, would be rendered (including comments, shader code, etc.). We'll provide the example for both material models for now, so we can see how a runtime implementation for the current draft extension would look like.

@mlimper no rush, of course, but do you know when this would be ready? This is one of the most interesting glTF threads I've ever seen, but code almost always makes the spec decisions obvious. I think we could quickly look at this and decide it is (1) too complicated/confusing to have two parameter sets, or (2) simple enough and allows more use cases.

erich666 commented 8 years ago

To make sure we're all on the same page, see this reference for what the specular/glossiness model is. No equations per se, which is too bad, but some explanation of the differences between the two models.

This is what @mlimper wrote me about the two models:

In fact, the “glossiness” in the one material parameter set is simply the inverse of “roughness” in the other one. Therefore, in the equations, (1-glossiness) is the “alpha” parameter. Further information about the two material models, and how they relate to each other, can be found in the PBR guide from Allegorithmic.

On page 16 it indeed says of glossiness, "In this map, black (0.0) represents a rough surface and white (1.0) represents a smooth surface. It is the inverse to the roughness map in the metal/roughness workflow." So glossiness itself doesn't matter, it doesn't give us anything that roughness doesn't have.

The specular map is different: "The specular map defines the reflectance values for metal and the F0 for non-metal as shown in figure 23. This RGB map allows for different values for dielectric materials to be authored in the map. This is different from the metal/roughness workflow where dielectrics are hard-coded at 4% reflectivity and can be modified only through the specularLevel channel."

I can see the argument for specular/glossiness from an artistic control point of view: for example, you can have separate diffuse color maps and specular color maps applied to the surface. However, the current glTF PBR proposal doesn't have this feature, only a single diffuse map, so this additional functionality (specularTexture) is missing.

Even if this specular color map was added, a serious problem is that the Substance shader approach has artifacts, as shown on their pages 11 and 16. They are trying to put two materials into one shader. In this PBR proposal we're not making a thin layer car paint shader here, where one layer does one thing, the undercoat another; we're trying to express one single material type. Throughout Allegorithmic's guide they attempt to force multiple materials to be evaluated in a single shading pass, which gets them into trouble, as they show.

This took a while for me to understand, but now I get it: if you linearly interpolate various parameters between a metal and a non-metal, the interpolated parameters used in the shading equations don't give a sensible interpolated result. The right way to handle transitions between two different materials would be to evaluate the material shading model at all individual texture taps (so properly evaluating each material with its own parameters) and interpolate those results, something we typically don't do in a GPU shader. Evaluating multiple shaders in separate passes and properly blending the results is a better (though more costly) approach.
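Stated compactly (my notation, not from the guide): texture filtering hands the shader the left-hand side, while the physically sensible blend of two materials is the right-hand side, and for a shading function $f$ that is nonlinear in its parameters $p$ the two differ:

$$ f\big((1-t)\,p_a + t\,p_b\big) \;\neq\; (1-t)\,f(p_a) + t\,f(p_b) \quad \text{in general.} $$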

One expert on PBR at Autodesk (Miloš Hašan) notes:

"Metallic factor" between 0 and 1 can then simply be defined by a linear blend of the two cases above, which is equivalent to linearly blending the f0's and diffuse colors.

A legitimate use of this is for rusty metal. One can have a mask specifying where the (dielectric) rust is; fractional values are useful for having soft edges in the mask. I would also be cautious about the "dusty metal" example; rendering that correctly is beyond the scope of these simple models, but with some artistic skill it can probably work OK.

So, it's something the shader can do, if you don't look too closely at fringing artifacts, but you're really not doing it right and shouldn't expect to get the right answer with either model. Basically, I'd like for us to avoid "can it do a dusty metal?" or "can I put a metal decal on my plastic model in a single shader?" or other use case where we know a single pass won't work. So, is there anything to be gained by specular/glossiness instead of roughness/metalness if we use the shading model correctly?

Also, it seems difficult to control the specular/glossiness model and actually be a PBR type of model. The guide notes, "For example, a white (1.0) diffuse and a white (1.0) specular value can combine to reflect/refract more light than was initially received, which in turn breaks the law of conservation. This means that when authoring the textures, you wouldn’t see the actual result corresponding to the texture data." I would personally avoid a PBR-based shading model that lets you make non-PBR shaders, as it kind of misses the point.

Miloš also notes:

Roughness vs. glossiness. (User-facing) roughness is now commonly defined such that alpha (the coefficient in Beckmann, GGX, or classic Ward BRDF) is obtained as alpha = roughness^2; this convention is used by Disney, Prism materials, Substance, and possibly others. It has a reasonably "perceptually linear" feel to it, while being simple and easy to remember.

In our experience, undefined or vaguely defined glossiness is a common reason for mismatches between renderers. In my opinion, it is better to stick to roughness across the board. In other words, the metal/roughness model should ideally be the only model (and there's no reason to have "metal" in the name, since it works well for non-metallic materials).

The big gain for specular/glossiness seems to be that the specular color can be varied separately from diffuse, which is nice but doesn't follow PBR theory for a single material (from what I understand). I don't want to be too much of a theory purist here, but if we call a model PBR, I'd like it to be that, not a hack that lets people stuff two different materials into one shader when they really shouldn't do so.

If specular/glossiness is something that people think is useful, I would suggest it be proposed as a separate "artistic" shader, one that is not called PBR, and put in a separate proposal.

stephomi commented 8 years ago

Even if this specular color map was added, a serious problem is that the Substance shader approach has artifacts, as shown on their pages 11 and 16. They are trying to put two materials into one shader.

I'm confused by what you mean by two materials. Do you mean metals and non-metals? Isn't that what the so-called standard PBR model is all about? They are not mixing materials, and their shaders are actually very representative of what most PBR implementations are based on (you can even see the documented GLSL shader code in Substance Painter; I couldn't find it online though, and I don't know if it's shareable). Discarding the whole PBR-ness of a shader because of a filtering issue is kind of rude :)

Basically, I'd like for us to avoid "can it do a dusty metal?" or "can I put a metal decal on my plastic model in a single shader?" or other use case where we know a single pass won't work.

Why would there even be a metalness texture if you can't mix metal and dust/plastic/whatever-dielectric? Of course there's this filtering artefact (which is less noticeable in the spec workflow), but that's just an implementation limitation (one that can be worked around artistically, and that most 3D engines live with just fine); the texture/material representation underneath is still valid. Quixel/Substance/3D-Coat are full of dusty robots :).

I would personally avoid a PBR-based shading model that lets you make non-PBR shaders, as it kind of misses the point.

(Edit: just saw this post, which is relevant to this question: http://www.marmoset.co/toolbag/learn/pbr-conversion#mm) The metal workflow doesn't guarantee that your output will be PBR. For example, the albedo texels usually should be within a certain range depending on whether the metal texture says it's metal or dielectric: the shader won't guarantee that; it's the texture's job to do so. Of course, I think everyone agrees that the specular workflow is much more prone to error if you don't use a PBR texturing tool.

I don't really have a strong opinion on "metal only" vs. allowing "metal and spec". The main advantage of allowing the 2 workflows was that the main texture-authoring tools (substance/3dcoat/quixel) can all export to both workflows. So I was just thinking it's in the format's best interest to be prepared for that; but as of today, in real time, the metalness workflow is winning, so it might not be much of a loss.

Also, what about the specularF0/specLevel channel (reflectivity for dielectrics at a 0-degree angle) in the metalness workflow? It was addressed earlier, but the discussion didn't really pan out. Unreal supports it, for example. The flexibility you lose with the specular workflow can typically be regained with this channel. To be honest, it's seldom authored by artists and it doesn't change the rendering drastically, but if glTF sticks with metal only, I think it might be a valuable addition, at least as a single factor.
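For illustration, such a channel might slot into the metalness workflow like this (an assumed convention, loosely modeled on Unreal's "Specular" input, where the default 0.5 maps back to the usual 4%):

```glsl
// Hypothetical: specularLevel in [0, 1] rescales the dielectric F0;
// 0.5 reproduces the hard-coded 4% reflectivity.
vec3 computeF0(vec3 baseColor, float metallic, float specularLevel) {
    vec3 dielectricF0 = vec3(0.08 * specularLevel); // 0.5 -> 0.04
    return mix(dielectricF0, baseColor, metallic);
}
```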

pjcozzi commented 8 years ago

An observation about glTF's impact thus far that could help guide us on excluding specular/glossiness, modifying metallic/roughness, etc.: glTF is starting to set the standard for the "right way" to do things. I have talked to several developers who learned keyframe animation or skinning based on how glTF does it (the skinning is actually a tad too complex, but we are fixing that). The PBR extension is likely to set the same kind of standard, so we would like it to be true to its name and to common industry practice, whatever we decide that to be.

stephomi commented 8 years ago

[...] is that glTF is starting to set the standard of the "right way" to do things.

In my opinion, there's only one good way to know. It's not the easiest task and it's a bit boring, but it's the only way to avoid being excluded from the get-go (or in the near future).

[...] standard so we would like it to be true to its name / common industry practice

My first thought was (very off-topic, so big sorry in advance):

  • The biggest industry 3D engines (game engines, at least) don't care at all about WebGL vs. DirectX (vs. PS4 stuff, etc.). So by targeting *GL only, and going as far as putting all the GL flags inside the format, I have big doubts about how glTF could be used outside the WebGL ecosystem.
  • For example, the textures: I don't know any 3D texturing tool that exports textures upside down by default (the flipY OpenGL flag); for me, that's not really targeting industry practice.

Maybe I'm confused about what glTF is really targeting. @erich666 mentioned at some point that three.js might be the biggest user of glTF; is this a shared opinion?

erich666 commented 8 years ago

@stephomi writes:

I'm confused by what you mean by two materials. Do you mean metals and non-metals? Isn't that what the so-called standard PBR model is all about?

The PBR we're talking about indeed allows metals and non-metals. Where you get into trouble is when you try to represent metals and non-metals in the same single set of texture maps defining a specific object, for the reasons I mention. You can get artifacts if you do this, instead of taking an approach of applying two separate PBR shaders, one after the other, to the same surface.

Please do look at the Allegorithmic document, which this proposal uses as a touchstone and a reason for having two models. See pages 11 and 16. They are trying to mix a metal and a dielectric on the same surface. In other words, they're trying to have a single shader evaluate two separate materials on the same object. This works fine for individual samples (e.g., if you didn't interpolate between texture samples), but as soon as you interpolate you start to feed bad "part-metal, part-not" parameters into the shading equations and get white or black fringes. I'm not saying that it's illegal and immoral to use such shaders, you just have to expect artifacts. Hide them with noise :)

Both models have this problem, so it's not an argument for or against either model per se. What does matter is that, IMO, it weakens the use case of "dusty robots" as something by which to judge each model. Recall that Brian Karis at Epic (Unreal Engine) has already said that the PBR models proposed won't fulfill Unreal's artistic requirements, so they won't use them.

The argument I'm seeing for specular/glossiness is that it gives more artistic flexibility in mixing materials. I actually get this - one shader pass vs. two is certainly more efficient and uses less memory - but there are plenty of other parameters and maps that could be added to the model for other sorts of control, see the three.js MeshStandardMaterial for a long list not included in this proposal. I wouldn't add all these, I'd keep it simple (that said, there needs to be some discussion of how the equations work with point lights vs. environment map lighting).

Of course, I think everyone agrees that the specular workflow is much more prone to error if you don't use a PBR texturing tool.

My main point is that specular/glossiness is not an energy-conserving system in any real way, which to my mind disqualifies it from being called a PBR-based model. Allegorithmic agrees with you and says the same on page 13. If you have to instruct users not to set both diffuse and specular to 1.0 because that's not energy-conserving, that's bad. Roughness/metalness doesn't have this problem.

We're not going to make the pros happy. The pros are pros, though: they can export their own shaders and control these however they wish. I see a PBR material as an attempt to have a good default material for users, one that when you change the lighting conditions, you don't have to rebalance a bunch of parameters (other than perhaps an overall exposure parameter for the scene). That's where a lot of game developers find the value of PBR: artists don't have to mess with materials when the lighting conditions change. I believe that's a good goal for basic users in general: use the PBR material and your results will have some sort of realism to them, they'll respond in a sensible way when you change the view or lighting. (That said, pure metals will be pretty odd-looking if you don't use environment lights. I suspect many users will cheat and make things more half-metal, which are rare in the real world.)

Maybe I'm confused about what glTF is really targeting. @erich666 mentioned at some point that three.js might be the biggest user of glTF; is this a shared opinion?

We'd all be in denial if we thought glTF didn't have anything to do with OpenGL/WebGL. ANGLE shows that you can go from OpenGL to DirectX quite well (line thickness being the main stumbling block, as I recall, and almost no one cares about that, certainly not in gaming), so I don't think we're including anyone out.

The main advantage of allowing the 2 workflows was that the main texture-authoring tools (substance/3dcoat/quixel) can all export to both workflows.

That's the strongest argument in favor of having both. But, as I mentioned, the current proposal doesn't have the specularTexture for the specular/glossiness model, for example. It's fine to add it in, and as you point out, there are other things that are worthwhile and could be added in. Where does this process stop? My feeling is "make it a separate model, don't call it PBR." I'm for telling people "here's a reasonable starting material, especially with image-based (environment) lighting," vs. telling them "here are two materials, one's more PBR than the other, but go use which one you want." Either one is fine, I guess, but personally I'd start small and add one and call it PBR, then add another "artistic" model if there's demand.

If there's enough push for specular/glossiness to be added (and, really, @mlimper and friends are probably the biggest deciders, in a sense, in that they're doing the real work), all I ask is that there's a sample reference, so that people using this model can properly modify their data. Which is a question in its own right: roughness/metalness is pretty well defined at this point, with some variation here and there. I'll repeat Miloš' comment here:

In our experience, undefined or vaguely defined glossiness is a common reason for mismatches between renderers.

Imagine your system stores values in your texture map for glossiness in one way, mine stores my own glossiness for my system in another way. At least one of us will have to convert our texture maps so that they fit with glTF's definition. Ugh. Easier to just write your own shader and export it.

Minutiae: I ran my earlier comments past Miloš, who actually knows what he's talking about when it comes to theory :) He agreed with them (whew), with some minor points:

Specifically for Schlick approximation, blending the f0 values between two BRDFs (that only differ in f0) gives the same result as blending the final BRDF values. (Homework: prove this claim. :) ) Of course this is not true in general, e.g. blending the index of refraction or roughness values linearly will not be the same as blending the BRDFs. But for Schlick f0 it works.

In addition to f0, one can introduce an additional "specular coefficient" ks, which simply scales the specular lobe. This should be used with care since in most cases ks=1 gives the right look, but an artist may want to modify it to approximate self-occlusion, fingerprints and such.
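The homework is short: with $s = (1-\cos\theta)^5$, Schlick's approximation is affine in $f_0$,

$$ F(f_0) = f_0 + (1 - f_0)\,s = (1 - s)\,f_0 + s, $$

so evaluating it at a blended $f_0$ equals blending the evaluated terms:

$$ F\big((1-t)\,f_a + t\,f_b\big) = (1-t)\,F(f_a) + t\,F(f_b). $$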

lexaknyazev commented 8 years ago

My first thought was (very off-topic, so big sorry in advance):

  • The biggest industry 3D engines (game engines, at least) don't care at all about WebGL vs. DirectX (vs. PS4 stuff, etc.). So by targeting *GL only, and going as far as putting all the GL flags inside the format, I have big doubts about how glTF could be used outside the WebGL ecosystem.
  • For example, the textures: I don't know any 3D texturing tool that exports textures upside down by default (the flipY OpenGL flag); for me, that's not really targeting industry practice.

Sorry for being off-topic, we are trying to address these two points in #733 and #739 for the next glTF revision (post 1.1). Please consider reviewing them for possible PBR issues.

jeffdr commented 8 years ago

If a little bit of market data would be helpful, I can provide some. I write a tool with a PBR renderer that is in use by several thousand artists today, and the breakdown we see between specular and metalness is roughly 50/50. So there really isn't a dominant workflow right now. You'll want to either support both, or pick the one that is most versatile (specular). Otherwise you'll have a serious barrier to preparing content for your format. I'd like to be able to support glTF export from Toolbag sometime, but if the definition is metalness-only, that makes my task harder than if it were anything else. I've mentioned this already in this thread, and why automatic conversion from specular to metalness doesn't work, so I'll just leave it at that.

The only other thing I'll add is that the claim that the specular model doesn't enforce energy conservation is false. All you have to do is mask the diffuse light by the specular reflectance; this is very analogous to what the metalness model does as well. In fact you have the additional advantage that if you want to support old "non-PBR" art, all the specular model has to do to render them correctly is skip that masking step for those materials.
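One common form of the masking jeffdr describes, as a sketch (the exact term varies between engines):

```glsl
// Attenuate the diffuse term by what the specular lobe already reflects,
// so the two together cannot exceed the incoming light. Skipping the mask
// is what lets old non-PBR art render "as authored", as noted above.
vec3 conservingShade(vec3 diffuseColor, vec3 specularColor,
                     vec3 diffuseLight, vec3 specularLight) {
    vec3 kd = vec3(1.0) - specularColor; // energy left over for diffuse
    return kd * diffuseColor * diffuseLight + specularColor * specularLight;
}
```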

erich666 commented 8 years ago

@jeffdr thanks for speaking up and bringing this topic up.

I guess it comes down to the goal of the PBR material, which I frankly don't know. Is it to give a starting material for intermediate users? I like that goal, as naive users may stick with Blinn-Phong and experts export their own shaders. But if the goal is to support various creative applications and let users more easily create web-ready content from these applications, then that's something else. I suspect that goal's very hard to achieve, from my own experiences over two decades at Autodesk. Getting Max and Maya and various other packages in the same company to match or transfer rendering styles is quite difficult and takes a lot of effort. Even within the web services group I'm in, we spend an inordinate amount of time getting our web viewer to match our desktop rasterizer to match our ray tracer - it's surprising, you'd think it's copy-the-code easy, but there are always differences in capabilities and in light sampling methods. Products in different companies are very unlikely to match, even if they use the "same" parameters, so a PBR material for these products is unlikely to fulfill their needs.

Or do they mostly use the same equations for specular/glossiness (or roughness/metalness, for that matter) right now? I'd be (happily) surprised, if so. Actually, I guess those using specular/glossiness just have to convert to the common reference model.

If they can't, I guess the fallback position is, "the export sort of looks like what I designed" - OK, but that doesn't sound like a customer-pleaser. I'd hope the various packages would export their own shaders to maintain the look (though admittedly that's work to add).

All you have to do is mask the diffuse light by the specular reflectance; this is very analogous to what the metalness model does as well.

Fair enough, and not what the spec says currently.

In fact you have the additional advantage that if you want to support old "non-PBR" art, all the specular model has to do to render them correctly is skip that masking step for those materials.

Fair, too, but another parameter missing from the spec; call it "usePBR" and set it to false.

jeffdr commented 8 years ago

Yes that's a good point @erich666. Part of the goal of "Physically Based Rendering" is increased consistency between lighting conditions, attained by greater focus on material definition, and so any PBR spec should probably have consistency across renderers as an important goal.

On the other hand, I think we're likely to find that such consistency is not present among the ecosystem of art tools and renderers today, PBR or otherwise. The specular vs. metalness debate is just the tip of the iceberg. In recent years we at Marmoset have found differing selections of BRDFs, gloss/roughness interpretations, energy conservation approaches, and more. Not to mention all the 'other' stuff that many artists want - AO, cavity, emissive, detail maps, skin shading models, etc. So making a durable PBR definition is a formidable task, and there isn't going to be a single authority that can just tell this group exactly what the standard is - because there isn't one.

So my best advice is to just take a stand somewhere in the middle. Make a format that's simple, and has all the big features. Define in exact terms how you intend each map to be used, and a decent exporter will be able to convert pretty faithfully to your format.

stephomi commented 8 years ago

Products in different companies are very unlikely to match, even if they use the "same" parameters, so a PBR material for these products is unlikely to fulfill their needs. [...] Or do they mostly use the same equations for specular/glossiness (or roughness/metalness, for that matter) right now? I'd be (happily) surprised, if so.

For the PBR shader (metal/dielectric), they (mostly) do! When we implemented PBR at Sketchfab, we compared the rendering against tools such as Unreal, 3D-Coat, Marmoset, dDo, and Substance... With the exact same environment conditions and correct textures, the differences should be minor. The most annoying difference I have in mind is the roughness remapping, and you can still adapt (by knowing the source of the authored rough/gloss map, for example).

Not to mention all the 'other' stuff that many artists want - AO, cavity, emissive, detail maps, skin shading models, etc.

I'd say that AO, Emissive, NormalMap, and (height/depth)Map (for displacement/parallax) can be safely included for gltf.

For an SSS/skin shader, I'd say it's way more debatable; last time I checked, there was no consistency among 3D software (in either maps or implementation): thickness map / subsurface color / subdermis. But it was a while ago, and I'm not very knowledgeable on this particular subject, so I'd love to be proved wrong.

cedricpinson commented 8 years ago

I logged some data from sketchfab about PBR and channels usage https://github.com/KhronosGroup/glTF/issues/699#issuecomment-253219927