OGRECave / ogre

scene-oriented, flexible 3D engine (C++, Python, C#, Java)
https://ogrecave.github.io/ogre/
MIT License

Material Pass pointAttenuation not working as expected #3086

Closed Niproblema closed 3 months ago

Niproblema commented 3 months ago

System Information

Detailed description

First of all, setPointMinSize(1.0f) and setPointMaxSize(2.0f) are not taken into account anywhere; attenuation comes up with an arbitrary value that is not clamped to this range.

Setting setPointAttenuation(true, 1, 0, 0) results in huge points, even though it should be equivalent to attenuation being disabled.

Setting any other values is equally unhelpful, as the point size appears to grow arbitrarily with distance.
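
For reference, the calls in question look roughly like this (a minimal sketch; "PointsMaterial" and the function name are illustrative, not from the original report):

```cpp
#include <Ogre.h>

// Minimal sketch of the calls described above.
void configurePointPass()
{
    Ogre::MaterialPtr mat =
        Ogre::MaterialManager::getSingleton().getByName("PointsMaterial");
    Ogre::Pass* pass = mat->getTechnique(0)->getPass(0);

    pass->setPointSize(1.0f);     // requested base point size
    pass->setPointMinSize(1.0f);  // reportedly ignored: results are not clamped
    pass->setPointMaxSize(2.0f);  // reportedly ignored: results are not clamped

    // Expected to behave as if attenuation were disabled (constant factor of 1,
    // no distance terms), but reportedly produces huge points instead.
    pass->setPointAttenuation(true, 1.0f, 0.0f, 0.0f);
}
```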

paroj commented 3 months ago

note that in Ogre the given point size is scaled by the viewport height as in D3D. The parameters are passed here: https://github.com/OGRECave/ogre/blob/a98c9ee8a6b93f9cf2d0d1efdaf4e0b7002d3c74/RenderSystems/GL/src/OgreGLRenderSystem.cpp#L1293-L1320
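
To make the scaling concrete (a rough illustration only; the exact math is in the linked function, and the 1080-pixel viewport height is an assumption for the example):

```cpp
// Rough illustration of the viewport-height scaling described above.
float requestedMinSize = 1.0f;    // value passed to Pass::setPointMinSize()
float viewportHeight   = 1080.0f; // i.e. Viewport::getActualHeight()
float effectiveMinSize = requestedMinSize * viewportHeight; // ~1080 px on screen
```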

Niproblema commented 3 months ago

How can I use that to fix my attenuation? 🤔

paroj commented 3 months ago

you can place a breakpoint in the above function and check whether all values make sense

Niproblema commented 3 months ago

I get values minSize = 98 and maxSize = 980, which is completely wrong. I really need this to work. Could you suggest a workaround? Perhaps I could set it in a shader?

paroj commented 3 months ago

is that before or after this line?

minSize = minSize * mActiveViewport->getActualHeight();

Niproblema commented 3 months ago

After, at line

mStateCacheManager->setEnabled(GL_VERTEX_PROGRAM_POINT_SIZE, true);

paroj commented 3 months ago

then the values are plausible, as they were multiplied by your screen height.

If you want minSize = 1 px, don't call setPointMinSize(1), but rather setPointMinSize(1.0/1080) if you are at 1920x1080.
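
A sketch of doing that compensation programmatically, assuming `pass` is the Ogre::Pass* being configured and `viewport` is the active Ogre::Viewport* (and assuming the max size should be expressed in the same units):

```cpp
// Keep the effective minimum point size at ~1 pixel regardless of resolution.
// `pass` and `viewport` are assumed to come from the calling code;
// recompute this if the viewport is resized.
float onePixel = 1.0f / static_cast<float>(viewport->getActualHeight());
pass->setPointMinSize(onePixel);         // e.g. 1.0/1080 on a 1920x1080 viewport
pass->setPointMaxSize(2.0f * onePixel);  // assumes the max is scaled the same way
```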

Niproblema commented 3 months ago

Ah, you are correct, I managed to get it working... kind of. I find the attenuation parameters extremely confusing: setPointAttenuation(true, 1.f, 500.f, 500.f) seems to be a decent setting for my scene, but looking at the equation attenuation = 1 / (constant + linear * dist + quadratic * dist^2), I have no idea why such parameters would work.

I also noticed that enabling FSAA turns my billboard sprites into dots on my Intel GPU machine and, surprisingly, into solid polygons on an AMD GPU.
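
For what it's worth, plugging those coefficients into the formula as quoted gives a quick sense of the scale involved (a back-of-the-envelope sketch only; it ignores any square root or extra correction factor the render system may apply internally):

```cpp
// Back-of-the-envelope check of attenuation = 1 / (constant + linear*d + quadratic*d^2)
// with the coefficients mentioned above; purely illustrative.
#include <cstdio>
#include <initializer_list>

int main()
{
    const float constant = 1.0f, linear = 500.0f, quadratic = 500.0f;
    for (float d : {1.0f, 5.0f, 10.0f})
    {
        float factor = 1.0f / (constant + linear * d + quadratic * d * d);
        std::printf("d = %5.1f  ->  attenuation factor ~ %g\n", d, factor);
    }
    return 0;
}
```

Combined with the viewport-height scaling discussed above (roughly x1080 here), a factor of about 1/1000 at close range brings the size back to around the requested pixel value, which may be one reason coefficients in the hundreds end up looking reasonable for this scene.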

paroj commented 3 months ago

original issue resolved. feel free to open new issues for the GPU-specific problems