Note that in Ogre, the given point size is scaled by the viewport height, as in D3D. The parameters are passed here: https://github.com/OGRECave/ogre/blob/a98c9ee8a6b93f9cf2d0d1efdaf4e0b7002d3c74/RenderSystems/GL/src/OgreGLRenderSystem.cpp#L1293-L1320
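For context, here is a simplified sketch of what that function boils down to in fixed-function GL. The function name is illustrative, not Ogre's actual code: the real implementation routes these calls through Ogre's GLStateCacheManager, and glPointParameterfv needs an OpenGL 1.4+ context (or ARB_point_parameters).

```cpp
#include <GL/gl.h>

// Illustrative sketch only; Ogre's real code goes through its state cache manager.
void applyPointParameters(float size, bool attenuationEnabled,
                          float constant, float linear, float quadratic,
                          float minSize, float maxSize, float viewportHeight)
{
    if (attenuationEnabled)
    {
        const GLfloat att[3] = { constant, linear, quadratic };
        glPointParameterfv(GL_POINT_DISTANCE_ATTENUATION, att);
        // This is the scaling discussed in this thread: with attenuation on,
        // the min/max point sizes are multiplied by the viewport height,
        // matching what D3D does.
        glPointParameterf(GL_POINT_SIZE_MIN, minSize * viewportHeight);
        glPointParameterf(GL_POINT_SIZE_MAX, maxSize * viewportHeight);
    }
    else
    {
        glPointParameterf(GL_POINT_SIZE_MIN, minSize);
        glPointParameterf(GL_POINT_SIZE_MAX, maxSize);
    }
    glPointSize(size);
}
```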
How may I use that to fix my attenuation? 🤔
You can place a breakpoint in the above function and check whether all the values make sense.
I get the values minSize = 98 and maxSize = 980, which is completely wrong. I really need this to work. Could you suggest a workaround? Perhaps I could set it in a shader?
Is that before or after this line?
minSize = minSize * mActiveViewport->getActualHeight();
After, at the line
mStateCacheManager->setEnabled(GL_VERTEX_PROGRAM_POINT_SIZE, true);
Then the values are plausible, as they were multiplied by your screen height.
If you want minSize = 1px, don't call setPointMinSize(1); call setPointMinSize(1.0/1080) instead, if you are at 1920x1080.
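In other words, a minimal sketch of this workaround, assuming an Ogre::Pass* and a known viewport height (the helper function below is hypothetical, not an Ogre API):

```cpp
#include <OgrePass.h>

// Hypothetical helper: pre-divide the desired pixel sizes by the viewport
// height, so the GL render system's multiplication by the viewport height
// cancels out and you end up with the pixel sizes you actually asked for.
void setPointSizeRangeInPixels(Ogre::Pass* pass,
                               float minPixels, float maxPixels,
                               float viewportHeight) // e.g. 1080.0f
{
    pass->setPointMinSize(minPixels / viewportHeight);
    pass->setPointMaxSize(maxPixels / viewportHeight);
}
```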
Ah, you are correct. I managed to get it working... kind of. I find the attenuation parameters extremely confusing.
Calling setPointAttenuation(true, 1.f, 500.f, 500.f)
seems to give decent results for my scene.
Looking at the equation
attenuation = 1 / (constant + linear * dist + quadratic * dist^2)
I have no idea why such parameters might work.
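For what it's worth, here is a worked sketch of why such large coefficients can still look sensible, assuming the standard GL derived point size (size * sqrt(attenuation)) and the viewport-height scaling discussed above. All concrete numbers here are hypothetical:

```cpp
#include <cmath>
#include <cstdio>
#include <initializer_list>

int main()
{
    const float a = 1.0f, b = 500.0f, c = 500.0f; // constant, linear, quadratic
    const float baseSize = 1.0f * 1080.0f;        // point size 1, inflated by a 1080px viewport

    for (float d : {1.0f, 10.0f, 50.0f})
    {
        float atten   = 1.0f / (a + b * d + c * d * d);
        float derived = baseSize * std::sqrt(atten); // ~34.1, ~4.6, ~1.0 px
        std::printf("dist = %5.1f -> derived size %.2f px\n", d, derived);
    }
    // The large linear/quadratic terms shrink the height-inflated base size
    // back into a few-pixel range, which is why these odd values "work".
    return 0;
}
```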
I also noticed that enabling FSAA turns my billboard sprites into dots on my Intel GPU machine and, surprisingly, into solid polygons on an AMD GPU.
Original issue resolved. Feel free to open new issues for the GPU-specific problems.
System Information
Detailed description
First of all,
setPointMinSize(1.0f)
and
setPointMaxSize(2.0f)
are not taken into account anywhere; attenuation comes up with an arbitrary value that is never clamped. Setting
setPointAttenuation(true, 1, 0, 0)
results in huge points, even though it should be equivalent to attenuation being disabled. Setting any other value is also pointless, as it randomly increases the point size with distance.
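A minimal repro sketch of the reported calls, assuming a material pass used for point rendering (the material setup and names here are illustrative, not from the original report):

```cpp
#include <OgreMaterialManager.h>
#include <OgreResourceGroupManager.h>
#include <OgreTechnique.h>
#include <OgrePass.h>

void setupPointPass()
{
    Ogre::MaterialPtr mat = Ogre::MaterialManager::getSingleton().create(
        "PointMaterial",
        Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME);
    Ogre::Pass* pass = mat->getTechnique(0)->getPass(0);

    pass->setPointMinSize(1.0f); // reportedly ignored
    pass->setPointMaxSize(2.0f); // reportedly ignored

    // attenuation = 1 / (1 + 0*dist + 0*dist^2) == 1 at any distance,
    // so this should behave exactly like attenuation disabled -- yet it
    // reportedly produces huge points.
    pass->setPointAttenuation(true, 1.0f, 0.0f, 0.0f);
}
```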