Hi, you can use scene.Entity_FindByName("meshName"). This will return an entity if one exists with that name. If multiple entities exist with the same name, the first one will be returned (so I recommend having unique entity names).
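For reference, a minimal sketch of the lookup from code (assuming the global scene is accessed via wiScene::GetScene(); "meshName" is a placeholder):

```cpp
#include "wiScene.h"

void FindMyEntity()
{
	wiScene::Scene& scene = wiScene::GetScene();

	// Returns the first entity with this name, or INVALID_ENTITY if none exists:
	wiECS::Entity entity = scene.Entity_FindByName("meshName");
	if (entity != wiECS::INVALID_ENTITY)
	{
		// The entity handle can now be used to look up its components on the scene.
	}
}
```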
You can also set up hair particle systems inside the editor, and use the paint tool to place particles onto the mesh surface. Let me know if you need more info.
I got it, and it finally works as I expected. Another issue is the jagged edge on the texture alpha blending, or maybe alpha testing? The original texture looks like this. It seems that the original texture looks better than the final rendering. Maybe it's just a matter of anti-aliasing?
By the way, I have tried to use the paint tool in the editor, but I'm still confused about how to use it, for example adding/removing triangles on the HairParticle. When I select it and paint on the focused grass, nothing happens. Or do I need to go through the related source code?
The jaggedness is due to alpha testing. There is no alpha blending supported now for the grass (it is too expensive to have sorting, blending and multi layered lighting). You can modify the alpha testing threshold by selecting the hair particle system in the Editor, then going to the Material window, where you can adjust the "AlphaRef" slider. This can reduce the jagginess you see.
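A rough sketch of doing the same from code, with the caveat that the assumptions here (a MaterialComponent living on the hair entity, the field being named alphaRef, and the entity name) are not confirmed API:

```cpp
wiScene::Scene& scene = wiScene::GetScene();
wiECS::Entity hairEntity = scene.Entity_FindByName("grass"); // hypothetical hair particle entity name
wiScene::MaterialComponent* material = scene.materials.GetComponent(hairEntity);
if (material != nullptr)
{
	// Lower threshold = fewer pixels discarded by alpha testing (assumed field name):
	material->alphaRef = 0.5f;
}
```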
The paint tool for hair is a bit hard to use, I admit. There are three modes for HairParticle:
You must also set the particle count on the hair particle system, otherwise you won't see anything if there are no particles.
For example, try using the Content/models/hairparticle_torus.wiscene model. There are two hair particle systems on it, a grass and a flower one, so you could delete one of them to make it easier to see how it works.
Thanks for your help, I will try it
Yes, it does work. Another one, which is actually not a big problem for me: I found that the default weather cloud behaves in a 2D FBM manner, and the cloud properties don't affect the volume cloud in the post-process. Could we merge the properties between the default cloud and the volume cloud? Or at least expose the properties of the volume cloud to the user.
Yes, the volume cloud parameters should be exposed, at least some of them. I will see how it makes the most sense. I don't know what you meant by "2D FBM manner".
I'm sorry for my bad English. I mean the default weather cloud looks like a piece of thin paper, even though it may be faster. The volume cloud looks better, haha. By the way, do we have a SkinnedAnimation baking component?
No problem and yes, the default cloud is much simpler to render, but it's 2D.
SkinnedAnimation baking? Skinned animations are supported (they are imported easily from the GLTF model format), but not baking. I'm not sure what you would want to bake in it?
I mean that the skinned animation data could be baked into a texture for GPU instancing, or do we already have GPU instancing for skinned character animation?
Skinning is computed on the GPU. If you instance it, then the same animation pose will be repeated for the new instance, since the skinning is only computed once by a compute shader, not for every draw. This makes it compatible with raytracing too. If you duplicate the mesh, then a new skinning with a new animation can be applied, however it won't be instanced. So there is currently no way to have one draw call for multiple instances with different skinning animations. I think this skinning system should still be pretty fast.
So our skinned animation data, aka all frames, will be precomputed and stored in a buffer, and then we just fetch the specified bone data and weights, right? BTW, I found the "USE_LDS" macro, which is disabled by default. What's the difference?
Oh god, I happened to find your blog post about skinned animation, I should read it first.
One frame of animation data is computed every frame on the CPU, and the bone data for that frame is uploaded to the GPU. USE_LDS is a kind of experimental optimization: there is a variant of the skinning shader that uses it (skinningCS_LDS.hlsl) and one that doesn't (skinningCS.hlsl). The LDS version is used when the entire bone data fits into groupshared memory, though this can be turned off with wiRenderer::SetLDSSkinningEnabled(false).
All the meshes that have skeletons are skinned once at the beginning of the frame, before any rendering happens. Their results are written to MeshComponent::streamoutBuffer_POS (positions, normals) and streamoutBuffer_TAN (tangents), and these are used as vertex buffers from that point on.
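In other words, skinning happens once per frame per skinned mesh, not per draw. The only runtime switch quoted above is the LDS toggle; a minimal usage sketch:

```cpp
#include "wiRenderer.h"

// Fall back to the non-LDS skinning shader (skinningCS.hlsl) instead of skinningCS_LDS.hlsl.
// By default the LDS variant is picked when the bone data fits in groupshared memory.
wiRenderer::SetLDSSkinningEnabled(false);
```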
"One frame of animation data is computed every frame on CPU" , I'm even more confused, that is to say, each animation frame data will be computed during each rendering frame ? that does'nt make any sense. ٩( 'ω' )و
What I mean is animation bone matrices are computed on the CPU, and the skinning of vertices is done on GPU. These are done every frame.
So this makes sense, I got it.
I don't mean that, but have you noticed the jagged edges? When I turn on fog, as the picture shows, there appears to be a lower rendering resolution around the edges. When I turn off the fog, it looks right :smirk:
That is caused by volumetric lights (which are affected by fog). It is because volumetric lights are rendered at quarter resolution and later upsampled to save performance. If you would like to try increasing the volumetric light resolution, you can modify that in RenderPath3D.cpp: https://github.com/turanszkij/WickedEngine/blob/master/WickedEngine/RenderPath3D.cpp#L94
The volumetric light can be enabled/disabled per light with LightComponent::SetVolumetricsEnabled(true) or the Light window in the Editor.
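A small sketch of toggling this from code rather than the editor; SetVolumetricsEnabled is quoted from above, while the per-light lookup through scene.lights and the light's name are assumptions:

```cpp
wiScene::Scene& scene = wiScene::GetScene();
wiECS::Entity lightEntity = scene.Entity_FindByName("myPointLight"); // hypothetical light name
if (lightEntity != wiECS::INVALID_ENTITY)
{
	wiScene::LightComponent* light = scene.lights.GetComponent(lightEntity);
	if (light != nullptr)
	{
		light->SetVolumetricsEnabled(true); // light now participates in volumetric (quarter-res) rendering
	}
}
```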
Thank you. Sorry for another question, which is related to the Emitter. As you can see from the picture, I want to simulate a tail flame, but I couldn't find where to set the start and end size of a particle; it seems there is only a size option for the entire emitter. What's more, when I slide the scaling up, the frame rate drops rapidly, I don't know why.. //ㄒoㄒ//
The "size" slider sets the particle starting size. The "scale" slider sets the scaling (size * scaling) factor at the end of particle's life. Large particles will be much slower to render, so look out for that.
If I want to change the weapon or equipment for my character, what should I do with the ArmatureComponent? How should I get the specified bone and attach something to it?
Hi, I found a problem again: lots of holes appear on the sphere when I click ComputeNormals(SMOOTH).
You can attach an entity to another entity with Scene::Component_Attach(entity, parent_entity). You could use the weapon entity and attach it to one of the bone entities. You can use Scene::Entity_FindByName(name) to get a bone with a specific name.
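For example, a minimal sketch of attaching a weapon to a hand bone ("Sword" and "hand_R" are hypothetical entity names):

```cpp
wiScene::Scene& scene = wiScene::GetScene();
wiECS::Entity weapon = scene.Entity_FindByName("Sword");
wiECS::Entity bone = scene.Entity_FindByName("hand_R");
if (weapon != wiECS::INVALID_ENTITY && bone != wiECS::INVALID_ENTITY)
{
	// The weapon will now follow the bone's animated transform:
	scene.Component_Attach(weapon, bone);
}
```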
About the sphere mesh recompute normals: I couldn't reproduce this issue. I tried with the uvsphere.obj and the icosphere.obj files that are in the models folder, but it worked as expected. Could you share the mesh you are having the problem with?
Of course, but I couldn't find how to share it with you. The sphere was simply created and exported from Blender in glb format. And if you still fail to reproduce the case, maybe you can leave me a valid email address.
You could attach it to this issue for example (drop the file into the comment box)
I just modified the file extension to be able to upload it; the original extension is glb.
Thanks. I see the problem, I will investigate it. However, there is a workaround; performing the operations in the following order will produce correct results:
Sorry for the trouble, I will try to fix it
My pleasure. That is to say, the base functionality of the code is OK, it's just a matter of how it's called in the editor.
Thank you. Another one: is there any method to modify the atmosphere radius? Also, is the ocean system still rendering or busy with computing when it's beyond the far plane? This of course can be set inactive manually, but I want to make sure whether the ocean system is culled automatically.
The radius parameters are currently hard coded in the shader, here: https://github.com/turanszkij/WickedEngine/blob/master/WickedEngine/shaders/skyAtmosphere.hlsli#L72 I will probably consider exposing these parameters in the weather.
The ocean is not capable of determining when to update itself right now, but it can be implemented.
Got it, thank you!
I added the atmosphereParameters to the WeatherComponent in this commit: https://github.com/turanszkij/WickedEngine/commit/fc982f19b764b6a02df86e86d8bde25da53c8e8d There is no editor GUI for this now, but you will be able to modify the sky's atmosphere params from your code.
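A rough sketch of what that could look like from code; the weathers component manager access, the bottomRadius/topRadius field names, and the km units are assumptions, so check the struct added in the commit for the real members:

```cpp
wiScene::Scene& scene = wiScene::GetScene();
if (scene.weathers.GetCount() > 0)
{
	wiScene::WeatherComponent& weather = scene.weathers[0];
	weather.atmosphereParameters.bottomRadius = 6360.0f; // assumed: planet radius in km
	weather.atmosphereParameters.topRadius = 6460.0f;    // assumed: atmosphere top radius in km
}
```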
Hi there, do we plan to add foam rendering to the OceanComponent? Or should I render foam according to the choppiness info by myself? I don't know. Any tips?
It would be cool to implement, but I'm not sure when I'll get around to it. If you find a good way to render them, let me know.
Hi, I have no idea how to make a cube volume particle emitter, such as for rain or snow, not just emitted from one point.
Hi, here is a sample rain emitter emitted from a volume. The emitter window has a "Volume" checkbox that will make the particles spawn inside a bounding box (the emitter can be scaled, positioned, etc.). Another option is to set a mesh and emit from the mesh surface.
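If scaling the volume is the sticking point, here is a sketch of doing it from code (the "rainEmitter" name is hypothetical; the Volume checkbox itself is ticked in the editor, since the corresponding code setter isn't confirmed here):

```cpp
wiScene::Scene& scene = wiScene::GetScene();
wiECS::Entity emitterEntity = scene.Entity_FindByName("rainEmitter");
wiScene::TransformComponent* transform = scene.transforms.GetComponent(emitterEntity);
if (transform != nullptr)
{
	// Stretch the spawn bounding box over a large area above the scene:
	transform->Scale(XMFLOAT3(20, 1, 20));
	transform->UpdateTransform();
}
```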
Nice rain, thank you so much. I think I was stuck at not knowing how to scale the volume.
Hi, I don't know how to get the entity of a model I just loaded. For example, when I want to create a HairParticle, I must create a hair entity and get the wiHairParticle component from the hair entity, and then I have to assign a base mesh entity to the meshID of the HairParticle. As you can see from the picture above, I want to load something like a plane, so I use wiScene::LoadModel. But here comes the problem: I have no idea how to get the entity I just loaded. I found that the return value of wiScene::LoadModel is of Entity type, but it may be invalid if we pass false args.
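For reference, a sketch of the workflow described in this question, assuming LoadModel's parameters are (filename, transform, attached) and that passing attached = true returns the root entity of the loaded hierarchy (with false it returns INVALID_ENTITY, as noted). The entity/file names are placeholders, and in practice the hair entity may also need a material and transform:

```cpp
wiScene::Scene& scene = wiScene::GetScene();

// Load the plane model; attached = true keeps a root entity so the result can be referenced later:
wiECS::Entity root = wiScene::LoadModel("plane.wiscene", DirectX::XMMatrixIdentity(), true);

// Assumption: "plane" is the name of the entity inside the loaded hierarchy that owns the MeshComponent:
wiECS::Entity planeEntity = scene.Entity_FindByName("plane");

// Create a hair particle system entity and point it at the plane mesh:
wiECS::Entity hairEntity = wiECS::CreateEntity();
scene.names.Create(hairEntity).name = "GrassHair";
wiHairParticle& hair = scene.hairs.Create(hairEntity);
hair.meshID = planeEntity;
```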