Neopallium / bevy_water

Dynamic ocean material for Bevy.
https://neopallium.github.io/bevy_water/
116 stars 11 forks

Normal calculation in shader #7

Open noebm opened 1 year ago

noebm commented 1 year ago

I might be misunderstanding something but shouldn't b be along the x axis and c along the y axis?

https://github.com/Neopallium/bevy_water/blob/68889d28b346b7ff47d793bf02e1464742f45127/assets/shaders/water.wgsl#LL109C1-L111C66

Neopallium commented 1 year ago

I just tried:

  let b = get_wave_height(w_pos + vec2<f32>(1.0, 0.0));
  let c = get_wave_height(w_pos + vec2<f32>(0.0, 1.0));

I don't notice any difference. I am not an expert in shaders or 3d math. To get the normal I just picked 2 neighboring points (3 including the current frag_coord) and made sure the normal would point above the surface.

The points currently are:

a*******
bc******
********

With 1.0, 0.0 and 0.0, 1.0 the points would be:

ab******
c*******
********

I am not sure what would be the best method of selecting points to calculate the normals. If there is a more standardized way of picking the points, we can switch to that.

Any help with improving the water is greatly appreciated.

Neopallium commented 1 year ago

Might need to change the shader to match your Rust wave_normal: https://github.com/Neopallium/bevy_water/blob/e36732668e2077591a8b965bc6d81a66c363c3ed/src/param.rs#LL56C1-L56C1

The ocean example has debug lines (use feature debug) that could be used to help check the Rust normals against the shader normals. I haven't found a good method to debug normals in shaders (can't just draw lines, need an extra shader/material).

noebm commented 1 year ago

I just tried:

  let b = get_wave_height(w_pos + vec2<f32>(1.0, 0.0));
  let c = get_wave_height(w_pos + vec2<f32>(0.0, 1.0));

I don't notice any difference. I am not an expert in shaders or 3d math. To get the normal I just picked 2 neighboring points (3 including the current frag_coord) and made sure the normal would point above the surface.

The points currently are:

a*******
bc******
********

With 1.0, 0.0 and 0.0, 1.0 the points would be:

ab******
c*******
********

I am not sure what would be the best method of selecting points to calculate the normals. If there is a more standardized way of picking the points, we can switch to that.

Any help with improving the water is greatly appreciated.

Usually the idea when calculating the surface normal is to find a vector perpendicular to the tangent plane at the point. You can approximate this by using finite differences to get the tangents and a cross product to find a perpendicular vector. That's why I was confused. However, I didn't see any difference either, and I also haven't written a lot of shader code.
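
For reference, the finite-difference tangent / cross-product approach can be sketched in plain Rust (the `height` function here is a made-up stand-in for `get_wave_height`, not the project's wave code):

```rust
// Hypothetical stand-in heightfield for get_wave_height (not the real wave code).
fn height(x: f32, z: f32) -> f32 {
    0.5 * (x * 0.3).sin() + 0.2 * (z * 0.7).sin()
}

// Finite-difference normal: build two tangents from neighboring samples,
// take their cross product, and normalize. The y component is delta^2 > 0,
// so the normal always points above the surface.
fn normal(x: f32, z: f32, delta: f32) -> [f32; 3] {
    let h = height(x, z);
    let hx = height(x + delta, z); // neighbor along +x
    let hz = height(x, z + delta); // neighbor along +z
    // cross((0, hz - h, delta), (delta, hx - h, 0)) expands to:
    let n = [(h - hx) * delta, delta * delta, (h - hz) * delta];
    let len = (n[0] * n[0] + n[1] * n[1] + n[2] * n[2]).sqrt();
    [n[0] / len, n[1] / len, n[2] / len]
}
```

Dividing every component through by delta gives the same direction as the shader's `normalize(vec3<f32>(height - height_dx, delta, height - height_dz))` form, so the two are equivalent up to normalization.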

noebm commented 1 year ago

Might need to change the shader to match your Rust wave_normal: https://github.com/Neopallium/bevy_water/blob/e36732668e2077591a8b965bc6d81a66c363c3ed/src/param.rs#LL56C1-L56C1

The ocean example has debug lines (use feature debug) that could be used to help check the Rust normals against the shader normals. I haven't found a good method to debug normals in shaders (can't just draw lines, need an extra shader/material).

Afaik the usual method is to use the color output to store the normals, but I can't really make any sense of that when looking at it.

noebm commented 1 year ago

I have a couple of minor simplifications for the shader if you want.

Neopallium commented 1 year ago

Might need to change the shader to match your Rust wave_normal: https://github.com/Neopallium/bevy_water/blob/e36732668e2077591a8b965bc6d81a66c363c3ed/src/param.rs#LL56C1-L56C1 The ocean example has debug lines (use feature debug) that could be used to help check the Rust normals against the shader normals. I haven't found a good method to debug normals in shaders (can't just draw lines, need an extra shader/material).

Afaik the usual method is to use the color output to store the normals, but I can't really make any sense of that when looking at it.

Yeah, I have tried that with a primitive mesher (generated mesh), but it isn't very easy to understand.

When I have time I will try creating something like the bevy wireframe system that can render normals as lines over the surface (not just at the vertex points).

Neopallium commented 1 year ago

I have a couple of minor simplifications for the shader if you want.

Go ahead and switch the normal calculation to use the x/y axis, so it matches your wave_normal rust method.

One downside to using SystemParam is that the settings are global. In the future it would be useful to support different water surfaces (inland lakes/rivers vs open oceans) where the settings will be different.

Neopallium commented 1 year ago

Might need to change the shader to match your Rust wave_normal: https://github.com/Neopallium/bevy_water/blob/e36732668e2077591a8b965bc6d81a66c363c3ed/src/param.rs#LL56C1-L56C1 The ocean example has debug lines (use feature debug) that could be used to help check the Rust normals against the shader normals. I haven't found a good method to debug normals in shaders (can't just draw lines, need an extra shader/material).

Afaik the usual method is to use the color output to store the normals, but I can't really make any sense of that when looking at it.

Yeah, I have tried that with a primitive mesher (generated mesh), but it isn't very easy to understand.

When I have time I will try creating something like the bevy wireframe system that can render normals as lines over the surface (not just at the vertex points).

Maybe a quick way to debug them is to use the DebugLines and draw them in a grid across each water quad entity, use the wave_normal method to get the end point for each line.

noebm commented 1 year ago

I also made the wave height modifiable.

I just noticed that large wave heights are somewhat ugly. This is fixed by setting WATER_GRID_SIZE to 1. After fixing that, the mesh has some artifacts when looking towards the sun in the ocean example. I have a patch that uses the mesh generated by shape::Plane, but I didn't want to cram it into this PR.

noebm commented 1 year ago

I thought about the wave_height change. Maybe setting it only per material is a better choice. If you want me to remove it, just tell me.

Neopallium commented 1 year ago

I thought about the wave_height change. Maybe setting it only per material is a better choice. If you want me to remove it, just tell me.

For now it is ok to keep it global. Using per-material settings would make the Rust-side wave calculation difficult. Most likely we will need to add a simple ray-cast system (could render a low-res top-down view of just the water entities, where the color would be the entity ID) to find the water entity below a 3d point.

The global settings and Rust wave_height can be limited to just one "ocean", any customized inland water will just not support the Rust height for now.

Neopallium commented 1 year ago

I also made the wave height modifiable.

Might be better to call it scale or amplitude.

I just noticed that large wave heights are somewhat ugly. This is fixed by setting WATER_GRID_SIZE to 1. After fixing that the mesh has some artifacts when looking towards the sun in the ocean example.

With wave_height: 5.0 the Rust-side wave_height() seems to be way off, but that might just be an amplification of the wave error I was seeing before.

I have a patch that uses the mesh generated by shape::Plane, but I didn't want to cram it into this PR.

I am fine with including the switch to using Plane; it was added in Bevy 0.10.

Neopallium commented 1 year ago

The difference between the Rust and shader wave code doesn't seem to come from the time. Even with the global bevy time paused, there is still a difference between the two, and that holds even without the wave_height change.

noebm commented 1 year ago

I checked it with shape::UVSphere { radius: 0.1, ..default() } and wave_height = 4.0. Seems to be in the order of ~0.2 on my side, not too noticeable for larger meshes.

Neopallium commented 1 year ago

Finally found what is causing the difference between the shader & Rust wave height. The mesh doesn't have many vertex points: the shader calculates the height at each vertex, and the GPU interpolates the height (frag_coord) between vertex points before calling the fragment shader. If we used very small triangles for the mesh, the shader & Rust code would agree very closely, but that would hurt performance. It should be possible to use LOD meshes to help with this, with dense meshes close to the camera.
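
The interpolation error can be demonstrated numerically. In this plain-Rust sketch a single sine wave stands in for the real wave sum: the linearly interpolated height between two vertices diverges from the true height at the midpoint, and the error shrinks as the vertex spacing shrinks.

```rust
// True wave height at x (single sine as a stand-in for the real wave sum).
fn wave(x: f32) -> f32 {
    x.sin()
}

// Height the GPU effectively renders between two vertices spaced `step`
// apart: linear interpolation of the heights computed at the two vertices.
// t in [0, 1] is the position between them.
fn interpolated(x0: f32, step: f32, t: f32) -> f32 {
    wave(x0) * (1.0 - t) + wave(x0 + step) * t
}

// Midpoint error between the interpolated and the true height for a
// given vertex spacing. Smaller triangles => smaller error.
fn midpoint_error(x0: f32, step: f32) -> f32 {
    (interpolated(x0, step, 0.5) - wave(x0 + step * 0.5)).abs()
}
```

This is why the Rust-side `wave_height()` (which evaluates the wave exactly at any point) can disagree noticeably with the rendered surface on a coarse mesh.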

Changing the WATER_QUAD_SIZE to 1 improves it, but causes some weird breaks in the water surface. This seems to be a render issue. I think the PBR shader code doesn't like something about the generated height/normals.

Neopallium commented 1 year ago

I checked it with shape::UVSphere { radius: 0.1, ..default() } and wave_height = 4.0. Seems to be in the order of ~0.2 on my side, not too noticeable for larger meshes.

Cool, I haven't tried the water on a sphere before.

Neopallium commented 1 year ago

I pushed a few more changes to the water shader, just updating the y position in the fragment shader using the wave_height. I am not sure if this is better or not. Maybe the frag_coord needs to be updated too, but I'm not sure if that can be done in the fragment shader.

I also plan on adding more PBR fields (copied from the StandardMaterial code). Opened issue #10 for that.

noebm commented 1 year ago

I thought about 2 more possible improvements for the normal calculation in the fragment shader.

noebm commented 1 year ago

I pushed a few more changes to the water shader. Just updating the y position in the fragment shader using the wave_height. I am not sure if this is better or not. Also maybe the frag_coord also needs to be updated too, but not sure if it can be done in the fragment shader.

I think using height directly for the world y coordinate might break easily, for example if GlobalTransform is not trivial.

noebm commented 1 year ago

I think there is something wrong with the coordinate frames. For example, we are using the normal in the local frame and passing it to PBR, which expects a normal in the global reference frame. Also, we are using the global position to calculate the normal (which probably should be calculated in the local frame). I might make an example of how this fails when you modify Transform tomorrow.

Neopallium commented 1 year ago

I think there is something wrong with the coordinate frames. For example we are using the normal in the local frame and passing it to PBR, which expects a normal in the global reference frame. Also, we are using the global position to calculate the normal (which probably should be calculated in the local frame).

By "local frame" you mean local to the mesh? And "global frame" you mean the world?

The normals and wave height should all be done in the global (world) reference. If the wave and normals were calculated using the mesh's local reference, there would be hard seams/breaks between the water tiles.

The normals need to be in the global reference. Normally the vertex shader converts the mesh/vertex normals to global before they are passed to the fragment shader.

I might make an example with how this fails when you modify Transform tomorrow.

If you can provide an example or just bits of code, I will try to help debug it. There could still be some issues with the water shader or how it is passing info to the PBR shader.

Maybe this is caused by my last commits which updates the y position in the fragment shader. We can remove that if needed.

Neopallium commented 1 year ago

I pushed a few more changes to the water shader. Just updating the y position in the fragment shader using the wave_height. I am not sure if this is better or not. Also maybe the frag_coord also needs to be updated too, but not sure if it can be done in the fragment shader.

I think using height directly for the world y coordinate might break easily, for example if GlobalTransform is not trivial.

We can revert that change if needed. It didn't make a major improvement.

noebm commented 1 year ago

Maybe I just have the wrong idea about what the shader should do.

If you made it local, you could do things like use it for spheres or cubes. You would not lose anything, and the current variables would just be integrated into the Transform.

It also behaves well with regards to Camera transforms, i.e. transforming the camera is equivalent to the inverse transform for the water tile.

noebm commented 1 year ago

I pushed a few more changes to the water shader. Just updating the y position in the fragment shader using the wave_height. I am not sure if this is better or not. Also maybe the frag_coord also needs to be updated too, but not sure if it can be done in the fragment shader.

I think using height directly for the world y coordinate might break easily, for example if GlobalTransform is not trivial.

We can revert that change if needed. It didn't make a major improvement.

I have no idea how increased accuracy in the world position affects PBR rendering. Out of interest, do you have any references regarding PBR I could look at?

Neopallium commented 1 year ago

Maybe I just have the wrong idea about what the shader should do.

I had to research water shaders to create the original. I can't remember where the original Godot tutorial is that I first looked at, but here are two I just found that seem very good. They even have some ideas that can be used to improve the water shader.

https://www.youtube.com/watch?v=7L6ZUYj1hs8 https://www.youtube.com/watch?v=XjCh2cN3Mfg

Note that Godot's shader language is GLSL, which is different from WGSL. Also, each engine provides different globals (TIME vs Bevy's globals.time).

If you made it local, you could do things like use it for spheres or cubes. You would not lose anything, and the current variables would just be integrated into the Transform.

I haven't tried the shader on anything other than a flat plane. I will try it out.

Ah, thinking about it right now, it most likely is a problem with only applying the wave height to the y position (it doesn't matter if it is local/global). For 3d shapes, we need to apply the wave height in the direction of the vertex normal. Right now we are completely ignoring the mesh normals, since they are always pointing up (along the y-axis).

Something to try:

let world_normal = mesh_normal_local_to_world(vertex.normal);
out.world_position = world_position + (world_normal * height);

It also behaves well with regards to Camera transforms, i.e. transforming the camera is equivalent to the inverse transform for the water tile.

The water shader should be working ok with the Camera transforms.

Neopallium commented 1 year ago

Something to try:

let world_normal = mesh_normal_local_to_world(vertex.normal);
out.world_position = world_position + (world_normal * height);

Most likely need to also pass the world_normal from the vertex shader to the fragment shader to be used when calculating the fine detail normals in the frag shader. The GPU will blend the values passed from the vertex->frag shader (useful for a sphere to smooth the normal over the surface, not sure how it will help cubes at the edges).

noebm commented 1 year ago

Ah, thinking about it right now, it most likely is a problem with only applying the wave height to the y position (doesn't matter if it is local/global). For 3d shapes, need to apply the wave height in the direction of the vertex normal. Right now we are completely ignoring the mesh normals, since they are always pointing up (along the y-axis).

Yeah that works.

I think I got it working for cubes, but there is still something missing for smooth surfaces. You generally need access to a local (about a point) tangent coordinate system, so you know what the local xz plane is.

Neopallium commented 1 year ago

Ah, thinking about it right now, it most likely is a problem with only applying the wave height to the y position (doesn't matter if it is local/global). For 3d shapes, need to apply the wave height in the direction of the vertex normal. Right now we are completely ignoring the mesh normals, since they are always pointing up (along the y-axis).

Yeah that works.

I think I got it working for cubes, but there is still something missing for smooth surfaces. You generally need access to a local (about a point) tangent coordinate system, so you know what the local xz plane is.

How are you getting the 2d position for the wave_height function for non-flat meshes? With a cube it should work kind of like the plane tiling, though it might still have issues at the corners.

I just remembered a YouTube video about procedurally generating terrain for a sphere (planet). He had to use "Triplanar Mapping" to smoothly apply the terrain generator to the surface of the sphere. I think it was this series of videos: https://youtu.be/QN39W020LqU Also just found this one that looks interesting and also talks about "Triplanar mapping": https://www.youtube.com/watch?v=rNuDkDhadfU

I will need to watch those videos again. Might have remembered the details wrong.

For non-flat surfaces, we can either use a flag to enable something like the "Triplanar mapping", or provide two water shaders.

Other ideas:

  1. Use a texture to provide the 2d wave position and use the vertex UV coords to lookup the 2d point for the wave function. This would be a cheap way to bake in a mapping for 3d surfaces.
  2. Take multiple nearby wave samples and blend the height. A get_wave_height_3d(pos1, pos2, pos3) which calls get_wave_height 3 times.
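
Idea 2 might look roughly like this (plain Rust; the `get_wave_height` body here is a made-up stand-in, and averaging is just one possible way to blend the three samples):

```rust
// Stand-in 2d wave function, purely for illustration.
fn get_wave_height(p: [f32; 2]) -> f32 {
    (p[0] * 0.4).sin() + (p[1] * 0.6).sin() * 0.5
}

// Blend three nearby 2d samples into one height, e.g. for a point on a
// curved surface where no single 2d parameterization works well.
fn get_wave_height_3d(p1: [f32; 2], p2: [f32; 2], p3: [f32; 2]) -> f32 {
    (get_wave_height(p1) + get_wave_height(p2) + get_wave_height(p3)) / 3.0
}
```

With all three sample points equal this degenerates to a single `get_wave_height` call, so flat surfaces would be unaffected.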
Neopallium commented 1 year ago

I think it was this video that I had watched: https://www.youtube.com/watch?v=lctXaT9pxA0 The other series by the same person is for Unity. This video is more general.

noebm commented 1 year ago

Thanks, will have to take a look.

How are you getting the 2d position for the wave_height function for non-flat meshes? With a cube it should work kind of like the plane tiling, still might have issues at the corners.

I thought about using the generated tangents (vertex.tangent and bitangent) to somehow get the plane, but I think that might lead nowhere.

I thought it might be easy to implement, but I'm not so sure now. The whole idea doesn't look too useful to me anyway.

noebm commented 1 year ago

It also behaves well with regards to Camera transforms, i.e. transforming the camera is equivalent to the inverse transform for the water tile.

The water shader should be working ok with the Camera transforms.

If you rotate the water tiles so that x or z is constant, the waves will not display correctly, which is not the same as rotating the camera in the other direction.

Neopallium commented 1 year ago

It also behaves well with regards to Camera transforms, i.e. transforming the camera is equivalent to the inverse transform for the water tile.

The water shader should be working ok with the Camera transforms.

If you rotate the water tiles so that x or z is constant, the waves will not display correctly. Which is not the same as rotating the camera in the other direction.

Can you make some very simple examples that show this?

I also want to make a simple example with the water on a sphere and maybe curved mesh surfaces. I plan on doing some testing with non-flat surfaces this weekend. The sphere case will have a tricky 3d->2d mapping; maybe the UV coords can help.

Neopallium commented 1 year ago

The "Triplanar mapping" was used for taking 2d textures (a normal map for fine detail, and color) and mapping them onto the sphere. Something like that could help with blending the wave function, since the sphere's surface doesn't have the simple surface-to-2d-coord mapping needed for the wave function.
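
The core of triplanar mapping is to sample the 2d function on the three axis-aligned planes and blend the samples by the absolute value of the surface normal. A minimal plain-Rust sketch (`wave2d` is a made-up stand-in, not the project's wave code):

```rust
// Stand-in 2d function (texture lookup or wave function in practice).
fn wave2d(p: [f32; 2]) -> f32 {
    (p[0] * 0.3).sin() + (p[1] * 0.5).sin()
}

// Triplanar blend: project the 3d point onto the yz, xz and xy planes,
// sample each, and weight by |normal| so the dominant axis dominates.
fn triplanar(pos: [f32; 3], normal: [f32; 3]) -> f32 {
    let w = [normal[0].abs(), normal[1].abs(), normal[2].abs()];
    let sum = w[0] + w[1] + w[2];
    let sx = wave2d([pos[1], pos[2]]); // projection along x
    let sy = wave2d([pos[0], pos[2]]); // projection along y
    let sz = wave2d([pos[0], pos[1]]); // projection along z
    (sx * w[0] + sy * w[1] + sz * w[2]) / sum
}
```

On a flat, up-facing surface (normal = (0, 1, 0)) this reduces to a single xz-plane sample, matching the current flat-water behavior.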

noebm commented 1 year ago

Yeah triplanar mapping might be the right method.

noebm commented 1 year ago

Can you make some very simple examples that show this?

Yeah sure.

Neopallium commented 1 year ago

The first video in the older "Unity" series used a different way of creating the sphere mesh: https://youtu.be/QN39W020LqU

It basically sub-divided the faces of a cube (1-256 sub-divisions) and then moved all the vertices to the sphere's radius distance from the center. The main benefit is that each face's mesh would have simple 2d coords (using the UV coords) for the wave function. It should look about as good as using the water shader on a cube.

Not sure why they didn't use that type of sphere mesh for the newer planet-gen video, but it might work well for water.

noebm commented 1 year ago

Put the swap for the rotation on the X key.

use bevy::{input::common_conditions, prelude::*};
use bevy_water::*;
use std::f32::consts::PI;

const WATER_HEIGHT: f32 = 20.0;

fn main() {
  App::new()
    .add_plugins(DefaultPlugins)
    .insert_resource(WaterSettings {
      height: WATER_HEIGHT,
      ..default()
    })
    .add_plugin(WaterPlugin)
    .add_startup_system(setup)
    .add_system(flip.run_if(common_conditions::input_just_pressed(KeyCode::X)))
    .run();
}

fn flip(
  mut dir: Local<bool>,
  mut camera: Query<&mut Transform, With<Camera>>,
  mut tile: Query<(&Name, &mut Transform), Without<Camera>>,
) {
  let rotation = Quat::from_rotation_z(PI / 2.);
  if *dir {
    camera.single_mut().rotation = Quat::IDENTITY;
    for (_, mut transform) in tile.iter_mut().filter(|x| x.0.as_str() == "Water") {
      transform.rotation = rotation;
    }
  } else {
    camera.single_mut().rotation = rotation.inverse();
    for (_, mut transform) in tile.iter_mut().filter(|x| x.0.as_str() == "Water") {
      transform.rotation = Quat::IDENTITY;
    }
  }
  *dir = !*dir;
}

/// set up a simple 3D scene
fn setup(mut commands: Commands) {
  // light
  commands.spawn(PointLightBundle {
    transform: Transform::from_xyz(4.0, WATER_HEIGHT + 8.0, 4.0),
    point_light: PointLight {
      intensity: 1600.0, // lumens - roughly a 100W non-halogen incandescent bulb
      shadows_enabled: true,
      ..default()
    },
    ..default()
  });

  // camera
  commands.spawn(Camera3dBundle {
    transform: Transform::from_xyz(-40.0, WATER_HEIGHT + 5.0, 0.0),
    ..default()
  });
}
Neopallium commented 1 year ago

The issue was with how tiling is done. When tiling multiple water planes, we need to use the world 2d coords for the wave function, but that only works for flat surfaces, where we can just pick two axes (world xz) as the 2d coord for the wave function.

For meshes that don't need tiling we can just use the UV coords (scaled up to whatever size we want). We can add a bool to WaterMaterial to select world/uv coords. Tiling will use the world xz coords, non-tiling can use the UV with a scale factor (the shader doesn't know the mesh size).

Here is a quick change to test it with cubes/spheres:

diff --git a/assets/shaders/water.wgsl b/assets/shaders/water.wgsl
index bffdf4a..24e7afc 100644
--- a/assets/shaders/water.wgsl
+++ b/assets/shaders/water.wgsl
@@ -73,12 +73,14 @@ fn get_wave_height(p: vec2<f32>) -> f32 {
 fn vertex(vertex: Vertex) -> VertexOutput {
   // Need the world position when calculating wave height.
   var world_position = mesh_position_local_to_world(mesh.model, vec4<f32>(vertex.pos, 1.0));
+  let world_normal = mesh_normal_local_to_world(vertex.normal);

   // Add the wave height to the world position.
-  let height = get_wave_height(world_position.xz);
+  let w_pos = vertex.uv * 256.0; // scale UV up.
+  let height = get_wave_height(w_pos);

   var out: VertexOutput;
-  out.world_position = world_position + vec4<f32>(0., height, 0., 0.);
+  out.world_position = world_position + vec4<f32>((world_normal * height), 0.);
 #ifdef VERTEX_TANGENTS
   out.world_tangent = mesh_tangent_local_to_world(mesh.model, vertex.tangent);
 #endif
@@ -90,14 +92,13 @@ fn vertex(vertex: Vertex) -> VertexOutput {
 @fragment
 fn fragment(in: FragmentInput) -> @location(0) vec4<f32> {
   var world_position: vec4<f32> = in.world_position;
-  let w_pos = world_position.xz;
+  let w_pos = in.uv * 256.0;
   // Calculate normal.
   let delta = 0.2;
   let height = get_wave_height(w_pos);
   let height_dx = get_wave_height(w_pos + vec2<f32>(delta, 0.0));
   let height_dz = get_wave_height(w_pos + vec2<f32>(0.0, delta));
   let normal = normalize(vec3<f32>(height - height_dx, delta, height - height_dz));
-  world_position.y = height;

   let color = vec3<f32>(0.01, 0.03, 0.05);
Neopallium commented 1 year ago

Also just inverting the camera rotation doesn't seem to break anything, it is just the mesh rotation that causes problems with the shader.

Neopallium commented 1 year ago

I just did some testing with:

  1. UVSphere - there are breaks at the two poles and along one seam running between the poles.
  2. Icosphere - No vertex breaks, but there is a strip of triangles running between the poles with messed up normals. This sphere looks much better than the UVSphere.
  3. Cube - Doesn't work, no way to subdivide the faces. Bevy's Cube doesn't share the vertices at the corners, so all edges are broken.

The issue with the Icosphere is that when the UV coords wrap around the sphere, they jump from 1.0 back to 0.0. The GPU will smooth the UV coords 1.0->0.0 in those triangles (so you get the full 0.0<->1.0 range squashed into a single triangle).

Below is a change to the shader that fixes half of the squashed triangles on the Icosphere.

The important part is this: var uv = vec2<f32>(abs(vertex.uv.x - 0.5) * 128.0, vertex.uv.y * 256.0); That causes the X coord to reverse halfway around the sphere (0.0 -> 0.5, then reverse -> 0.0), so that when the fragment shader gets back around to the starting point, the X UV coord is back at 0.0. There are still some triangles where the X UV coord is 0.0 for the whole triangle, but it is still an improvement.
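
The mirrored-X trick can be checked in isolation (plain Rust restating the shader expression): abs(u - 0.5) maps u = 0.0 and u = 1.0 to the same value, so the sampled coordinate stays continuous across the wrap seam.

```rust
// Mirror the X UV coord around 0.5, matching the shader's
// abs(vertex.uv.x - 0.5) * 128.0 expression. The coord falls from 0.5
// toward 0.0 and rises back, instead of jumping 1.0 -> 0.0 at the seam.
fn mirrored_u(u: f32) -> f32 {
    (u - 0.5).abs() * 128.0
}
```

Both seam edges (u near 0.0 and u near 1.0) now map to nearly identical wave coordinates, which is exactly why the squashed triangles disappear on that half of the sphere.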

diff --git a/assets/shaders/water.wgsl b/assets/shaders/water.wgsl
index bffdf4a..dafb651 100644
--- a/assets/shaders/water.wgsl
+++ b/assets/shaders/water.wgsl
@@ -66,38 +66,45 @@ fn get_wave_height(p: vec2<f32>) -> f32 {
   d = d + wave((p - time) * 0.3) * 0.3;
   d = d + wave((p + time) * 0.5) * 0.2;
   d = d + wave((p - time) * 0.6) * 0.2;
-  return water_material.amplitude * d;
+  return water_material.amplitude * d * 0.4;
 }

 @vertex
 fn vertex(vertex: Vertex) -> VertexOutput {
   // Need the world position when calculating wave height.
   var world_position = mesh_position_local_to_world(mesh.model, vec4<f32>(vertex.pos, 1.0));
+  let world_normal = mesh_normal_local_to_world(vertex.normal);

+  //let uv = abs(vertex.uv - 0.5) * 128.0;
+  var uv = vec2<f32>(abs(vertex.uv.x - 0.5) * 128.0, vertex.uv.y * 256.0);
+  let w_pos = uv;
   // Add the wave height to the world position.
-  let height = get_wave_height(world_position.xz);
+  //let height = get_wave_height(world_position.xz);
+  let height = get_wave_height(w_pos);

   var out: VertexOutput;
-  out.world_position = world_position + vec4<f32>(0., height, 0., 0.);
+  //out.world_position = world_position + vec4<f32>(0., height, 0., 0.);
+  out.world_position = world_position + vec4<f32>((world_normal * height), 0.);
 #ifdef VERTEX_TANGENTS
   out.world_tangent = mesh_tangent_local_to_world(mesh.model, vertex.tangent);
 #endif
   out.frag_coord = mesh_position_world_to_clip(out.world_position);
-  out.uv = vertex.uv;
+  out.uv = uv;
   return out;
 }

 @fragment
 fn fragment(in: FragmentInput) -> @location(0) vec4<f32> {
   var world_position: vec4<f32> = in.world_position;
-  let w_pos = world_position.xz;
+  //let w_pos = world_position.xz;
+  let w_pos = in.uv;
   // Calculate normal.
   let delta = 0.2;
   let height = get_wave_height(w_pos);
   let height_dx = get_wave_height(w_pos + vec2<f32>(delta, 0.0));
   let height_dz = get_wave_height(w_pos + vec2<f32>(0.0, delta));
   let normal = normalize(vec3<f32>(height - height_dx, delta, height - height_dz));
-  world_position.y = height;
+  //world_position.y = height;

   let color = vec3<f32>(0.01, 0.03, 0.05);
Neopallium commented 1 year ago

I think it might be possible to fix the UV coords of the Icosphere (Rust-side during mesh generation).

Also the normal calculation in the fragment shader, needs to be updated to work for spheres. I have some ideas on how to fix that.

noebm commented 1 year ago

Nice!

Also just inverting the camera rotation doesn't seem to break anything, it is just the mesh rotation that causes problems with the shader.

Yeah, maybe I didn't express myself clearly. I get it in this case, but it feels strange to me to have this restriction.

Neopallium commented 1 year ago

Nice!

Also just inverting the camera rotation doesn't seem to break anything, it is just the mesh rotation that causes problems with the shader.

Yeah maybe I didn't express myself clearly. I get it in this case but it feels strange to me to have this restriction.

Mesh transforms can be supported. It might be tricky to get the tiling support to work with transforms.

Neopallium commented 1 year ago

I pushed changes that use the mesh UV coords instead of the world position for the wave height function. Got that to work even for the tiling by adding coord_offset: Vec2 and coord_scale: Vec2 to WaterMaterial.
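
A rough sketch of that UV mapping (plain Rust; only the coord_offset/coord_scale field names come from the actual change, the struct and function here are illustrative): each tile offsets its UVs into a shared wave-coordinate space, so adjacent tiles line up.

```rust
// Per-material mapping from mesh UVs into a shared wave-coordinate space.
// The scale is needed because the shader doesn't know the mesh size.
struct WaterMaterialCoords {
    coord_offset: [f32; 2],
    coord_scale: [f32; 2],
}

fn wave_coord(m: &WaterMaterialCoords, uv: [f32; 2]) -> [f32; 2] {
    [
        m.coord_offset[0] + uv[0] * m.coord_scale[0],
        m.coord_offset[1] + uv[1] * m.coord_scale[1],
    ]
}
```

For tiling, a tile's coord_offset is placed so its uv = 0 edge continues exactly where its neighbor's uv = 1 edge left off, which is why rotations no longer break the waves.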

Also added the cube, uvsphere, and icosphere examples. The cube example is only really useful for testing the normal calculation when the vertex normal is not pointing up.

I didn't notice before, but the Icosphere still has breaks at the poles; that is most likely caused by the X-axis getting squashed closer to the poles (latitude lines are shorter).

I am not sure if the calculated normals in the fragment shader are correct, but they seem to look ok.

Neopallium commented 1 year ago

Mesh transforms can be supported. It might be tricky to get the tiling support to work with transforms.

Since tiling now uses the UV coords, it works with rotations.

Neopallium commented 1 year ago

This is an interesting video about calculating normals in shaders for sine wave surfaces (Using Unreal's GUI): https://www.youtube.com/watch?v=80vOtsNg-9g

Neopallium commented 1 year ago

I didn't notice before, but the Icosphere still has breaks at the poles, that is most likely caused by the X-axis getting squashed the closer it gets to the poles (latitude lines are shorter).

They weren't breaks at the vertices/edges: since the water alpha is 0.97, some of the triangles from the opposite side were facing the camera and visible. High amplitude values increased the chance that triangles from the other side get drawn. The simple fix for now is to use an alpha of 1.0.