tkeskita / BVtkNodes

Create and execute VTK pipelines in Blender Node Editor
GNU General Public License v3.0

Mesh Motion Blur with VTK data #108

Open dwerner95 opened 8 months ago

dwerner95 commented 8 months ago

Hey @tkeskita,

I am currently working on a project for which I must generate images from simulations that should look as realistic as possible, so I need some motion blur.

I tried playing around with it without success.

My current workflow is quite simple:

[screenshot: node setup]

plus the time selector

I understand that the motion blur system in Blender works by interpolating the positions of the mesh between frames. However, this mesh is technically not moving: it is generated anew in each frame, so there is no way for Blender to figure out how the movement happened. I read that you mentioned something motion-blur related two years ago; was there any success with that?

Thank you, Dominik

tkeskita commented 8 months ago

Hi Dominik,

From your node setup I see you are trying to visualize particles, so I assume you're after motion blur for particles. I haven't really pursued motion blur since 2020, and I don't know if Blender nowadays has this capability via Python. I tried a quick search on devtalk.blender.org but did not find anything new. I'd sort of expect this to be possible via Blender simulation nodes, but I haven't followed up on that. If anyone finds a way, please comment below!

The only way I can think of to do this now is object keyframing: skip BVTKNodes and use custom Python to create every particle as a separate object, then set keyframes for each object according to its particle velocity. However, this will not work for even a "medium" number of particles, since Blender quickly becomes too slow with many objects in a scene.
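As a rough sketch of the arithmetic behind that keyframing idea (everything here is hypothetical: plain Python only, with the actual bpy keyframing left as a comment), the per-frame positions would be extrapolated from the particle velocity like this:

```python
# Sketch: extrapolate per-frame positions for one particle from its velocity,
# assuming linear motion between simulation outputs. In Blender each position
# would then be keyframed on a particle object, roughly:
#   obj.location = pos; obj.keyframe_insert(data_path="location", frame=f)

def particle_positions(p0, vel, fps, n_frames):
    """Linearly extrapolate a particle position for each frame.

    p0: position (x, y, z) at frame 0
    vel: velocity in scene units per second
    fps: scene frame rate
    n_frames: number of frames to generate
    """
    dt = 1.0 / fps  # time between consecutive frames
    return [tuple(p + v * dt * f for p, v in zip(p0, vel))
            for f in range(n_frames)]

frames = particle_positions((0.0, 0.0, 0.0), (2.0, 0.0, 0.0),
                            fps=24, n_frames=3)
# frame 0 stays at the origin; frame 2 has moved 2 * (2/24) units along x
```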

dwerner95 commented 8 months ago

Thank you @tkeskita.

I am wondering if it might be possible to extract the particle velocities and feed them into the "Velocity" vector input of the "Vector Blur" node in the compositor. Could that be a workaround?

The Blender internal simulations support such a feature.

dwerner95 commented 8 months ago

Not sure if it helps, but the Blender add-on "VisualSPHysics" supports motion blur from VTK data; see the code here.

Do you think this could be implemented here? I would try to do the work (in November), but it would be good for me to know beforehand whether this is even feasible.

tkeskita commented 8 months ago

Hi,

It seems that VisualSPHysics uses Shape Keys to store the velocity of each vertex and get motion blur. A very nice find! I think it could work! For BVTKNodes, it would only require that you specify the name of the VTK vector point field to apply for the vertex motion blur. The logical place for this entry would be the VTK to Blender Mesh node.
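If I understand the VisualSPHysics approach correctly, the key ingredient is a second shape key whose vertex coordinates are displaced along the velocity field, so Blender interpolates between the basis and that key during the shutter interval. A minimal sketch of just the coordinate arithmetic (the function name and values are made up; the actual bpy shape-key assignment is not shown):

```python
# Sketch of the shape-key coordinate computation: the "motion" key holds
# each vertex displaced along its VTK velocity vector over one time step.
# In Blender these coordinates would be written into a shape key on
# mesh.shape_keys (not shown here).

def motion_key_coords(basis_coords, velocities, time_step):
    """Return displaced coordinates, co + vel * time_step, per vertex."""
    return [tuple(c + v * time_step for c, v in zip(co, vel))
            for co, vel in zip(basis_coords, velocities)]

coords = motion_key_coords(
    basis_coords=[(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
    velocities=[(0.0, 0.0, 1.0), (0.0, 0.0, -1.0)],
    time_step=0.04,  # e.g. one frame at 25 fps
)
# first vertex shifted up by 0.04, second shifted down by 0.04
```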

Please go ahead! I'll wait for you to give it a try.

BR, Tuomo

dwerner95 commented 8 months ago

Quick update here: it works! 🎉

[rendered output image]

Unfortunately, I’ve run into a problem when I turn on automatic node update for rendering animations in Blender. If motion blur is switched on with a shutter setting of 1, Blender seems to load data from the previous and current frames. This ends up overwriting the keyframes I've set up, which causes some odd artefacts.

I’ve tried using bpy.app.handlers.frame_change_pre and frame_change_post to figure out if there’s a way to tell when a node update is happening because of motion blur rather than for a main frame. But I’m a bit stuck on this.

Do you have any advice or know if there's a method to detect this? Any help would be great!

tkeskita commented 8 months ago

Hi,

great, the image looks awesome! Do you already have your code on GitHub where I could view it?

I didn't quite follow what exactly happens in your problem; can you please elaborate step by step? Does a shutter value of 0.99 work fine? What are you doing with keyframes? What kind of odd artefacts?

BR, Tuomo

tkeskita commented 7 months ago

Hi @dwerner95

I'm planning to add this Shape Key motion blur method to BVTKNodes during the Christmas holiday break. Could you please share your code with me so I can work on it? The code does not need to be super clean; I can modify it as necessary. Thanks!

BR, Tuomo

dwerner95 commented 7 months ago

Hey @tkeskita, apologies for the silence on here. I will upload it to my branch over the weekend and let you know!

dwerner95 commented 6 months ago

Hey @tkeskita!

I've pushed the code to a new branch in my fork (see here). Would you like me to open a pull request?

The code itself works as long as you want to generate a static frame. If I include a Time Selector and automatic frame updates, it breaks down because of the way motion blur works in Blender. Let me try to explain it again as I understand it (and I might be wrong): Blender typically achieves motion blur by analyzing the movement of objects across frames. It examines not only the frame it is currently rendering but also the frames immediately before and after it. This lets Blender understand the trajectory and speed of moving objects and create an interpolated, blurred effect that represents the motion.

However, when using BVTKNodes, a complication arises: the act of "looking ahead" to the next frame inadvertently triggers BVTKNodes to load new data from the VTK files. Furthermore, if BVTK motion blur is enabled, this process also resets keyframes that are essential for the motion blur effect. As a result, frame generation is disrupted, leading to chaotic output instead of a smoothly rendered motion blur.

Ideally, to address this issue, there would be a way to distinguish whether Blender is processing a frame as the primary frame to be rendered or as an auxiliary frame for calculating motion blur. That distinction would prevent BVTKNodes from unnecessarily resetting keyframes or loading new data when a frame is only evaluated for motion blur. Unfortunately, I haven't been able to find a flag or feature in Blender that provides this information.
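One direction that might work around this (purely a sketch with invented names, not anything from the Blender or BVTKNodes API) is to make the VTK loading idempotent per frame, so that auxiliary motion-blur evaluations of a frame reuse already-loaded data instead of re-reading files and resetting keyframes:

```python
# Sketch of an idempotent per-frame loader: repeated requests for the same
# frame (e.g. Blender's auxiliary motion-blur evaluations) reuse cached data
# instead of reloading and clobbering keyframes. All names are hypothetical.

class FrameCachedLoader:
    def __init__(self, read_vtk_frame):
        self._read = read_vtk_frame   # callable: frame number -> data
        self._cache = {}              # frame number -> loaded data
        self.load_count = 0           # how many real reads actually happened

    def get(self, frame):
        """Return data for a frame, reading it at most once."""
        if frame not in self._cache:
            self._cache[frame] = self._read(frame)
            self.load_count += 1
        return self._cache[frame]

loader = FrameCachedLoader(lambda f: f"data-for-frame-{f}")
loader.get(10)   # real read of the primary frame
loader.get(10)   # auxiliary evaluation: served from the cache, no reload
loader.get(11)   # real read of the next frame
```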

I am keen to see how we could get around this issue.


Another thing worth mentioning: When using a large timestep (Shutter time) in the node, blender crashes on my Mac M2.

I got different errors depending on the shutter time: CommandBuffer Failed: cycles_metal_integrator_queued_paths_array and CommandBuffer Failed: cycles_metal_integrator_intersect_closest

This is not a problem for me, as those time steps are way too large anyway, but it might be for other people.

Apologies again for the delay. November was busier than anticipated!

Best wishes,

Dominik

tkeskita commented 6 months ago

Hi, thanks for the branch! I'll check it up during holidays.

tkeskita commented 6 months ago

Hi @dwerner95 please create the pull request for this.

tkeskita commented 6 months ago

Hi @dwerner95

I spent today looking at this. I've made a lot of modifications to your code while debugging. Would it be OK with you if I don't use your commits (i.e. no need for a pull request, since I won't merge your branch to master) but instead mention your contribution in a comment in my commit? Or I can wait for your pull request, accept it, and then modify it; either way works.

At least one bug I've found: the values for the shape key layer must be co + vel*timestep and not just vel, as it seemed to be in the VisualSPHysics code.

Update: I found out that if you use the motion blur Position type "Start on Frame", the frame does not change during rendering of the animation, and you get correct motion for the blur in each frame!

BR, Tuomo

dwerner95 commented 6 months ago

Hey @tkeskita!

Good to hear that it works! I am ok with you uploading it yourself! Thank you so much for your work!

Best wishes, Dominik

tkeskita commented 6 months ago

Hi, I've now committed a patch for this; let me know if you find issues! Thank you as well! I had already given up hope of getting motion blur working, so this is an awesome feature to have for surface meshes! Hoping to find a way to get this working for the VTK To Blender Particles node some time too!

BR, Tuomo

dwerner95 commented 5 months ago

Hello @tkeskita,

Thanks for implementing this feature. It's performing excellently for individual frames on my end; I haven't attempted to use it for animations yet.

I have a query regarding the computation of the shape-key mesh with the following code:

bv[key_shape] = bv.co + 2.0 * motion_blur_time_step * Vector(array_data[i])

My question is about the factor of two. If I'm not mistaken, the motion blur time step is supposed to match the camera's shutter speed. Wouldn't multiplying it by two essentially double that value? Could you explain the reason for this choice?

Best Wishes, Dominik

tkeskita commented 5 months ago

Hi @dwerner95, I used a factor of two because keyframe value zero is set at nFrame-1 and value one at nFrame+1. Setting the zero value at nFrame did not work, although that would be most realistic, since the particle positions correspond to nFrame.
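Spelled out, the factor of two follows from that keyframe placement: value 0 at nFrame-1 and value 1 at nFrame+1 span two frames, so the shape-key target must be co + 2*dt*vel for the mesh to advance by vel*dt per frame. A small plain-Python check of this arithmetic (a stand-in illustration, not the BVTKNodes code itself):

```python
# The shape key interpolates linearly from value 0 at frame nFrame-1 to
# value 1 at frame nFrame+1, i.e. over a two-frame span, so the key target
# must overshoot by 2*dt for the per-frame motion to come out right.

def shape_key_value(frame, n_frame):
    """Linear shape-key value: 0 at n_frame-1, 1 at n_frame+1."""
    return (frame - (n_frame - 1)) / 2.0

def vertex_position(co, vel, dt, frame, n_frame):
    """Evaluate one (1-D) vertex with key target co + 2*dt*vel."""
    target = co + 2.0 * dt * vel
    t = shape_key_value(frame, n_frame)
    return co + t * (target - co)

# With dt = 1 frame, moving from frame n to n+1 should advance by vel*dt:
p_n  = vertex_position(co=0.0, vel=3.0, dt=1.0, frame=10, n_frame=10)
p_n1 = vertex_position(co=0.0, vel=3.0, dt=1.0, frame=11, n_frame=10)
# p_n1 - p_n equals vel * dt = 3.0
```

Note that at nFrame itself the evaluated position is already co + vel*dt rather than co, which matches the comment above that placing value zero at nFrame would be the most realistic choice.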

I now think that failure may be related to the non-linear default keyframe value interpolation; that should be checked. Please leave this issue open until this is confirmed.

tkeskita commented 5 months ago

Hi @dwerner95,

I tested briefly, and it looks like I can get some motion blur even in animation after changing line https://github.com/tkeskita/BVtkNodes/blob/master/converters.py#L736 to kb.keyframe_insert("value", frame=nFrame). However, the default interpolation is Bezier (at least it is shown so in the Blender Graph Editor), so I need to find out how to change that to linear. Oddly enough, kb.interpolation says linear, so I'm a bit perplexed about how this is working. Needs some more testing to find out.
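For reference, this is why the interpolation mode matters during the shutter interval: with linear interpolation the sub-frame shape-key value is exactly proportional to time, so the blur trail is straight, while a Bezier ease would curve it. A plain-Python stand-in for evaluating a two-keyframe linear f-curve (an illustration only, not the bpy f-curve API):

```python
# During the shutter interval Blender samples the shape-key value at
# sub-frame times. With LINEAR interpolation the value at the shutter
# midpoint is exactly halfway between the two keyframed values.

def linear_fcurve(frame, f0, v0, f1, v1):
    """Evaluate a two-keyframe f-curve with linear interpolation."""
    t = (frame - f0) / (f1 - f0)
    return v0 + t * (v1 - v0)

# Keyframes: value 0 at the current frame, value 1 at the next frame.
mid = linear_fcurve(10.5, f0=10, v0=0.0, f1=11, v1=1.0)
# halfway through the shutter gives exactly half the displacement
```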

tkeskita commented 4 months ago

Ok, I found a way to change the interpolation to linear, so I removed the velocity scaling. Keyframes are now set on the current and next frame. Please let me know if there are still issues, thanks!