Zylann / godot_voxel

Voxel module for Godot Engine
MIT License

The ability for VoxelInstancer to choose an array of meshes or scenes to spawn using one VoxelInstanceGenerator #682

Open Anyeos opened 1 month ago

Anyeos commented 1 month ago

Is your feature request related to a problem? Please describe. I want to spawn a variety of objects (meshes or scenes) at the same spawn points, choosing one kind over another in a deterministic manner.

Describe the solution you'd like An array of scenes or meshes, each with a min / max configuration that triggers its spawning.

Describe alternatives you've considered Creating a lot of items with parameters that hopefully do not overlap with one another.

Additional context Maybe a solution could be to implement a new VoxelInstanceLibraryItem class. It could contain an array of "Scene" and an array of "Manual settings" that includes a min and a max value, taking into account some output coming from VoxelInstanceGenerator: instead of a boolean value, it could return a float between -1.0 and 1.0, just like the noise generators do, and that value could be used to trigger the spawning of one kind of instance from the arrays.

Each instance would have user-settable min and max values. For example, -1.0 as min and -0.5 as max for one kind of tree, then -0.5 to 0.0 for another kind of tree, 0.0 to 0.5 for another one, and so on. That would instantiate different kinds of trees while keeping the same density as configured in the VoxelInstanceGenerator.
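
To illustrate the idea, a minimal sketch of that selection logic (the item dictionaries and the selector source are hypothetical, not an existing API):

```gdscript
# Sketch of the proposed range-based selection. The generator is assumed to
# output a selector value in [-1.0, 1.0] (like a noise); each entry here is
# a hypothetical { "scene": PackedScene, "min": float, "max": float }.
extends Node

@export var items: Array[Dictionary] = []

func pick_item(selector_value: float) -> PackedScene:
    # Exactly one item is chosen per spawn point: the first one whose
    # [min, max) range contains the selector value.
    for item in items:
        if selector_value >= item["min"] and selector_value < item["max"]:
            return item["scene"] as PackedScene
    return null
```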

Currently I can only spawn one type / kind of object at a resulting location, and it is hard to mix, for example, a lot of tree types, because they can spawn overlapping each other. With this method only one would spawn at any given resulting location.

Zylann commented 1 month ago

The way you're supposed to do this currently really is to create separate items in the library. If you want them to use the same generator, save it as a file and share it among those items. But indeed, that means it's harder to guarantee items won't overlap (though even if that feature existed, overlaps can still exist when points are close).

One of the reasons it works that way is that the instancer was primarily designed to spawn lots of multimeshes, like grass, and a multimesh takes only one mesh. For those it also doesn't matter whether they overlap with a rock or not. Then every time you have a different mesh, that means a whole different layer of multimeshes, and the more you add, the more draw calls it creates. The requirement that they would use the same spawn points but exclusively pick from them adds even more complexity on top of that. With scenes it sounds easy, but scenes don't scale well in large numbers, and with multimeshes that's significantly more complicated to implement. And I'm not even mentioning the pending possibility of scripting any of this, or the interaction between LODs. The whole plumbing would have to change.

Maybe a solution could be to implement a new VoxelInstanceLibraryItem class.

I don't see how that requires a whole different library. A library is just a list of items. It's rather the items that seem to require changes. What you're asking for sounds like an item that contains sub-items, or multiple meshes or scenes (depending on the kind of rendering backend chosen). Note that not everything requires a scene, and not everything uses a scene internally.

I'm quite confused about how this ties together; it doesn't seem very intuitive. The main thing I take from this is that you want some way to make "what is spawned" part of what a generator decides, instead of a generator spawning all instances of a single thing. I vaguely wondered whether a graph system should be used here, but never elaborated further as I had lots of other things to do.

What you're asking for sounds simple on paper, but goes in a completely opposite direction to how things work internally. Which unfortunately means it requires quite a lot of work, and I can't tell when I'll look into it.

Anyeos commented 1 month ago

Is there a way of spawning and handling things directly from a VoxelGeneratorGraph? I mean some way of supplying the information to a GDScript? like signals? or something that can be triggered to execute a function in a script?

Zylann commented 1 month ago

Is there a way of spawning and handling things directly from a VoxelGeneratorGraph?

No, this is even further away from it. Those systems have zero knowledge of each other and run at very different stages of the pipeline. Voxel generators work on voxels, while instance generators work on meshes. To give you an idea, for a voxel generator to affect instancing, it would have to output voxel data in a special channel, which would then have to be read by the mesher and somehow stored in vertices, just so the instancer can read that info from the vertices and interpret it as a density or something. And finally that info would have to be thrown away because it's useless past this stage, both in the mesh and in the voxel data; it would occupy lots of memory for no good reason. This is a made-up example, things are a bit more complicated than that and currently there is no way to do it without a custom mesher and a custom instance generator, but you get the idea. Also, this actually seems unrelated to your feature request.

I mean some way of supplying the information to a GDScript?

This is not scriptable currently (mainly because of performance, and the fact that the way things work is not really set in stone, especially with your request), so I don't know why GDScript would get involved yet. Before things can even become scriptable, deeper changes have to happen.

like signals?

No signals here. Signals for what? For every instance? Imagine that being called for every blade of grass... no way^^"

Something else to note on top of all this is that the instancer works at different LODs too. It might be workable to have a generator choose between exclusive models for each point it generates within a specific chunk, but that's only for one chunk of a specific LOD. Other chunks of different LODs (larger, or smaller) still generate independently and at different times, placing different kinds of models. What generates on them can still overlap with chunks of different LODs, and if you also don't want that, it makes things even harder. Generally it's something I thought you'd have to live with, to some degree.

Again, what you're asking for requires changing a lot of things at once so that they work together properly. It's not going to happen quickly.

Anyeos commented 1 month ago

My request is to have the ability to choose what to spawn at a resulting location, no matter how it is done. I am not trying to bother you or anything like that, please don't get mad. What I want is a way of choosing one thing over another at the same location. Only that; I don't know exactly how it can be done efficiently.

I have two workarounds that will work: 1) Spawn an empty scene, and in a GDScript use a noise generator with the 3D position as input, then choose from that result what to really instance() and add_child(). 2) In the VoxelInstanceGenerator use the same parameters and the same noise, but with a different offset for each item.

Workaround 1) gives the expected behaviour I am requesting. It chooses what to spawn as a child of itself at the same location, and here the VoxelInstancer doesn't need to provide anything; it only spawns the empty scene at that location. Knowing the 3D position of the scene, I can use a noise to get a value, and use that value to decide what kind of object I will spawn as its child. The con is that it is somewhat slow / heavy.
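
Something like this minimal sketch of the proxy scene (the noise resource and tree scenes are placeholders, not part of the voxel module's API):

```gdscript
# Workaround 1) sketch: a proxy scene spawned by the instancer that picks its
# real content from a noise value sampled at its own position. Deterministic:
# the same world position always gives the same choice.
extends Node3D

@export var noise: FastNoiseLite
@export var tree_scenes: Array[PackedScene] = []

func _ready() -> void:
    if tree_scenes.is_empty():
        return
    var n := noise.get_noise_3dv(global_position)  # roughly in [-1, 1]
    var index := clampi(int(remap(n, -1.0, 1.0, 0.0, float(tree_scenes.size()))), 0, tree_scenes.size() - 1)
    add_child(tree_scenes[index].instantiate())
```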

Workaround 2) is faster, and because I use the same parameters across a variety of items but with a different noise offset for each, it ensures to some degree that none will overlap another. The con is that overlaps can still happen eventually, but I can adjust the offset and other params to improve the result.

So for now I have something to get what I want.

Zylann commented 1 month ago

My request is to have the ability to choose what to spawn at a resulting location, no matter how it is done. I am not trying to bother you or anything like that, please don't get mad. What I want is a way of choosing one thing over another at the same location. Only that; I don't know exactly how it can be done efficiently.

I'm not mad, just making it explicit that doing it efficiently is not a simple change (well, not exactly hard, but not something I can do in one evening), and that it could be a while before I look into it. I may have reacted too negatively though, I was in the middle of something complicated, sorry about that.

Your idea 1) is good when you instance scenes. It's simpler than changing the system or even exposing some kind of scripting, because it just does the same thing through the scene system.

Regarding your idea 2), I just wanted to highlight something the generator does: https://github.com/Zylann/godot_voxel/blob/a78c32e57aa5a079886a4bbcde1cd02440a688a6/terrain/instancing/voxel_instance_generator.cpp#L58 For each specific item, a generator always starts by generating a point cloud over a mesh, with a certain density, which is filtered by noises afterwards. When two different items use the same generator with the same settings, they will still use a different seed to generate the initial points, because that seed is a combination of a hash of the chunk position and the layer ID (aka the ID of the item). So in theory there is already something that makes overlap less likely. I'm not sure what kind of offset you're using though. However, if you're using a low-quality emission mode, such as "vertices", it will tend to make points themselves overlap even for the same item. Using "emit from faces" would result in better spread.
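
Roughly, the seeding can be pictured like this (a simplified GDScript illustration of the idea, not the actual C++ code):

```gdscript
# Simplified illustration of per-item seeding (not the module's implementation):
# combining a hash of the chunk position with the layer/item ID gives each item
# its own random sequence, so two items sharing identical generator settings
# still produce different point sets within the same chunk.
func make_layer_rng(chunk_position: Vector3i, layer_id: int) -> RandomNumberGenerator:
    var rng := RandomNumberGenerator.new()
    rng.seed = hash(chunk_position) ^ hash(layer_id)
    return rng
```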

If I understand correctly, the change you might want would be, instead of each item running its own separate generator, to have a way to associate one generator with multiple items (although they would have to be on the same LOD). It turns out it might not matter whether things are a scene or a multimesh. When a chunk needs to generate, the generator produces a bunch of points in one list, and then they get distributed exclusively between one or more items as multiple lists, based on some probabilities, or a bunch of other noises (if you want different logic, that's where you might want scripting: it would basically give you points and you'd have to decide which ID goes where; though there could be lots of points to go through, so I'm always unsure about allowing scripting in computationally intensive areas). So assuming points themselves don't overlap in the first place, they will each spawn a specific model that way, regardless of what the model is (point generation deals with IDs and lists, not the scenes/multimeshes). There is more stuff to figure out though (such as how it's exposed and set up, how it works with partially-edited chunk octants, how it gets multithreaded, and how it turns out in practice) but that's mostly what I'm thinking.
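
As a rough sketch of that dispatch step (illustrative only; a noise stands in for whatever selection logic would actually be used):

```gdscript
# Illustrative dispatch of one shared point list between items: every point is
# assigned to exactly one item ID, here based on a noise value at the point.
func dispatch_points(points: PackedVector3Array, item_ids: Array[int], noise: FastNoiseLite) -> Dictionary:
    var buckets := {}  # item_id -> Array of Vector3
    for id in item_ids:
        buckets[id] = []
    for p in points:
        # Map the noise value from [-1, 1] to an index into the item list.
        var t := remap(noise.get_noise_3dv(p), -1.0, 1.0, 0.0, 1.0)
        var index := clampi(int(t * item_ids.size()), 0, item_ids.size() - 1)
        buckets[item_ids[index]].append(p)
    return buckets
```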

Zylann commented 3 weeks ago

I still haven't worked on the code for this, but I've done some thinking. I'm starting to have a broad overview of which refactorings to do, and at first glance they would work the way you'd expect from your issue title. However, I'm not yet convinced that it really solves much of what you mentioned afterwards.

Refactoring

The change I would make is to rework item generation as a list of "emitters". Each emitter has one generator and is attached to one LOD level of the terrain. Emitters would then have a list of items, and some configurable logic that decides how the list of items is dispatched on each point. Items would no longer have a generator attached to them.
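
As a very rough sketch of that structure (hypothetical names; none of this exists in the module):

```gdscript
# Hypothetical sketch of the proposed resources: an emitter owns the generator
# and decides how its items share the generated points. Nothing here is real API.
class_name SketchInstanceEmitter
extends Resource

@export var lod_index: int = 0
@export var generator: Resource            # would be a VoxelInstanceGenerator
@export var items: Array[Resource] = []    # library items, with no per-item generator
@export var selection_noise: FastNoiseLite # example of configurable dispatch logic
```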

This change of structure has non-trivial implications under the hood when it comes to identifying layers of items, especially persistent ones (which get saved/loaded as the terrain streams in and out). Assigning the same item to multiple emitters, or the same item multiple times within one emitter, are examples of edge cases that need to be solved. So far, items were a single dictionary under the hood, matching the data structure used internally by the instancer node, with their unique ID as the key. That no longer works with a hierarchy of resources. Of course, such a hierarchy could be flattened into two lists using IDs instead, but it sounds like that would lead to really poor UX.

The difference it actually makes

Assuming such a refactoring is done, what do we actually get from it? The instancer will indeed do what you said: when generating a chunk, it will go through the list of emitters attached to the chunk's LOD level, generate points, and select an item for each point, obtaining a list of points per item, which will later be used to instantiate scenes or set multimesh positions. Such selection can use the numeric ranges you suggested, or any other custom logic.

However, regarding overlaps... think about what I mentioned here. It seems that refactoring won't really make a difference in the outcome.

Without the change, 2 items with the same generator (using emission from Faces) will each generate their points independently. But because they have different IDs, they will use different seeds, leading to different points being generated for each. Therefore, they will already tend not to overlap. There is no difference between generating points where each one chooses between item A or B, and generating points for item A followed by generating more, different points for item B, because both are random; order doesn't matter. Put differently, if your goal is actually to avoid overlaps, there is the same chance for points to be generated close to each other whether they are produced by one pass over a chunk, or by two passes over the same chunk with different seeds. (Note: if you manually incremented the noise seed between two generators, maybe you accidentally made them the same if the item IDs were consecutive? In which case you would indeed get "perfect overlaps", but that's easily solvable by... not changing the seeds, or spreading them apart more.)
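
To make that argument concrete, here is a small generic experiment (plain random points in a unit square, nothing module-specific): drawing 2N points in one pass, or N + N points in two passes with different seeds, yields closest-pair distances from the same distribution.

```gdscript
# Generic illustration: both schemes draw the same total number of independent
# uniform points, so how close the closest pair gets is statistically the same.
func closest_pair_distance(points: Array[Vector2]) -> float:
    var best := INF
    for i in points.size():
        for j in range(i + 1, points.size()):
            best = minf(best, points[i].distance_to(points[j]))
    return best

func random_points(count: int, seed_value: int) -> Array[Vector2]:
    var rng := RandomNumberGenerator.new()
    rng.seed = seed_value
    var points: Array[Vector2] = []
    for i in count:
        points.append(Vector2(rng.randf(), rng.randf()))
    return points

func compare_schemes(n: int) -> void:
    var one_pass := random_points(2 * n, 1234)    # one pass of 2N points
    var two_passes := random_points(n, 1)         # two passes of N points
    two_passes.append_array(random_points(n, 2))  # with different seeds
    print("one pass:   ", closest_pair_distance(one_pass))
    print("two passes: ", closest_pair_distance(two_passes))
```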

Also, it doesn't actually make performance better. I thought 1 generator being used for 2 items would save processing time, but it doesn't: if we want the same density as the output of 2 generators, we have to crank the density up to twice the amount, so the generator ends up doing the same amount of work.

Real problem?

So in light of all this, do you still think you need this change?

It sounds to me that you want this change because you think it would solve overlapping items. It will not; you will still get overlaps.

However, maybe you have other reasons to request this change?

I'm not really opposed to doing it, because it maybe makes some other things a bit more intuitive to configure, but I'm not convinced that it helps that much in terms of outcome.

Anyeos commented 2 weeks ago

Hello, sorry for not reading this before; I was busy, and at the same time I thought you were busy too, so I didn't worry about it anymore.

But I need to know how it actually works. Why do you say there is no way to avoid overlapping? I think the following: if you have some hash or ID from some place, call it a vertex, you can get the same result with the same ID and the same parameters... why not? I don't mean avoiding overlap between, for example, a grass and a tree; I mean between a tree and a tree in the same generator. Because there won't be just one kind of tree; there will be a lot of kinds of trees that use one generation pass, and once it is done (generated), it will not pass again over the same ID / hash / vertex. I don't understand why it needs to check the same place again and again. You said it uses clouds of points, for example on a face? If I didn't misunderstand, then the generator chooses a point and that is all. Only one point is chosen, so why would it choose another one on the same face?

Really, I don't understand how it works and why it is so complicated. I understand that it works on the mesh, but that is like Blender with particles, and in Blender there is no overlap. So I still don't understand why there can be an overlap here if we have one generator but a list of items to spawn (like Blender can actually do with a collection). Of course, if a nearby vertex is chosen and then another vertex very near to the last one, it will eventually overlap somewhat, or not, depending on the scale, the rotation, etc. I don't mean that; I mean I don't want one tree to overlap another tree, because I want to spawn a lot of different trees with the same "spawn logic". Then the generator chooses one and only one item for that ID / location / vertex / face result, whatever the location is. Like Blender does.

In Blender I put a particle generator and a Collection as the spawn objects. Blender chooses one, and only one, object from the collection for a given particle. That is what I want. And by the nature of that, it will not overlap. You don't need to force non-overlapping, because it simply won't.

And please don't get me wrong, I'm talking more to myself than to you, because I really don't understand how that can be so difficult to do. Maybe it is difficult and I'm not realizing it. But I'm a little surprised haha. That's why I wrote so directly, but it's not aimed at you.

I really appreciate your project, all the effort you've put in, and it's very well optimized and I wouldn't like to lose that quality. If what I'm asking for is very complicated, I think it would be a good idea to look at it more calmly.

Zylann commented 2 weeks ago

Why do you say there is no way to avoid overlapping?

In theory that's not impossible of course. It's just hard to do in every case, in the context of this terrain system.

If you have some hash or ID from some place, call it a vertex, you can get the same result with the same ID and the same parameters... why not?

You can, but keep in mind this terrain system has LODs. There are multiple layers of meshes at different sizes with different geometry, and their triangles are used to spawn instances in the same areas (you could choose not to, of course, but that could be constraining). So picking a "place" is a bit of a challenge, because geometry differs across LODs, and unless your world is flat, triangles don't even have regular sizes (check the wireframe view). That sounds easy in 2D, where you can pick a grid and snap it onto some heightmap, but voxels are 3D, so you can't do that here. Also, meshes are required to spawn things. When instances spawn on a chunk, it doesn't have access to the meshes of neighbor chunks, because they might not even exist yet in the first place. Also, if "emitters" are introduced, they can only be defined for a specific LOD, so even if one emitter can spawn a bunch of different instances that never overlap within its LOD, meshes from parent or child LODs are still generated independently, at different times, with different geometry, and therefore might spawn points that are very close. So generally, if overlap avoidance is desired, it has to use approaches that favor parallelism and don't require dependencies between chunks.

I don't understand why it needs to check the same place again and again. You said it uses clouds of points, for example on a face? If I didn't misunderstand, then the generator chooses a point and that is all.

What you say would work within a chunk, but there are neighbor chunks too. And parent chunks. And child chunks. They all generate at different times, different threads, and points can end up in the same spot (maybe not exactly, depending on emission mode, but very close).

Only one point is chosen, so why would it choose another one on the same face?

Right now a face can be picked multiple times if it is large enough, for the same reason you could have two points close to each other when generating points in something as simple as a rectangle. You could have a whole chunk that is just two large triangles (for example if you crank up mesh simplification), so they have to be covered more to maintain the same density. It really just picks N random points on the mesh, and those points could be anywhere. Even if at most one point per face were chosen, look again at what the wireframe looks like: some triangles are small enough that you could end up with two points very close even if they use different faces. This is an example of one of the chunks the instancer has to work with (and yet this one is unusually regular, it's not always that forgiving): [image: wireframe of a voxel mesh chunk]
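
For reference, picking a uniform random point on a triangle is typically done like this (a generic technique, not necessarily the module's exact code), which is why two samples can land arbitrarily close together even on different faces:

```gdscript
# Generic uniform sampling of a point on a triangle using the square-root trick
# (area-weighted face picking is omitted). Nothing constrains two successive
# samples from ending up right next to each other.
func random_point_on_triangle(a: Vector3, b: Vector3, c: Vector3, rng: RandomNumberGenerator) -> Vector3:
    var r1 := sqrt(rng.randf())
    var r2 := rng.randf()
    return a * (1.0 - r1) + b * (r1 * (1.0 - r2)) + c * (r1 * r2)
```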

I understand that it works on the mesh, but that is like Blender with particles, and in Blender there is no overlap

I'm not sure about that, or you'd have to tell me where you saw it. Particles do overlap, and the suggestions are the same as what I do: https://blender.stackexchange.com/questions/43485/how-can-i-emit-particles-without-them-overlapping-each-other?rq=1. It also depends on the geometry: if it is very regular then of course that contributes, and if you don't randomize the points then of course that also contributes, though you get grid-like patterns. And voxel meshes are not like that. There are also modes like "Random" or "Jittered". What the instancer does in "faces" mode is currently "Random". I'm not sure what "Jittered" actually does, but it seems dependent on the area of triangles. I actually tried these options on an OBJ chunk dumped from Godot, and it had overlapping particles. You'll see another suggestion lower down about geometry nodes (Poisson disc sampling?), but that's not particles, and it does have to check every other point when spawning new ones, so really not as simple as a hash. Also it doesn't solve the problem of different chunks, because it's Blender, not the same context as a realtime chunked terrain system. Blender can afford to focus on just one mesh and take more time to generate points, while the terrain system has many touching chunks to deal with in realtime on players' computers.
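
For comparison, a naive dart-throwing Poisson-disc sketch (generic, unrelated to Blender's or the module's code) shows why it is not as cheap as a hash: every candidate has to be tested against all accepted points:

```gdscript
# Naive dart-throwing Poisson-disc sampling in a unit square. The inner loop
# over every accepted point is the cross-point dependency that a simple
# per-point hash cannot provide.
func poisson_disc_points(min_distance: float, max_tries: int, rng: RandomNumberGenerator) -> Array[Vector2]:
    var accepted: Array[Vector2] = []
    for i in max_tries:
        var candidate := Vector2(rng.randf(), rng.randf())
        var ok := true
        for p in accepted:
            if candidate.distance_to(p) < min_distance:
                ok = false
                break
        if ok:
            accepted.append(candidate)
    return accepted
```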

Of course, if a nearby vertex is chosen and then another vertex very near to the last one, it will eventually overlap somewhat, or not, depending on the scale, the rotation, etc. I don't mean that; I mean I don't want one tree to overlap another tree, because I want to spawn a lot of different trees with the same "spawn logic". Then the generator chooses one and only one item for that ID / location / vertex / face result, whatever the location is

The first cause you mention is often what will lead stuff to overlap (if you emit from faces), so what you say here is a bit contradictory.

Here are some details:

You can see the code for each mode here: https://github.com/Zylann/godot_voxel/blob/fa92fc3a874c27df5842251e9801d3f740a37287/terrain/instancing/voxel_instance_generator.cpp#L94

Blender chooses one, and only one, object from the collection for a given particle. That is what I want. And by the nature of that, it will not overlap

Already said earlier why that won't work as reliably here; assuming you mean "it chooses one object per triangle/vertex", look at the wireframe of voxel meshes. They are not like what you'd model in Blender.

Overall, there might be some tweaks to improve the situation, maybe by using a combination of your proposal and different emission modes; it's just not easy to do very reliably. (Also not forgetting the other aspects of such a change, apart from overlaps, which still have to be fully figured out in terms of implementation, because to be fully supported it really changes a lot of the internal logic and how things are exposed.)

Note: if you'd like to discuss more details in voice/screenshare I'm available on the Discord today 26/018/2024 (or later days, but only after 6pm UTC).

Zylann commented 2 weeks ago

First prototype:

[screenshot of the prototype] Here, points generated by a single generator are dispatched equally into 3 different multimeshes per chunk.

It's in the instancer_emitters branch. It breaks compatibility (will see about that later). It probably has bugs. May need manual refresh sometimes. Also, persistent items are broken (the change of structure means identifying what is what in saved chunks became more complicated, so for now I worked around it but it's not reliable). Things aren't set in stone, I just tried implementing something until I get the concept working.

Side note: [screenshot] the kind of triangles the instancer has to work with.

Anyeos commented 1 week ago

Hello, how are you? I have an idea that is more useful to me. Anyway, I don't think it's a bad idea to have a list of items to instantiate instead of just one item per generator. But I've solved it by instantiating a scene in which I choose another scene to instantiate, and that way I'm already selecting from a list. The speed is very fast, so I don't see any disadvantages. I've managed to instantiate hundreds of trees with just one scene without noticing a delay.

Something I want to clarify is that I'm developing video games, and that's why I need something practical that works. I'm not asking out of personal taste; these are things I need for my next projects.

I don't care if it's not perfect, if it fails a little and some overlaps occur; what I do care about is that it's not too obvious.

Suggestion: New Emit Mode: "OneByFaces" -> #695

Edit note: I implemented it myself, so don't worry, it is already implemented. One question: if you want to see it, I can make a fork and put that code there.
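
For context, the "one point per face" idea could be sketched like this (not the actual implementation behind #695, just an illustration of the concept: each triangle emits at most one deterministically placed point):

```gdscript
# Rough sketch of a "one point per face" emission idea (not the real #695 code):
# each triangle gets a deterministic per-face seed, emits at most one point,
# and is skipped entirely when the density roll fails, so a face never spawns twice.
func emit_one_per_face(vertices: PackedVector3Array, indices: PackedInt32Array,
        density: float, chunk_seed: int) -> PackedVector3Array:
    var points := PackedVector3Array()
    var rng := RandomNumberGenerator.new()
    for i in range(0, indices.size(), 3):
        rng.seed = chunk_seed ^ hash(i)
        if rng.randf() > density:
            continue
        var a := vertices[indices[i]]
        var b := vertices[indices[i + 1]]
        var c := vertices[indices[i + 2]]
        var r1 := sqrt(rng.randf())
        var r2 := rng.randf()
        points.append(a * (1.0 - r1) + b * (r1 * (1.0 - r2)) + c * (r1 * r2))
    return points
```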