alicevision / Meshroom

3D Reconstruction Software
http://alicevision.org

[bug] duplicate node then delete data destroys data from original #604

Open Baasje85 opened 5 years ago

Baasje85 commented 5 years ago

Describe the bug When "Duplicate Nodes From Here" is executed and "Delete Data From Here" is then used on the duplicated branch, the data of the original branch is deleted as well. This can be explained by the fact that the cache hashes do not change, but it still cost me a day of computation.

To Reproduce Steps to reproduce the behavior:

  1. right click on a processed depth map in the graph editor
  2. duplicate depthmap from here
  3. right click on the duplicated node, delete data (from here)
  4. notice that the green progress bars of the entire graph are cleared, not just those of the duplicated subgraph

Expected behavior Make a distinction between "the data" and "the function that is cached" versus "the node" that gives access to it. Do reference counting for the cache.

Screenshots (screenshot attached to the original issue)


natowi commented 5 years ago

This is not a bug, but a useful suggestion to improve the user experience. Using "Delete Data From Here" on a duplicated graph is not necessary, as you are expected to change parameters first. (Why would you want to delete data for a duplicated node if you did not change any settings?)

For each node, a cache folder named with a hash of the node's settings is generated. Nodes with the same settings and inputs use the same cache folder (to reduce the required disk space and computation).
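The shared-cache behaviour described above can be sketched as follows. This is a simplified illustration, not Meshroom's actual uid computation (the real one lives in Meshroom's core and covers more attributes), but it shows why a duplicated node with unchanged settings points at the same cache folder:

```python
import hashlib
import json

def cache_uid(node_type: str, settings: dict) -> str:
    """Derive a deterministic cache-folder name from a node's type and settings.
    Two nodes with identical settings map to the same folder, so they share data.
    (Illustrative only; Meshroom's real uid computation differs in detail.)"""
    payload = json.dumps({"type": node_type, "settings": settings}, sort_keys=True)
    return hashlib.sha1(payload.encode("utf-8")).hexdigest()

# A duplicated node with unchanged settings yields the same uid,
# hence the shared cache folder; changing any setting yields a new one.
a = cache_uid("DepthMap", {"downscale": 2})
b = cache_uid("DepthMap", {"downscale": 2})  # duplicate, same settings
c = cache_uid("DepthMap", {"downscale": 1})  # changed setting -> new folder
assert a == b and a != c
```

This is why "Delete Data" on the duplicate removes the original's data: both nodes resolve to the same folder on disk.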

Proposed solution: when clicking on "Delete Data", add a notice that the data for all nodes with the same settings will be deleted: "This will also delete the data for FeatureExtraction {list of affected nodes}".

Baasje85 commented 5 years ago

> (Why would you want to delete data for a duplicated node if you did not change any settings?)

Because duplicating a node suggests that the program also duplicates its contents, which is more than 10 GB of data.

> For each node, a cache folder named with a hash of the node's settings is generated. Nodes with the same settings and inputs use the same cache folder (to reduce the required disk space and computation).

> Proposed solution: when clicking on "Delete Data", add a notice that the data for all nodes with the same settings will be deleted: "This will also delete the data for FeatureExtraction".

I would suggest reference counting (at least within the same project). On deletion, the dialog could read "This will delete data for the following nodes", followed by a list of the affected nodes sorted by name.
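A hedged sketch of what such reference counting might look like. The node representation, function name, and graph structure here are invented for illustration and are not Meshroom's API; the idea is simply to count how many nodes in the project share a cache uid and only remove the folder when the last reference goes:

```python
from collections import Counter
from pathlib import Path
import shutil

def delete_data(graph_nodes, node_to_delete, cache_root: Path):
    """Delete a node's cache folder only if no other node in the project
    references the same uid; otherwise keep the data and report the sharers.
    Returns (deleted, sorted list of other nodes sharing the folder)."""
    counts = Counter(n["uid"] for n in graph_nodes)
    uid = node_to_delete["uid"]
    sharers = sorted(n["name"] for n in graph_nodes
                     if n["uid"] == uid and n is not node_to_delete)
    if counts[uid] > 1:
        # Other nodes still reference this cache folder: keep the data
        # and let the UI warn the user with the sorted list of sharers.
        return False, sharers
    shutil.rmtree(cache_root / uid, ignore_errors=True)
    return True, sharers
```

With two duplicated DepthMap nodes sharing a uid, deleting data on one would return `(False, ["DepthMap_1"])`, so the UI can show the warning instead of destroying the shared folder.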

natowi commented 5 years ago

> (Why would you want to delete data for a duplicated node if you did not change any settings?)

> Because duplicating a node suggests that the program also duplicates its contents, which is more than 10 GB of data.

In Meshroom, "Duplicate Node(s)" duplicates nodes in the Graph view along with their settings and downstream connections. This behaviour is not uncommon and can be found in Blender, for example. This detail could be added to the documentation.

> I would suggest reference counting (at least within the same project). On deletion, present a list of the affected nodes sorted by name.

Reference counting would be indeed useful to manage node variations https://github.com/alicevision/meshroom/issues/592#issuecomment-521387744 https://github.com/alicevision/meshroom/issues/361

natowi commented 5 years ago

There will be some improvements with the next release: "If a node in a branch has the same values as a node in another branch, an icon will be displayed to indicate that this node is a duplicate. When you hover over the icon, a tooltip will be displayed with the name of the original node." https://github.com/alicevision/meshroom/pull/612

stale[bot] commented 4 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

natowi commented 3 years ago

(image attachment: dup2)