yt-project / yt_idv

Interactive volume rendering for yt

BlockCollection scaling issue #45

Closed chrishavlin closed 1 year ago

chrishavlin commented 2 years ago

For datasets with a large code_length, the grid traversal in BlockCollection can run into scaling issues, leading to views that do not make sense.

For example:

import yt
import yt_idv

ds = yt.load_sample("GasSloshing")
rc = yt_idv.render_context(height=800, width=800, gui=True)
sg = rc.add_scene(ds, "dens", no_ghost=True)
rc.run()

This sets up a very confusing viewpoint:

(screenshot: snap_0000)

This happens because:

  1. the dataset has a large extent:
    ds.domain_width
    Out[6]: unyt_array([7.40544e+24, 7.40544e+24, 7.40544e+24], 'code_length')
  2. the camera position and clipping planes are scaled to unitary values
  3. when the grid traversal in BlockCollection pulls LeftEdge and RightEdge, those values are still in code_length

So you end up with a view of a very small region of the domain; the sketch below makes the mismatch concrete.
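Here is a minimal sketch using the same GasSloshing sample; it contrasts the unitary coordinates the camera works in with the code_length values carried by the grid edges:

import yt

ds = yt.load_sample("GasSloshing")

# camera position and clipping planes live in unitary coordinates,
# where the full domain spans [0, 1]:
print(ds.domain_width.to("unitary"))  # [1. 1. 1.] unitary

# the edges pulled during grid traversal remain in code_length:
grid = ds.index.grids[0]
print(grid.LeftEdge)   # values of order 1e24 code_length
print(grid.RightEdge)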

The fix might be as simple as setting the scale attribute when instantiating the BlockCollection, but we should double-check that it works as expected and maybe add some logic for when to set scale=True (should it always be True?).

Example successfully using scale:

import yt
from yt_idv import render_context
from yt_idv.scene_data.block_collection import BlockCollection
from yt_idv.scene_components.blocks import BlockRendering

ds = yt.load('flash_idv_tests/m1.0_p16_b2.0_300k_plt50/multitidal_hdf5_plt_cnt_0200')
ad = ds.all_data()

rc = render_context(height=800, width=800, gui=True)
sg = rc.add_scene(ds, None)

# manually build the block collection, with scale=True
ad_block_data = BlockCollection(data_source=ad, scale=True)
ad_block_data.add_data("density", no_ghost=True)
# now a rendering context with the block collection
ad_block_rendering = BlockRendering(data=ad_block_data)

# add it to the scene
sg.data_objects.append(ad_block_data)
sg.components.append(ad_block_rendering)

rc.run()

(screenshot: snap_0000)

chrishavlin commented 2 years ago

It occurred to me that the scale attribute might only work when the data_source is the full ds.all_data(). Here's the loop that re-scales when BlockCollection.scale=True:

        if self.scale:
            left_min = np.ones(3, "f8") * np.inf
            right_max = np.ones(3, "f8") * -np.inf
            for block in self.data_source.tiles.traverse():
                np.minimum(left_min, block.LeftEdge, left_min)
                np.maximum(right_max, block.RightEdge, right_max)
            scale = right_max.max() - left_min.min()
            for block in self.data_source.tiles.traverse():
                block.LeftEdge -= left_min
                block.LeftEdge /= scale
                block.RightEdge -= left_min
                block.RightEdge /= scale

Since it only normalizes the edges using the blocks contained in data_source, I suspect there would be some mismatch with the dataset's unitary scale. Could/should we simply pass the full ds.domain_width in to BlockCollection?
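For reference, a hypothetical variant of that loop that normalizes by the full domain instead of the data_source bounds might look like this sketch (scale_to_domain is an illustrative name, not existing API):

# hypothetical sketch: normalize block edges by the full domain extent,
# not by the bounds of the blocks contained in data_source
def scale_to_domain(data_source, ds):
    left = ds.domain_left_edge.to("code_length").d
    width = float(ds.domain_width.to("code_length").d.max())
    for block in data_source.tiles.traverse():
        block.LeftEdge = (block.LeftEdge - left) / width
        block.RightEdge = (block.RightEdge - left) / width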

chrishavlin commented 2 years ago

Thought some more about the last comment -- maybe scaling the data_source from 0 to 1 using the data_source bounds is actually the correct behavior, so that the selection always fills the screen coordinates. So maybe the scaling actually works as intended!
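As a toy illustration of that point (pure numpy, with made-up bounds): normalizing a selection by its own bounds maps it onto [0, 1] so it fills the unitary view, while normalizing by the full domain would leave it in a sub-interval:

import numpy as np

# made-up selection inside a domain of width 7.4e24 code_length
sel_left, sel_right = 2.0e24, 5.0e24
domain_left, domain_width = 0.0, 7.4e24

edges = np.array([sel_left, sel_right])

# scaled by the selection's own bounds: fills [0, 1]
print((edges - sel_left) / (sel_right - sel_left))  # [0. 1.]

# scaled by the full domain: occupies only part of [0, 1]
print((edges - domain_left) / domain_width)  # [0.27027027 0.67567568]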

chrishavlin commented 2 years ago

I'll look back at this soon to see if it's actually an issue or not...