Closed brisvag closed 2 years ago
Merging #122 (69ca615) into master (b51d3e6) will decrease coverage by 0.86%. The diff coverage is 83.37%.
```
@@            Coverage Diff             @@
##           master     #122      +/-   ##
==========================================
- Coverage   84.24%   83.38%   -0.87%
==========================================
  Files          93       93
  Lines        2101     2112      +11
==========================================
- Hits         1770     1761       -9
- Misses        331      351      +20
```
| Impacted Files | Coverage Δ | |
|---|---|---|
| blik/datablocks/\_\_init\_\_.py | 100.00% <ø> (ø) | |
| blik/datablocks/abstractblocks/\_\_init\_\_.py | 100.00% <ø> (ø) | |
| blik/datablocks/abstractblocks/metablock.py | 100.00% <ø> (ø) | |
| blik/datablocks/abstractblocks/multiblock.py | 70.58% <ø> (ø) | |
| blik/datablocks/multiblocks/\_\_init\_\_.py | 100.00% <ø> (ø) | |
| blik/datablocks/multiblocks/particleblock.py | 66.66% <ø> (ø) | |
| blik/datablocks/multiblocks/transformblock.py | 31.25% <ø> (ø) | |
| blik/datablocks/simpleblocks/\_\_init\_\_.py | 100.00% <ø> (ø) | |
| blik/datablocks/simpleblocks/lineblock.py | 96.77% <ø> (ø) | |
| blik/datablocks/simpleblocks/propertyblock.py | 52.77% <0.00%> (ø) | |
| ... and 91 more | | |
Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

Powered by Codecov. Last update 1f0fa4b...69ca615. Read the comment docs.
Yeah, it must be working, because I can load a whole dataset without killing my PC :P But yeah, `da.from_array` specifically works well with `np.memmap`!
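To illustrate the point about `da.from_array` and `np.memmap`, here's a minimal sketch (not blik's actual code; the file path and shape are made up for the example). A memory-mapped file wrapped in a chunked dask array means no chunk is read from disk until it is computed, which is why a whole dataset can be opened without exhausting RAM:

```python
import os
import tempfile

import numpy as np
import dask.array as da

# Stand-in for a large on-disk volume (real MRC readers expose the data
# as an np.memmap in much the same way).
path = os.path.join(tempfile.mkdtemp(), "volume.dat")
np.arange(4 * 64 * 64, dtype=np.float32).reshape(4, 64, 64).tofile(path)

# Open the file without loading it: np.memmap only maps pages on access.
mm = np.memmap(path, dtype=np.float32, mode="r", shape=(4, 64, 64))

# Wrap the memmap in a dask array; each (1, 64, 64) chunk is read lazily.
lazy = da.from_array(mm, chunks=(1, 64, 64))

# Only the first slice's chunk is actually pulled from disk here.
first = lazy[0].compute()
print(lazy.shape, first.shape)
```

napari can consume `lazy` directly (e.g. via `viewer.add_image`), computing only the chunks needed for the current view.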
As mentioned in #120, dask is a better way to handle lazy loading in napari. This PR changes all our lazy loading to use dask and simply lets napari do the work.
It works, but there's an issue: when using dask, napari automatically sets the contrast limits to `(0, 1)`. I'm not sure how to get around this... Is there a reasonable guess of the data range with mrc data, for example?