gerlichlab / looptrace

Fork (from EMBL Gitlab) of the looptrace project: https://git.embl.de/grp-ellenberg/looptrace
MIT License

Parameter inference from .nd2 metadata parsed during conversion to .zarr #164

Open · vreuter opened this issue 11 months ago

vreuter commented 11 months ago

Newer version: here's what we could infer

Earlier version:

This would share code from image_io.stack_nd2_to_dask and image_io.stack_tif_to_dask. Then zarr_conversions could be parsed directly from config, obviating the need to build an ImageHandler in convert_datasets_to_zarr.py.
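
A rough sketch of the parsing side, assuming the `nd2` package is what reads the files (the field names below mirror the `.zattrs` example later in this thread; `parse_nd2_metadata` is a hypothetical helper, not existing looptrace code):

```python
import nd2

def parse_nd2_metadata(path: str) -> dict:
    """Extract channel/microscope/voxel metadata from an .nd2 file."""
    with nd2.ND2File(path) as f:
        vs = f.voxel_size()  # micrometers, named tuple (x, y, z)
        channels = f.metadata.channels or []
        meta = {
            f"channel_{i}": {
                "emissionLambdaNm": ch.channel.emissionLambdaNm,
                "excitationLambdaNm": ch.channel.excitationLambdaNm,
                "name": ch.channel.name,
            }
            for i, ch in enumerate(channels)
        }
        # Assume the microscope settings are shared across channels.
        scope = channels[0].microscope
        meta["microscope"] = {
            "immersionRefractiveIndex": scope.immersionRefractiveIndex,
            "modalityFlags": scope.modalityFlags,
            "objectiveMagnification": scope.objectiveMagnification,
            "objectiveName": scope.objectiveName,
            "objectiveNumericalAperture": scope.objectiveNumericalAperture,
            "zoomMagnification": scope.zoomMagnification,
        }
        meta["voxel_size"] = [vs.z, vs.y, vs.x]  # (z, y, x), as written to .zattrs
    return meta
```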

vreuter commented 9 months ago

NB: the .nd2 metadata parse accompanies the conversion to .zarr. This is at the beginning of the pipeline and could thus inform all subsequent steps.
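
Concretely (a sketch with illustrative paths; `parse_nd2_metadata` is the hypothetical helper sketched above), the conversion step could stash the parsed values on each per-FOV zarr group, and every later step would read them back from `.zattrs`:

```python
import zarr

# At conversion time: attach the parsed .nd2 metadata to the per-FOV group,
# which zarr persists as the group's .zattrs.
group = zarr.open_group("seq_images_zarr/P0001.zarr", mode="a")
group.attrs["metadata"] = parse_nd2_metadata("P0001.nd2")
```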

vreuter commented 9 months ago

Here's an example of the metadata:

P0001.zarr/.zattrs

```json
{
    "metadata": {
        "channel_0": {
            "emissionLambdaNm": 700.5,
            "excitationLambdaNm": null,
            "name": "Far Red"
        },
        "channel_1": {
            "emissionLambdaNm": 630.0,
            "excitationLambdaNm": 561.0,
            "name": "Red"
        },
        "microscope": {
            "immersionRefractiveIndex": 1.515,
            "modalityFlags": [
                "fluorescence",
                "camera"
            ],
            "objectiveMagnification": 60.0,
            "objectiveName": "Plan Apo \u03bb 60x Oil",
            "objectiveNumericalAperture": 1.4,
            "zoomMagnification": 1.0
        },
        "voxel_size": [
            0.3,
            0.107325563330673,
            0.107325563330673
        ]
    },
    "multiscales": [
        {
            "axes": [
                {
                    "name": "t",
                    "type": "time",
                    "unit": "minute"
                },
                {
                    "name": "c",
                    "type": "channel"
                },
                {
                    "name": "z",
                    "type": "space",
                    "unit": "micrometer"
                },
                {
                    "name": "y",
                    "type": "space",
                    "unit": "micrometer"
                },
                {
                    "name": "x",
                    "type": "space",
                    "unit": "micrometer"
                }
            ],
            "datasets": [
                {
                    "coordinateTransformations": [
                        {
                            "scale": [
                                1.0,
                                1.0,
                                0.3,
                                0.107325563330673,
                                0.107325563330673
                            ],
                            "type": "scale"
                        }
                    ],
                    "path": "0"
                }
            ],
            "name": "seq_images_zarr_P0001.zarr",
            "version": "0.4"
        }
    ]
}
```
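
A downstream step could then recover the conversion-time parameters without re-reading any .nd2 file; a minimal sketch (paths illustrative):

```python
import zarr

group = zarr.open_group("P0001.zarr", mode="r")
meta = group.attrs["metadata"]
z_um, y_um, x_um = meta["voxel_size"]          # 0.3, 0.1073..., 0.1073...
em_nm = meta["channel_0"]["emissionLambdaNm"]  # 700.5
```
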
vreuter commented 9 months ago

Note also that this metadata is available for each FOV, and would therefore work nicely even if we ultimately parallelise entirely across FOVs (#75); see the sketch below.
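
A sketch of what that could look like (`process_fov` and the glob pattern are hypothetical): each worker reads its own FOV's `.zattrs`, so no shared state is required:

```python
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

import zarr

def process_fov(fov_path: Path) -> None:
    # Each FOV's zarr group carries its own metadata.
    meta = zarr.open_group(str(fov_path), mode="r").attrs["metadata"]
    ...  # run the per-FOV pipeline steps using `meta`

if __name__ == "__main__":
    fovs = sorted(Path("seq_images_zarr").glob("P*.zarr"))
    with ProcessPoolExecutor() as pool:
        list(pool.map(process_fov, fovs))
```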