AllenInstitute / ophys_etl_pipelines

Pipelines and modules for processing optical physiology data

Re-run segmentation example experiments #312

Closed: danielsf closed this issue 3 years ago

danielsf commented 3 years ago

Once #294 is done and we are storing ROIs directly in the HDF5 output file, we should re-run our 18 example experiments (or the data cube delivered by the science team, if that is available at this time) to both

For reference, the ophys_experiment_ids of the example experiments we have been using are

experiment_ids=(785569470
785569447
788422859
795012008
795011996
788422825
795897800
795901895
795901850
806862946
803965468
806928824
951980473
1048483611
951980484
1048483613
1048483616
850517348)

Tasks

Optional

Validation

djkapner commented 3 years ago

copy of email to Pika:

Hi All,

I wanted to document some steps to take to look at the latest results from segmentation.

If you have any trouble running this notebook, let me know so I can fix it.

Dan

djkapner commented 3 years ago

And the slurm script:

#!/bin/bash
#SBATCH --job-name=ophys-segmentation
#SBATCH --mail-type=NONE
#SBATCH --ntasks=24
#SBATCH --mem=140gb
#SBATCH --time=04:00:00
#SBATCH --output=/allen/aibs/informatics/danielk/deepinterpolation/logs/segmentation_%A-%a.log
#SBATCH --partition braintv
#SBATCH --array=12

# node-local scratch space for this job
export TMPDIR=/scratch/fast/${SLURM_JOB_ID}
# container image pinned to a specific commit for reproducibility
image=docker://alleninstitutepika/ophys_etl_pipelines:14d0157a589fc3b4e5055fed63709a783070b6bf

eids=(
785569470
785569447
788422859
795012008
795011996
788422825
795897800
795901895
795901850
806862946
803965468
806928824
951980484
1048483611
1048483613
951980473
1048483616
850517348
)

# per-experiment paths: denoised movie, graph input, and the HDF5 file
# where detection results and logs are written
expdir=/allen/programs/braintv/workgroups/nc-ophys/danielk/deepinterpolation/experiments/ophys_experiment_${eids[$SLURM_ARRAY_TASK_ID]}
video=${expdir}/videos/deep_denoised.h5
graph=${expdir}/backgrounds/deep_denoised_filtered_hnc_Gaussian_graph.pkl
logpath=${expdir}/djk_sep01_assessment.h5

SINGULARITY_TMPDIR=${TMPDIR} singularity run \
    --bind /allen:/allen,${TMPDIR}:/tmp \
    ${image} \
        /envs/ophys_etl/bin/python -m ophys_etl.modules.segmentation.modules.feature_vector_segmentation \
        --graph_input ${graph} \
        --video_input ${video} \
        --log_path ${logpath} \
        --n_parallel_workers 24 \
        --seeder_args.keep_fraction 0.1 \
        --seeder_args.n_samples 24
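
For anyone reading the script: `SLURM_ARRAY_TASK_ID` is a zero-based index into the `eids` bash array, so `--array=12` runs a single experiment (951980484), while `--array=0-17` would fan out over all 18. A standalone sketch of just the indexing (the task-id assignment is faked here; slurm sets it in a real job):

```shell
# same experiment ids as in the slurm script above
eids=(
785569470 785569447 788422859 795012008 795011996 788422825
795897800 795901895 795901850 806862946 803965468 806928824
951980484 1048483611 1048483613 951980473 1048483616 850517348
)
# slurm exports this in a real array job; hard-coded here for illustration
SLURM_ARRAY_TASK_ID=12
echo "selected experiment: ${eids[$SLURM_ARRAY_TASK_ID]}"
```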
djkapner commented 3 years ago

For the filter step, I used the same slurm script as above, but appended:

SINGULARITY_TMPDIR=${TMPDIR} singularity run \
    --bind /allen:/allen,${TMPDIR}:/tmp \
    ${image} \
        /envs/ophys_etl/bin/python -m ophys_etl.modules.segmentation.modules.filter_z_score \
        --min_z 2.0 \
        --graph_input ${graph} \
        --pipeline_stage post_detect_filter \
        --log_path ${logpath}
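
On the `--min_z 2.0` flag: my understanding is that the filter keeps detections whose z-scored metric clears the threshold. A hypothetical, self-contained sketch of that idea only; the function name, metric values, and list-of-floats interface here are made up for illustration, not the actual schema `filter_z_score` reads from the HDF5 log:

```python
import statistics

def z_score_filter(values, min_z=2.0):
    """Return indices whose z-score, (value - mean) / std, is >= min_z.

    Illustrative only: the real module operates on ROI metrics stored
    in the segmentation HDF5 log, not on a bare list of floats.
    """
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)
    return [i for i, v in enumerate(values) if (v - mu) / sigma >= min_z]

# one value far above the rest survives a min_z=2.0 cut
scores = [1.0, 1.1, 0.9, 1.0, 1.2, 5.0]
print(z_score_filter(scores, min_z=2.0))  # -> [5]
```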
djkapner commented 3 years ago

This sqlite file now has the detect and filter steps for all 18 experiments:

sqlite_path = Path("/allen/aibs/informatics/segmentation_eval_dbs/djk_sep01.db")
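
To peek inside that evaluation database without assuming its schema, something like the following works; the table names are discovered at runtime from `sqlite_master` rather than hard-coded:

```python
import sqlite3
from pathlib import Path

def list_tables(sqlite_path):
    """Return the table names in a sqlite file, opened read-only."""
    uri = f"file:{Path(sqlite_path)}?mode=ro"
    with sqlite3.connect(uri, uri=True) as conn:
        rows = conn.execute(
            "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
        ).fetchall()
    return [name for (name,) in rows]

# e.g. list_tables("/allen/aibs/informatics/segmentation_eval_dbs/djk_sep01.db")
```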
djkapner commented 3 years ago

List of experiments and notes, comparing the LIMS legacy valid ROIs against the current state of the detect + z-score filter valid ROIs.

Deepscope SLC

New one clearly an improvement over legacy. Will need classifier to clean some up.

3P SLC

MESO SST

MESO VIP