jiyuuchc / lacss

A deep learning model for single cell segmentation from microscopy images.
https://jiyuuchc.github.io/lacss/
MIT License

Unable to understand several hardcoded variables in the training example notebook #6

Closed: ajinkya-kulkarni closed this issue 9 months ago

ajinkya-kulkarni commented 1 year ago

Hello, thanks for this repo! I am trying to train this model on my custom dataset, but I am unable to understand several of the hardcoded variables. For example, what do 0.4 and 128 mean in:

cfg = {
  "backbone": {
    "drop_path_rate": 0.4
  },
  "segmentor": {
    "instance_crop_size": 128
  }
}

And if I don't want augmentations, how can I switch them off in these lines? If I simply comment out the augmentation part, how are the locations defined?

import tensorflow as tf

def parser(data):

    image = data['image']
    label = data['label']
    locations = data['centroids']

    height = tf.shape(image)[0]
    width = tf.shape(image)[1]

    # Simple augmentations: random horizontal and vertical flips.
    # The label image and the centroid coordinates must be transformed
    # together with the image to stay consistent.
    if tf.random.uniform(()) >= 0.5:
        image = tf.image.flip_left_right(image)
        label = label[:, ::-1]
        locations = locations * [1, -1] + [0, width]   # x -> width - x

    if tf.random.uniform(()) >= 0.5:
        image = tf.image.flip_up_down(image)
        label = label[::-1, :]
        locations = locations * [-1, 1] + [height, 0]  # y -> height - y

    # Pad the locations tensor to a fixed length (512 here) so that all
    # elements of the dataset have the same shape; -1 marks padding.
    # tf.shape works even when the number of cells is not known statically.
    n_pad = 512 - tf.shape(locations)[0]
    locations = tf.pad(locations, [[0, n_pad], [0, 0]], constant_values=-1)

    return (
        dict(
            image=image,
            gt_locations=locations,
        ),
        dict(
            gt_labels=label,
        ),
    )

Thanks, Ajinkya

jiyuuchc commented 1 year ago

Model hyperparameters

cfg = {
  "backbone": {
    "drop_path_rate": 0.4
  },
  "segmentor": {
    "instance_crop_size": 128
  }
}

drop_path_rate is the drop probability for stochastic depth, a regularization technique used when training very deep networks. See https://arxiv.org/abs/1603.09382v3 for a technical explanation.
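
As a rough illustration of what that knob controls, here is a minimal stochastic-depth sketch in TensorFlow, to match the notebook code above; this is not LACSS's actual implementation, and the function name and signature are just for this example:

def drop_path(x, drop_rate, training):
    # Stochastic depth: during training, zero the residual branch for a
    # random subset of samples and rescale the survivors so the expected
    # output is unchanged; at inference the branch always passes through.
    if not training or drop_rate == 0.0:
        return x
    keep_prob = 1.0 - drop_rate
    # One Bernoulli draw per sample, broadcast over all other axes.
    shape = [tf.shape(x)[0]] + [1] * (len(x.shape) - 1)
    mask = tf.floor(keep_prob + tf.random.uniform(shape, dtype=x.dtype))
    return x * mask / keep_prob

With drop_rate = 0.4, a branch would be skipped for roughly 40% of the training samples; note that backbones often scale the rate with depth, so check the lacss source for the exact schedule.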

instance_crop_size sets the size (in pixels) of the analysis window of the segmentation head. For efficiency, the segmentation head does not explicitly compute outputs for pixels far away from the predicted cell centroid and simply assumes they are background pixels. This allows much faster computation, but if set too small it may result in clipping of the segmentations.
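
For example, if your cells are large relative to the window and you see clipped segmentations, you could raise this value at the cost of speed (256 below is purely illustrative, not a recommendation):

cfg = {
  "segmentor": {
    # Larger analysis window: less risk of clipping big cells,
    # but slower training and inference.
    "instance_crop_size": 256
  }
}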

Augmentation

Yes, you can turn augmentation on or off as you see fit.
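
For instance, a minimal parser with the random flips removed might look like this (an untested sketch based on the notebook code above; the fixed padding length of 512 is kept from that code):

def parser_no_aug(data):
    # Same as the notebook's parser, minus the random flips: the
    # precomputed centroids in data['centroids'] are used as-is.
    image = data['image']
    label = data['label']
    locations = data['centroids']

    # Pad to a fixed length so all dataset elements share one shape.
    n_pad = 512 - tf.shape(locations)[0]
    locations = tf.pad(locations, [[0, n_pad], [0, 0]], constant_values=-1)

    return (
        dict(image=image, gt_locations=locations),
        dict(gt_labels=label),
    )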

"locations", or cell centroids, ie, data["centroids"], were computed from the user provided labels. Implementation is in the function lacss.data.dataset_from_img_mask_pairs

You can easily compute these values yourself if you opt to write your own data pipeline code from scratch.
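
A minimal sketch of that computation in plain NumPy (the function name is just for illustration):

import numpy as np

def centroids_from_label(label):
    # label: (H, W) integer array; 0 is background, 1..N are instance ids.
    ids = np.unique(label)
    ids = ids[ids > 0]
    # Mean (row, col) coordinate of each instance's pixels.
    return np.asarray(
        [np.argwhere(label == i).mean(axis=0) for i in ids],
        dtype=np.float32,
    )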

ajinkya-kulkarni commented 1 year ago

Hi, thanks for the reply. Is there any chance a minimal training notebook could be made available for training with images and masks? Unfortunately, I was not able to turn off the augmentation part.

jiyuuchc commented 1 year ago

By "mask" I assume you mean instance-level segmentation label, right?

For this type of training, the best example code is the "supervised training" demo notebook linked at the front page.

Alternatively, you can look at the code in /experiments/livecell/supervised.py for setting up training using coco-style labeling.

Also I can be of more help if you provide more specific info about your problem: What is your error message? Can you share your current code and results?

ajinkya-kulkarni commented 1 year ago

Ah thanks, I will look at the /experiments/livecell/supervised.py script and see if I can translate that into my own work. I will also post a minimal example next week, so that the errors are clear. Thanks again!

github-actions[bot] commented 10 months ago

This issue is stale because it has been open for 60 days with no activity.

github-actions[bot] commented 9 months ago

This issue was closed because it has been inactive for 30 days since being marked as stale.