Open tischi opened 2 years ago
nope :( happy to put together figures/schematics though if you have any drafts you'd like me to work on?
In my talks I usually show below two slides.
Then in the first slide I tell the audience that they should imagine that it starts raining, and where the rain cannot decide into which valley to flow, that's the watershed line.
Does that inspire you? :-)
I don't get the water allegory. Nucleus 2 will fill before nucleus 1? I would say that where the water from nucleus 2 starts mixing with the water from nucleus 1, that is the ridge.
Without any filling! Just imagine rain falling from the top. Into which valley would it flow?!
But I think your version gives the same result...
something like this could work? @manerotoni @tischi
I like it @stemarcotti
It would be great to see an animation of how the basins are defined for a real object. I will look into whether I can create something in Matlab.
I meant this idea: https://en.wikipedia.org/wiki/Watershed_(image_processing)#Watershed_by_the_drop_of_water_principle
To my understanding those are simply local maxima (local ridges in 2D). For this explanation you do not have to wait for any basin to fill: as soon as it starts to rain, you can identify the places that divide the landscape according to which basin the water will flow into.
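To make the drop-of-water idea concrete, here is a small illustrative sketch (the 1D landscape, the `drain` helper, and its values are made up for illustration, not taken from any plugin): each position drains by steepest descent into a local minimum, and the watershed lies where neighbouring positions drain into different basins — no filling required.

```python
def drain(landscape, i):
    """Follow steepest descent from position i to the index of a local minimum."""
    while True:
        best = i
        for j in (i - 1, i + 1):
            if 0 <= j < len(landscape) and landscape[j] < landscape[best]:
                best = j
        if best == i:          # nowhere lower to go: we reached a basin's minimum
            return i
        i = best

# Two valleys with minima at indices 2 and 8, separated by a peak at index 5.
landscape = [5, 3, 1, 2, 4, 6, 4, 2, 0, 3]
basins = [drain(landscape, i) for i in range(len(landscape))]
# Positions 0-5 drain into the minimum at 2, positions 6-9 into the minimum at 8;
# the watershed sits between indices 5 and 6.
```

The watershed can be read off immediately from where `basins` changes value, which is exactly the "rain falling from the top" picture.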
Indeed there also is this idea: https://en.wikipedia.org/wiki/Watershed_(image_processing)#Watershed_by_flooding If we use that I think we should draw examples into the image where the waters indeed meet.
You do put up a wall at a certain point and this is the ridge. Where to put the wall is the whole point of watershed.
Cell pose training data set (not sure if it is complete) https://www.cellpose.org/dataset
I put this together in ppt format, please let me know if you'd like anything changed! :)
Looks beautiful! As mentioned before, I am not sure the picture on the very right is correct. To me it suggests that one can actively put a watershed somewhere, which I don't think is the case. I think watershed algorithms just automatically find watershed that are there anyway. What do you think?
hmmm I think the algorithm (not the user) does actively build the water barrier somewhere, hence why it makes sense to me.. However, I do see your point, as where to build the barrier will depend on the chosen algorithm, and ultimately on the original shape of the overlapping blobs. Would you have any idea for a representation of the "wall" that would match your idea better?
I can't think of a better idea than what I have drawn above:
The watershed consists of the locations that decide into which basin the rain flows. This also appears to be the common-language definition:
okay, let me think of something, I'll get back to you! :)
@dlegland The output of a watershed in MLJ is a label mask image with boundaries. Would you say this is typical?
@volkerH what do you get in skimage?
Actually we followed the convention from Matlab, which considers the boundary between pixels, and this was rather convenient. We added the option "without dams" in MLJ, but I think this may induce ambiguities in the choice of the basins.
I may have to check, but maybe the original definition of the watershed is the set of pixels between the basins (the "crest").
@tischi, I can only refer to the excellent documentation which includes an example.
According to the above, skimage's watershed returns a label image by default; if you pass the optional watershed_line=True argument, the labels will be separated by a 1-pixel wide line with label 0 (background).
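For reference, a minimal sketch of that behaviour (the two-disk test image and marker positions are made up for illustration): skimage.segmentation.watershed returns a label image, and watershed_line=True additionally inserts a background-labelled line between the basins.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed

# Two touching disks as a toy binary mask.
mask = np.zeros((40, 80), dtype=bool)
yy, xx = np.ogrid[:40, :80]
mask |= (yy - 20) ** 2 + (xx - 25) ** 2 <= 15 ** 2
mask |= (yy - 20) ** 2 + (xx - 55) ** 2 <= 15 ** 2

# Invert the distance map so each disk centre becomes a basin,
# and seed one marker per disk.
dist = ndi.distance_transform_edt(mask)
markers = np.zeros(mask.shape, dtype=int)
markers[20, 25] = 1
markers[20, 55] = 2

labels = watershed(-dist, markers, mask=mask)                           # plain label image
labels_line = watershed(-dist, markers, mask=mask, watershed_line=True)  # with 0-labelled line
```

With watershed_line=True the pixels along the split between the two disks keep label 0, so the two objects end up separated by a one-pixel background line, matching the MLJ "with dams" convention discussed above.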
When I see the diagrams above, it is very close to the watershed in earth sciences, i.e. where you have rain from above filling up the basins.
For marker-controlled watershed I have a different mental picture, in which you have a surface that contains holes drilled at the markers. You now submerge that surface into a tub of water and the basins start to fill from the holes. You could imagine a dye attached to each hole that colours the water flowing in through that hole. Wherever two different colours mix, you create a watershed boundary.
For me, the rain-from-above filling of the watersheds is not very intuitive when you control with markers, as it is not obvious where you start building the walls (i.e. you would have to know where the walls have to be built, so that the rain goes down the correct side, before the water level reaches that point).
@dlegland I get a somewhat disturbing result here (probably some sort of "numerical instability"?)
I get this
yes, strange...
On which image did you apply the watershed? When working with distance maps, you usually need to invert (or complement) the distance map.
There is a slide here: https://f1000research.com/slides/9-1378 (slide 44)
Here is the macro:
/*
* Shape watershed (with distance transform) in Fiji
*
* Requirements:
* - IJPB-Plugins update site
*/
run("Close All");
setOption("BlackBackground", true);
open("https://github.com/NEUBIAS/training-resources/raw/master/image_data/xy_8bit__touching_objects_same_intensity.tif");
rename("input");
// create mask
run("Duplicate...", "title=mask");
setThreshold(83, 255);
run("Convert to Mask");
run("Chamfer Distance Map", "distances=[Chessknight (5,7,11)] output=[16 bits] normalize");
rename("dist");
run("Invert");
// watershed without mask
run("Classic Watershed", "input=dist mask=None use min=0 max=255");
// watershed mask
run("Classic Watershed", "input=dist mask=mask use min=0 max=255");
So, yes, I do invert the distance map... @manerotoni what did you exactly do?
I just applied the Distance Transform Watershed (Plugins > MorphoLibJ > Binary > Distance Transform Watershed), not step by step. Ideally it should do the same.
It depends on the example. I now placed two circles and it worked as expected. If you have no mask it just gives the separation boundary. The question is why in one case it finds 3 objects and in the other case it finds 2 objects.
@tischi's script gives
@tischi you can get an idea of how many basins will be created by checking the number of regional minima (Plugins -> MorphoLibJ -> Minima and Maxima -> Regional Min & Max). If you choose connectivity 8, this corresponds to the "use diagonal" option of the watershed.
Here I get three minima -> three basins. Without the mask, the basins extend outside of the original binary region, and then you have to use a binary combination with the mask.
As I understand @dlegland, the choice of distance metric matters.
Using
run("Chamfer Distance Map", "distances=[Chessboard (1,1)] output=[16 bits] normalize");
Gives a more intuitive result.
> The question is why in one case it finds 3 objects and in the other case it finds 2 objects.
It completely depends on the minima; however, they are not easy to identify by eye (the values are very similar...).
One "trick" to solve this is to apply smoothing -> this will tend to merge neighbor minima.
Another way is to compute so-called "extended minima", using a tolerance on the value between two minima. And this is exactly what is implemented in the "Morphological Segmentation" plugin!
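The smoothing trick can be illustrated on a toy 1D profile (the numbers and the `count_strict_minima` helper are made up for illustration): a small mean filter, standing in for a Gaussian blur, merges the two shallow neighbouring minima while the deep one survives.

```python
import numpy as np

def count_strict_minima(a):
    """Count interior positions that are strictly lower than both neighbours."""
    return int(np.sum((a[1:-1] < a[:-2]) & (a[1:-1] < a[2:])))

# Two shallow minima close together (indices 1 and 3) plus one deep minimum (index 7).
profile = np.array([3.0, 1.0, 3.0, 1.2, 3.0, 5.0, 3.0, 0.0, 3.0, 4.0])

# Simple 3-point mean filter with edge padding, as a stand-in for a Gaussian blur.
padded = np.pad(profile, 1, mode="edge")
smoothed = (padded[:-2] + padded[1:-1] + padded[2:]) / 3.0
# The two shallow minima merge into one; the watershed would now find 2 basins, not 3.
```

Extended minima achieve a similar merging, but via a tolerance on depth instead of blurring the values.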
> As I understand @dlegland, the choice of distance metric matters.
Yes, because the metric changes the values of the distance map. But I have had little feedback about this...
@tischi now after this short discussion I think that the concept in the distance transform module should be changed.
The distance transform requires a definition of a metric, connectivity, etc. and this can affect the results of subsequent steps.
> The distance transform requires a definition of a metric, connectivity, etc. and this can affect the results of subsequent steps.
For the metric, definitely yes! And I agree the choice of the metric will impact the following steps.
For the connectivity, I think it is not related to the distance definition. Connectivity is necessary when we want to define ("connected"!) regions, such as in connected component labeling, minima or maxima detection, or the watershed (because basins are connected)...
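A quick way to see why the metric matters (a toy sketch using scipy's distance transforms; the 7x7 test mask is made up for illustration): the same mask yields different distance values under the Euclidean, chessboard, and taxicab metrics, which shifts the minima the watershed floods from.

```python
import numpy as np
from scipy import ndimage as ndi

# One background pixel in the corner; distances are measured to it.
mask = np.ones((7, 7), dtype=bool)
mask[0, 0] = False

edt = ndi.distance_transform_edt(mask)                        # Euclidean
chess = ndi.distance_transform_cdt(mask, metric="chessboard")  # like MLJ Chessboard (1,1)
taxi = ndi.distance_transform_cdt(mask, metric="taxicab")      # city-block
```

At pixel (3, 4), for example, the three metrics already disagree: 5.0 (Euclidean, a 3-4-5 triangle), 4 (chessboard, max of the offsets) and 7 (taxicab, sum of the offsets). Chamfer weights such as Borgefors (3,4) or Chessknight (5,7,11) are integer approximations in between.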
Blurring the distance transform fixes the issue:
The teaching module becomes more complex by adding the blur, but I think maybe that's just life. Maybe it is actually "typical practice" to smooth a distance map before using it as an input to a watershed? At least, I usually do this...
@tischi why don't you use a different metric? Chessboard gives the expected result.
> Maybe it is actually "typical practice" to smooth a distance map before using it as an input to a watershed? At least, I usually do this...
I also believe it is common practice to smooth distance maps before the watershed. Part of the expert knowledge that is never written down explicitly...
> why don't you use a different metric? Chessboard gives the expected result.
I think there is a kind of compromise between the regularity of the transform and the quality of the approximation of the boundary. With the chessboard metric on the above example, the line is quite polygonal...
thanks everyone for a very useful recap of the watershed!! @tischi would you find this more convincing? I partially used @VolkerH's idea, but I'm not sure I got it right :)
Are the two images that you are using already part of the teaching material? If not, could you please send me the raw data, then I could add it (even better, you could add them yourself, do you know how)?
👍 I think I know how to add the raw images, I'll work on this
❤️ please follow our current naming convention (all lower case, technical and descriptive information separated by double underscore), e.g. for your data you could:
xy_8bit__few_separate_nuclei.tif
xy_8bit__few_touching_nuclei.tif
https://github.com/NEUBIAS/training-resources/tree/master/image_data
@tischi I updated the figure, the ppt and the images in the watershed branch (pull request: https://github.com/NEUBIAS/training-resources/pull/297)
Thanks a lot! Lovely!
I would compress the line profile a bit and I would remove the "1 Intensity 0" LUT bar, as this is intuitive anyway and the values probably do not in fact go from 0 to 1 if it is an 8-bit image 😉
What about something like this?
In fact, also for the distance transform, you have to invert it before the watershed idea works. Could you add the inverted distance transform into the lower workflow?
To really make the point, one could also add a line profile across the two nuclei in their inverted distance transform, because now, 🧙, there are two basins with a clear watershed!
something like this? I attach here the ppt and png, if you want me to update them on the repo just let me know :)
I think that's great! We can always come back to it and change stuff later. Please add it to the repo!
@stemarcotti I was thinking about what I would tell students going through the figure and I realized that it is not so simple, because, while the basins are well visible, the actual watersheds are not so obvious here (especially between obj1 and obj2):
So I tried with your data where the watershed would be:
open("/Users/tischer/Documents/training-resources/image_data/xy_8bit__few_separate_nuclei.tif");
run("Invert");
run("Gaussian Blur...", "sigma=5");
run("Classic Watershed", "input=xy_8bit__few_separate_nuclei.tif mask=None use min=0 max=255");
run("glasbey_on_dark");
So it is roughly in the middle between obj1 and obj2, but the location is not obvious from just looking at it, because it is so flat between the two objects. I wonder whether we should use a simpler example for the explanation figure?
If the nuclei were closer together, it would be more obvious where the watershed roughly would be.
Also note that one needs to smooth the image, otherwise one gets this:
So maybe a smoother image with nuclei that are closer together would be a simpler example for beginners?
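The over-segmentation from an unsmoothed distance map can be reproduced in a few lines (a sketch with skimage/scipy; the noisy two-disk image stands in for the nuclei and is made up for illustration): noise in the distance map creates many spurious basins, while a Gaussian blur before the watershed removes them.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed

rng = np.random.default_rng(0)

# Two overlapping disks standing in for touching nuclei.
mask = np.zeros((40, 60), dtype=bool)
yy, xx = np.ogrid[:40, :60]
mask |= (yy - 20) ** 2 + (xx - 20) ** 2 <= 12 ** 2
mask |= (yy - 20) ** 2 + (xx - 40) ** 2 <= 12 ** 2

# A noisy distance map has many spurious local maxima -> many basins.
dist = ndi.distance_transform_edt(mask) + rng.normal(0.0, 0.5, mask.shape)

raw = watershed(-dist, mask=mask)                                # over-segmented
smooth = watershed(-ndi.gaussian_filter(dist, 2.0), mask=mask)   # far fewer basins

n_raw = len(np.unique(raw[mask]))       # number of basins without smoothing
n_smooth = len(np.unique(smooth[mask]))  # number of basins after smoothing
```

Without markers, skimage floods from all local minima of the input, which is why every noise-induced minimum becomes its own basin; blurring first plays the same role as the Gaussian Blur step in the macro above.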
I made a module draft: https://neubias.github.io/training-resources/watershed/index.html
@tischi I see your point, but I go through a slightly different thought process. The first question to me is "are the items well separated? If it rains and the basins fill up, do I obtain clearly separated wells?" If yes, end of story, I don't need to get into the watershed; if not, then I have to figure out how to divide the basins, so I use the watershed and define the ridge between the items.
Regarding the watershed applied on the image, these are the commands I would run (I need to blur the image to start with though)
run("Gaussian Blur...", "sigma=2");
run("Convert to Mask");
run("Distance Transform Watershed", "distances=[Borgefors (3,4)] output=[32 bits] normalize dynamic=1 connectivity=8");
This said, if you want me to run the example on a different image and re-make the figure there's no problem! Are there any good examples that could be used in the [image_data] folder?
Ah, now I see.... So your point is that in the first part of the figure we don't need a watershed?
yes :)
I am not sure I 100% agree with the statement below (or at least "the basins fill up" may need to be defined more clearly):
> If it rains and the basins fill up, do I obtain clearly separated wells?
Have a look at the Activity here: https://neubias.github.io/training-resources/watershed/index.html
There are three clear basins (aka local minima), but you cannot segment the objects with a single threshold => a watershed algorithm may be needed.
hmmm, it still makes sense to me: the basin on the right fills up and there's only one and it's well separated (= no worries, no need for a watershed); the two on the left fill up but it's unclear how to separate them (= need a watershed). Anyway, it might be just semantics! If you want me to use this example in the figure that's fine :)
MorpholibJ: Classical and seeded watershed.
Seeded watershed example: nuclei with cell boundaries (where are the CellPose training data?). Can we use data from Magdalena? (@maulakhan) CellCognition datasets... (@manerotoni)
@stemarcotti do you have teaching material for watershed?