-
As we have support for different backends, it would probably be a good idea to have unit tests to verify that we achieve the same results using the three different backends: pytorch, numpy, and tensor…
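A minimal sketch of what such a cross-backend test could look like. The two functions below are hypothetical stand-ins (a vectorized numpy version vs. a plain per-channel reference); in the real suite they would be the library's pytorch/numpy implementations of the same operation:

```python
import numpy as np

# Hypothetical stand-ins for the backend implementations under test; in the
# real suite these would be the library's pytorch / numpy versions of one op.
def standardize_numpy(img):
    """Per-channel zero-mean / unit-std, vectorized numpy backend."""
    mean = img.mean(axis=(0, 1))
    std = img.std(axis=(0, 1)) + 1e-8
    return (img - mean) / std

def standardize_reference(img):
    """Same op written channel by channel, used as the reference result."""
    out = np.empty_like(img, dtype=np.float64)
    for c in range(img.shape[2]):
        chan = img[:, :, c]
        out[:, :, c] = (chan - chan.mean()) / (chan.std() + 1e-8)
    return out

BACKENDS = {"numpy": standardize_numpy, "reference": standardize_reference}

def test_backends_agree():
    rng = np.random.default_rng(0)
    img = rng.random((16, 16, 3))
    expected = standardize_reference(img)
    for name, fn in BACKENDS.items():
        np.testing.assert_allclose(fn(img), expected, atol=1e-10, err_msg=name)
```

The same pattern extends to a parametrized pytest once the other backends are plugged into the `BACKENDS` dict.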
-
We could use [Vahadane](https://staintools.readthedocs.io/en/latest/_modules/staintools/normalization/vahadane.html).
Or [Reinhard](https://staintools.readthedocs.io/en/latest/_modules/staintools/…
-
We currently have a dependency on [spams](http://spams-devel.gforge.inria.fr/) which we only use in one place (Vahadane stain normalization). We should remove this dependency if possible because it ca…
-
Hi @ncoudray,

I am working on using the Vahadane normalization and I don't seem to understand where and how you have chosen the reference image. I read the paper and didn't find it there either. Was t…
-
We could replace the spams lasso with a small NN-based mapping to the concentrations. That should speed up the augmentor. What do you think @Peter554?
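A pure-numpy sketch of the idea. Everything here is illustrative: the stain matrix, the network size, and the training target (a clipped least-squares solution standing in for the lasso output) are assumptions, not values from this library:

```python
import numpy as np

# Illustrative sketch only: the stain matrix below, the network size, and the
# training target (clipped least squares standing in for the real lasso
# output) are all assumptions, not values from this library.
D = np.array([[0.65, 0.70, 0.29],   # example haematoxylin OD direction
              [0.07, 0.99, 0.11]])  # example eosin OD direction
D /= np.linalg.norm(D, axis=1, keepdims=True)

def concentrations_pinv(od):
    """Cheap approximation of the lasso step: least squares, clipped at 0."""
    return np.maximum(od @ np.linalg.pinv(D), 0.0)

# Train a tiny one-hidden-layer network to reproduce that mapping; a real
# replacement would train against the spams lasso output instead.
rng = np.random.default_rng(0)
od_train = np.abs(rng.normal(size=(2048, 3)))
target = concentrations_pinv(od_train)

W1 = rng.normal(scale=0.1, size=(3, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, 2)); b2 = np.zeros(2)
lr = 0.01
for _ in range(300):
    h = np.maximum(od_train @ W1 + b1, 0.0)      # ReLU hidden layer
    pred = h @ W2 + b2
    g = 2.0 * (pred - target) / len(od_train)    # dLoss/dpred (MSE)
    gW2, gb2 = h.T @ g, g.sum(axis=0)
    gh = (g @ W2.T) * (h > 0)
    gW1, gb1 = od_train.T @ gh, gh.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

def concentrations_nn(od):
    """Learned stand-in for the lasso: one matmul + ReLU + matmul."""
    return np.maximum(od @ W1 + b1, 0.0) @ W2 + b2
```

At inference time the mapping is just two matrix multiplications, so it batches well over many pixels, which is where the speedup over per-image lasso solves would come from.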
-
2020-03-18 10:41:48,544 - INFO - /S-190413-00241/vHnE/VHNE_S-190413-00241_R001.tif OpenSlide does not support the requested file. Import this from openslide rather than from openslide.lowlevel. …
-
I read the paper and found that we need to scale the stain concentrations in VahadaneNormalizer.
But maybe it isn't implemented in this library, right?
I think we should fix the `fit` and `transfer` meth…
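For reference, the scaling step in the Vahadane/Macenko family of methods is usually a per-stain percentile match on the concentration maps. A minimal sketch, assuming flattened concentration arrays of shape (pixels, stains); the function name and the 99th-percentile choice are assumptions, not this library's API:

```python
import numpy as np

def scale_concentrations(c_source, c_target, percentile=99):
    """Rescale source stain concentrations so their high percentile matches
    the target's, per stain channel, before reconstructing the image."""
    max_src = np.percentile(c_source, percentile, axis=0)
    max_tgt = np.percentile(c_target, percentile, axis=0)
    return c_source * (max_tgt / max_src)
```

A robust percentile is used rather than the maximum so that a few outlier pixels do not dominate the scale.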
-
Which of the stain normalization algorithms is the fastest, besides the methods implemented here?
Using Vahadane's method costs about 1.4 s for a 512x512 image. However, I need to run a stain n…
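For what it's worth, Reinhard-style normalization is typically the fastest of the classical methods, since it is only per-channel mean/std matching (done in LAB space in the original paper) with no per-pixel optimization. A simplified RGB-space sketch, just to show the linear-time cost profile; the target statistics are assumed inputs:

```python
import numpy as np

def reinhard_like(source, target_mean, target_std):
    """Match per-channel mean/std of `source` to given target statistics.
    Simplified: works directly in RGB; the original Reinhard method does
    the same matching in LAB space."""
    src = source.astype(np.float64)
    mean = src.mean(axis=(0, 1))
    std = src.std(axis=(0, 1)) + 1e-8
    out = (src - mean) / std * target_std + target_mean
    return np.clip(np.round(out), 0, 255).astype(np.uint8)
```

Because this is a handful of vectorized passes over the image, it runs in milliseconds at 512x512, versus seconds for dictionary-learning methods like Vahadane.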
-
Had a few issues while using this library. They are as follows:
* Installation issue (spams error)
* Running stain normalization on a batch of 15000 images resulted in the Negative optical density …
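On the negative optical density point: negative ODs appear when a pixel is brighter than the assumed background intensity, and a common guard is to clip the input before taking the log. A minimal sketch; the helper name and the `background=255` default are assumptions, not this library's API:

```python
import numpy as np

def to_optical_density(rgb, background=255.0):
    """Convert RGB to optical density, clipping to [1, background] so the
    log never produces negative or infinite values."""
    rgb = np.clip(np.asarray(rgb, dtype=np.float64), 1.0, background)
    return -np.log(rgb / background)
```

Pixels at or above the background intensity map to OD 0, and pure black maps to a finite maximum instead of infinity.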
-
Hello @Peter554,
I get a `Floating point exception (core dumped)` when I'm using `transform(img)` from `staintools.StainNormalizer(method='vahadane')` with `stain_normalizer.fit(i1_standard)` previou…
EKami updated 5 years ago