Closed by thompson318, 4 years ago
In GitLab by @ThomasDowrick on Nov 21, 2018, 15:39
With the disclaimer that I only have a beginner's knowledge of TensorFlow, I have a few thoughts:
Logistically, if there is someone available and willing to start working on a new library, then I'm happy for them to start trying. Between the three of us, there is probably enough to do in the meantime polishing/completing the existing list of packages before we start on new ones.
In GitLab by @MattClarkson on Nov 21, 2018, 16:17
I'll ask Bongjin, and see if there is a need, if not, we can park it for now.
@BongjinKoo
In GitLab by @BongjinKoo on Nov 22, 2018, 09:25
All the issues raised above are valid. I am worried that this might become just another NiftyNet: an (unnecessary, in my view) wrapper around TensorFlow (TF) that not many people use. More so given that, for our purposes, we usually find a state-of-the-art network implementation (from the authors, TensorFlow, or elsewhere) and extend/modify it. So I think we can just let users integrate their own TF code into SNAPPY, as Tom said. At least for now.
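The "let users integrate their own TF code" approach could be sketched as a thin interface on the SNAPPY side that any trained model wraps itself behind. This is purely illustrative: `Segmentor`, `predict`, and `ThresholdSegmentor` are hypothetical names, not part of any SNAPPY or TensorFlow API, and the stand-in implementation avoids a TF dependency.

```python
from abc import ABC, abstractmethod


class Segmentor(ABC):
    """Hypothetical interface a trained model must satisfy to plug into
    application code. A real implementation would wrap a TensorFlow
    saved model loaded from disk."""

    @abstractmethod
    def predict(self, image):
        """Return a per-pixel label for the input image."""


class ThresholdSegmentor(Segmentor):
    """Stand-in implementation for the sketch: labels each intensity
    value by comparing it against a fixed threshold."""

    def __init__(self, threshold):
        self.threshold = threshold

    def predict(self, image):
        # 'image' is just a flat list of intensities in this sketch.
        return [1 if v >= self.threshold else 0 for v in image]
```

The point of the indirection is that SNAPPY code only ever sees `Segmentor`, so whether the model underneath is Mask R-CNN, a NiftyNet network, or something hand-rolled is the user's concern, not the library's.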
In GitLab by @MattClarkson on Nov 22, 2018, 11:22
unassigned @StephenThompson
In GitLab by @MattClarkson on Nov 22, 2018, 11:22
OK. Nice to have had the discussion. Leave it for now. I'll close; we can re-open if necessary.
In GitLab by @MattClarkson on Nov 22, 2018, 11:22
closed
In GitLab by @MattClarkson on Nov 21, 2018, 14:51
@StephenThompson @ThomasDowrick - this ticket is to discuss how to integrate learned models into SNAPPY.
So, in my simple view of the world, you'd have:
So, a question to SNAPPY developers: shall we get someone to lead this? In practice we have enough to do. Also, the mechanics of training, i.e. some harness to run over a large dataset, could be quite involved. It could also be quite specific to each dataset, so I'm not sure how scalable it would be. But in the first instance, Bongjin has trained Mask R-CNN on Eli's old dataset, so we have an exemplar to start with.