Closed · gmadevs closed this issue 3 years ago
Hi Gma, could you please provide a bit more information? E.g. could you confirm that you work with fetal brain MRI? If yes, what do the (automatic) brain segmentations look like? If the segmentations look acceptable and the stacks are of reasonable quality, you should get good reconstructions.
In general, if there is no substantial motion, 3 approximately orthogonal stacks are enough to get decent results (assuming your acquisition protocol is similar to what I have seen previously). More data would help, but 3 stacks are typically okay. See here for a comparison of the reconstruction outcome for different input data scenarios. A further example is shown in the supplementary material.
I work with fetal MRI, acquired with a slice thickness of 3 mm and a 10% gap. The automatic brain segmentation looks good, but for a lot of patients (approximately 50%) I'm not able to get a segmentation at all and the process is killed. Could this be related to my PC setup? I'm running a virtual machine with 24 GB RAM. If so, do you have any experience with cloud computing GPU services (such as Amazon's)?
Killed processes typically indicate that the VM doesn't have enough RAM allocated. If you already have 24 GB RAM allocated to it in the VM settings > System > Motherboard, then it's a little strange, though. Have you tried using the Docker image for comparison?
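On Linux, a process that simply reports "Killed" is usually terminated by the kernel's out-of-memory (OOM) killer. As a quick sanity check inside the VM (a sketch, assuming a standard Linux guest), you can verify how much RAM the guest actually sees and whether the OOM killer fired:

```shell
# How much RAM does the guest actually see? Should report roughly 24 GB
# if the VM memory setting took effect.
awk '/MemTotal/ {printf "MemTotal: %.1f GB\n", $2/1048576}' /proc/meminfo

# Did the kernel kill a process for running out of memory?
# (dmesg may require root on some distributions)
dmesg 2>/dev/null | grep -iE 'out of memory|oom-killer' || echo "no OOM events logged"
```

If the first command reports far less than 24 GB, the VM allocation didn't take effect; if the second shows OOM entries timed around your failed runs, more memory (or swap) is the fix.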
Alternatively, for specific support relating to the brain segmentation, please reach out to the MONAIfbs team; NiftyMIC only integrates their tool.
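For reference, a Docker run avoids the VM's fixed memory allocation entirely (on Linux the container can use all host RAM unless capped). A minimal sketch, in which the image name, mount paths, and CLI flags are assumptions drawn from the NiftyMIC README, so double-check there for your version:

```shell
# Hypothetical invocation -- verify image name and flags against the
# NiftyMIC README before use. --memory sets an explicit cap, analogous
# to the VM's RAM setting; omit it to let the container use all host RAM.
docker run -it --rm \
  --memory=24g \
  -v /path/to/fetal_data:/app/data \
  renbem/niftymic \
  niftymic_run_reconstruction_pipeline \
  --filenames /app/data/axial.nii.gz /app/data/coronal.nii.gz /app/data/sagittal.nii.gz \
  --dir-output /app/data/recon
```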
I'm using your tool with great interest via the virtual machine distribution. I'm currently having trouble reconstructing a good image, since a lot of my results are far from optimal. I'm now using 3 single stacks (one axial, one coronal, and one sagittal). I was wondering whether it is recommended to use more than one stack per orientation, or whether the tool performs better given only 3 stacks. Also, how does this impact the performance on my machine? I'm running on a 32 GB i7 computer.
Thank you very much!
Gma