Closed — neilknowscomputers closed this issue 3 months ago
So I was able to get this to run after upping the machine resources to
It would be great to know what was needed (or if there is a way to configure how much memory is used).
@neilknowscomputers That’s great! Our inference process only uses one GPU, and a 12 GB GPU should be enough to run the example case. In my experience, an NVIDIA RTX 3080 Ti works for most cases I’ve dealt with. You can also run the example on a Google Colab T4 GPU (there is a Colab example under doc in our repo that you can take a look at). The exact memory requirement, however, depends on your task (tile size and area coverage). Glad to hear that it is working on your end. Let me know if you have any further questions in this regard, or we can close this issue.
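As a quick sanity check against the 12 GB figure above, you can query how much memory your GPU actually has before launching. This is a minimal sketch, assuming a Linux machine with the NVIDIA driver installed; the fallback branch is just for machines without one:

```shell
# Report each GPU's name and total memory (in MiB) via the driver's query interface
if command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi --query-gpu=name,memory.total --format=csv
else
    echo "nvidia-smi not found: no NVIDIA driver on this machine"
fi
```

The same `--query-gpu` interface accepts `memory.used`, which is handy for watching headroom while the example runs.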
I’ll close this issue. Feel free to reopen it if you have any other questions.
While running `example.sh` I received a segmentation fault error. This was the output:
Is this due to a lack of memory? Stack size? I'm running on a `g3.4xlarge`, which has 122 GB of CPU memory and 8 GB of GPU memory. Do I need more? I tried setting `ulimit -s unlimited` and got the same result. Any ideas? 🤔 Thanks for the help ❤️
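For what it's worth, the `ulimit` change only applies to the shell session it is run in, so it's worth confirming it actually took effect before rerunning the script. A quick check, assuming a Linux shell whose hard limit permits raising the soft limit:

```shell
# Show the current soft stack-size limit (in kB, or "unlimited")
ulimit -s

# Raise the soft limit for this shell session only
ulimit -s unlimited

# Confirm the new value; child processes started from this shell,
# such as example.sh, inherit it
ulimit -s
```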