Hi, when using pctrain with a 1.65 GB .laz file as input for training, it starts at Starting resolution, jumps to Init scale ... and after a short time it outputs Killed (see attached screen dump). Are there any limitations on file size? Are there any parameters we can adjust to use large input files with pctrain?
We are running it as a Docker image deployed on a VM server with Red Hat Enterprise Linux 8.9 (Ootpa), 64 GB RAM, Intel(R) Xeon(R) Platinum 8358 CPU @ 2.60GHz (16 cores).
Changing the starting resolution will lower the memory requirements. 1.65 GB is a lot of points. You could also consider splitting the input into multiple files.
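If you go the splitting route, the core of it is just carving the point range into fixed-size chunks that each fit in RAM. Here is a minimal Python sketch of that chunk-boundary arithmetic; `chunk_bounds` is a hypothetical helper, not part of pctrain, and the actual .laz reading/writing (e.g. with laspy or LAStools) is left out:

```python
# Hypothetical helper: compute (start, stop) index ranges so a large
# point cloud can be processed/exported in memory-friendly pieces.
# The real .laz I/O is not shown here.

def chunk_bounds(n_points: int, chunk_size: int):
    """Yield (start, stop) index pairs covering n_points points."""
    for start in range(0, n_points, chunk_size):
        yield start, min(start + chunk_size, n_points)

# Example: 10 million points in chunks of 3 million -> 4 chunks,
# the last one smaller than the rest.
bounds = list(chunk_bounds(10_000_000, 3_000_000))
print(bounds)
```

For spatial data you would more likely split by tiles (e.g. a grid over X/Y) rather than by point index, so each chunk stays geometrically coherent, but the memory reasoning is the same.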