Interesting, I think the safest fix would be to skip inclusion of the VLR record if the ground control point file is too large and emit a warning.
That makes sense. Will it affect the accuracy of the data if we don't use VLRs?
Nope.
Hello again. I tried without the --writers.las.vlrs parameter (I commented it out), and it worked!
We could remove this parameter from the process, or check the size of the GCP file beforehand. I get this error with a 24 KB gcp_list.txt file, so I don't know the exact file size limit for that check. Maybe we can try to calculate it.
Also, I can create a PR depending on the decision.
Best would be to skip inclusion of the VLR record if the ground control point file is too large (more than 65535 bytes according to PDAL docs?).
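Something along these lines in the georeferencing stage, perhaps (a minimal Python sketch; `gcp_vlr_params` and the exact integration point are hypothetical, not ODM's actual code, though the VLR fields match the command ODM already generates):

```python
import json
import logging
import os

# LAS 1.0-1.3 store a VLR's payload length in a 16-bit field, so a
# regular VLR can hold at most 65535 bytes (per the PDAL/LAS docs).
MAX_VLR_SIZE = 65535

def gcp_vlr_params(gcp_geojson_path):
    """Return the --writers.las.vlrs argument for the PDAL call, or an
    empty list (with a warning) when the exported GCP GeoJSON is too
    large to fit in a regular VLR."""
    if os.path.getsize(gcp_geojson_path) > MAX_VLR_SIZE:
        logging.warning(
            "%s is larger than %d bytes; skipping GCP VLR embedding",
            gcp_geojson_path, MAX_VLR_SIZE)
        return []

    vlr = {
        "filename": gcp_geojson_path,
        "user_id": "ODM",
        "record_id": 1,
        "description": "Ground Control Points (GeoJSON)",
    }
    # This form works when the pdal process is spawned without a
    # shell; extra shell quoting is needed otherwise.
    return ["--writers.las.vlrs=%s" % json.dumps(vlr)]
```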
I think the limit does not apply to the size of our file directly. PDAL builds the VLR from the exported ground_control_points.geojson, which is new (and larger) data derived from our file. As I stated, a 24 KB gcp_list.txt already returns the error, so I think we need to determine the effective limit in terms of our file's size.
Hello, I'm having the same problem. The process runs fine without GCPs, but with the GCP file the same message appears. There are 21 GPS points; the drone used is a Phantom 3 Pro with approximately 2,500 images. Could you help me solve this? I'm using WebODM natively. How do I skip this step? Excuse my English, I'm from Brazil. Thanks in advance for the help!
Hello again, I'm still having this unresolved problem. I tried to remove the --writers.las.vlrs parameter and a different failure appeared. In another test with only 5 points in the GCP file, processing succeeded. I created another file with other points and succeeded again with 6 points, but when I create a larger file the error remains.
[INFO] running pdal translate -i "E:\WebODM\resources\app\apps\NodeODM\data\33f214b3-9265-4206-9048-3d7c4f3ee797\odm_filterpoints\point_cloud.ply" -o "E:\WebODM\resources\app\apps\NodeODM\data\33f214b3-9265-4206-9048-3d7c4f3ee797\odm_georeferencing\odm_georeferenced_model.laz" ferry transformation --filters.ferry.dimensions="views => UserData" --filters.transformation.matrix="1 0 0 569132.0 0 1 0 7469006.0 0 0 1 0 0 0 0 1" --writers.las.offset_x=569132.0 --writers.las.offset_y=7469006.0 --writers.las.scale_x=0.001 --writers.las.scale_y=0.001 --writers.las.scale_z=0.001 --writers.las.offset_z=0 --writers.las.a_srs="+proj=utm +zone=22 +south +datum=WGS84 +units=m +no_defs +type=crs" --writers.las.vlrs="{\"filename\": \"E:/WebODM/resources/app/apps/NodeODM/data/33f214b3-9265-4206-9048-3d7c4f3ee797/odm_georeferencing/ground_control_points.geojson\", \"user_id\": \"ODM\", \"record_id\": 1, \"description\": \"Ground Control Points (GeoJSON)\"}"

PDAL: writers.las: Can't write VLR with user ID/record ID = ODM/1. The data size exceeds the maximum supported
Hello, what is the size of your gcp_list.txt file? Can you check it?
Hi my friend, the size of my gcp_list.txt file is 32 KB. I'm making another attempt without the --writers.las.vlrs parameter. I performed a new test without the parameter and it worked, but that test used fewer pictures. I've now left the full run going with all the photos and will report back after it finishes.
32 KB is too large. Can you test it with around 10-15 KB? You would need to delete about half of the points.
Hello everyone, ODM breaks on large GCP files. I used 17 GCP points and got the error, but when I tried without GCPs it worked.
I looked into the PDAL error and found it: regular VLRs only support up to 65535 bytes of data in LAS versions 1.0 through 1.3. Version 1.4 must be used for large GCP files.
PDAL writes LAS version 1.2 by default.
ODM could be switched to version 1.4 after some tests.
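If that pans out, the change could be as small as adding one writer option where ODM builds the PDAL command (a hypothetical sketch; `params` stands in for however ODM collects the writers.las arguments, and this assumes PDAL writes oversized VLRs as extended VLRs when producing LAS 1.4 output):

```python
# `params` stands in for wherever ODM collects the writers.las options
# (hypothetical; not ODM's actual variable name).
params = []

# Ask PDAL for LAS 1.4 output instead of the default LAS 1.2. LAS 1.4
# introduces extended VLRs (EVLRs) with a 64-bit payload length, so a
# large GCP GeoJSON would no longer hit the 65535-byte VLR limit --
# assuming PDAL stores oversized VLRs as EVLRs for 1.4 output.
params += ["--writers.las.minor_version=4"]
```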