julianzz98 opened 1 year ago
Can you provide more details about the operation that took so long for that particular 850 MB dataset? For example, the dataset type/schema (if public), the database used, whether the dataset was compressed, and the loader's command-line options would all help describe the issue better and show whether it is related to the data.
In the past I had similar issues and created a (partial) workaround (in #1454) for issue #1451. Did you use the flag -skipReferenceCheck=true when loading these 850 MB?
If this flag was not used in your process, it would be interesting to know whether it makes a notable difference in runtime.
This may also be related to:
Thanks for providing the existing references on the topic. I was planning to provide more details and fill in the issue later today. Will comment here again once I am done!
@stephanr I updated the description of the issue with the needed usage information as well as the dataset. If I repeat the operation, I will make sure to try out the flag -skipReferenceCheck=true!
While using the GmlLoader of the deegree GML tools CLI, I noticed that the GmlLoader gets slower the more features it processes.
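One generic pattern that produces exactly this kind of progressive slowdown is a per-feature reference check that scans all previously loaded features: the work per feature grows with the number already loaded, so total time grows quadratically. The sketch below is a speculative illustration of that pattern, not deegree's actual implementation; function and variable names are invented for the example.

```python
def comparisons_with_linear_scan(n_features: int) -> int:
    """Simulate a loader that checks each new feature against
    every previously loaded feature (linear scan -> O(n^2) total)."""
    loaded = []
    comparisons = 0
    for feature_id in range(n_features):
        comparisons += len(loaded)   # one comparison per already-loaded feature
        loaded.append(feature_id)
    return comparisons

def comparisons_with_hash_lookup(n_features: int) -> int:
    """The same check with a set: one O(1) lookup per feature."""
    loaded = set()
    lookups = 0
    for feature_id in range(n_features):
        lookups += 1                 # a single hash lookup, regardless of size
        loaded.add(feature_id)
    return lookups

print(comparisons_with_linear_scan(10_000))  # 49995000 comparisons
print(comparisons_with_hash_lookup(10_000))  # 10000 lookups
```

If deegree's reference check (the one skipped by -skipReferenceCheck=true) behaves like the first variant, the observed per-feature slowdown would be expected, and skipping it would restore roughly constant per-feature cost.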
The dataset used:
Description of the dataset (German only):
Command using the deegree SqlFeatureStoreConfigCreator:
Command using the deegree GmlLoader:
End of the GmlLoader logs (after it successfully ran):
Observation: The final import of the data takes around 4.5 hours, as seen in the logs.
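For context, those numbers work out to a fairly low throughput. A quick back-of-the-envelope calculation, assuming the full 850 MB is uncompressed input processed over the 4.5 hours:

```python
size_mb = 850      # dataset size reported in this issue
duration_h = 4.5   # import time seen in the GmlLoader logs

mb_per_hour = size_mb / duration_h
kb_per_second = size_mb * 1024 / (duration_h * 3600)

print(f"{mb_per_hour:.1f} MB/h")    # ~188.9 MB/h
print(f"{kb_per_second:.1f} KB/s")  # ~53.7 KB/s
```

Roughly 54 KB/s of raw GML is slow for a database import, which supports the suspicion that per-feature overhead, rather than I/O, dominates the runtime.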