Closed — bdionne closed this issue 2 years ago
This was subsequently tested in 4.0 (the indexing approach on batches, among other things, was modified). The batch load contained 250K terms, loaded into a DB with ~300K concepts. It ran uninterrupted for 31 hrs and completed. A load of this size stretches both the owlbinary-based backend and the full-vocab-in-memory approach. Closing with "wontfix", but keeping it in the project so it remains accessible.
Files associated with this ticket in private data repo:
Some notes: