-
When I just tried to create the latest Wikidata dump HDT, I stumbled over the following triple:
----------- onliner.nt ------------
` .`
----------------------------------
When trying rd…
-
I tried to parse a dump of some wikipedia pages with XmlProvider, but no matter what I try, I get a
`System.OutOfMemoryException`. Is there some guidance/pattern on how to parse large files with typ…
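The usual pattern for this class of problem is to stream the file instead of materializing the whole DOM. The snippet's own stack (F# `XmlProvider`) is truncated here, so this is only a minimal illustration of the streaming idea in Python; the file name `pages.xml` and the `page`/`title` element names are hypothetical placeholders, not taken from the dump format.

```python
# Streaming-parse sketch: process one element at a time and free it,
# so memory use stays bounded regardless of file size.
# Element names ("page", "title") are illustrative assumptions.
import xml.etree.ElementTree as ET

def stream_pages(path):
    # iterparse yields each element as its closing tag is read
    for event, elem in ET.iterparse(path, events=("end",)):
        if elem.tag.endswith("page"):
            yield elem.findtext("title")
            elem.clear()  # drop the element's children to free memory
```

The same event-driven approach exists in most ecosystems (e.g. a pull reader instead of a tree loader), which is typically the way around `OutOfMemoryException` on multi-gigabyte dumps.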
-
Even though the WMF does not make it easy to check for these dump files, it is still possible to check, or at least try.
In my application at https://github.com/wikimedia/analytics-wmde-toolkit-analyz…
-
During processing of a recent Wikidata dump, I noticed that WbGlobeCoordinate throws an ArgumentException if the precision field is null. Here is the relevant portion from an example entity (Berlin TV …
-
Wikidata is adding a "mul" language, to be used for labels, in particular, when many languages have the same label for an item.
I think that this means that using @en@rdfs:label will not work well,…
pfps updated
2 months ago
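The fallback the issue implies can be sketched as: prefer a language-specific label, and only fall back to the new "mul" label when the requested language is absent. This is a hedged illustration, not the tool's actual behavior; the dict shape mirrors Wikidata's JSON labels map, and the function name is invented.

```python
# Label lookup with "mul" fallback (illustrative sketch).
# labels follows Wikidata's JSON shape:
#   {"en": {"language": "en", "value": "..."}, "mul": {...}}
def best_label(labels, lang="en"):
    for code in (lang, "mul"):  # try the requested language, then "mul"
        if code in labels:
            return labels[code]["value"]
    return None
```

A plain `@en`-only lookup would return nothing for items whose label lives only under "mul", which is presumably why the issue flags `@en@rdfs:label` as fragile.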
-
I would like to share some of the dumps that I have created with wdsub. How can I do it?
Maybe creating a folder in some place with external read-access?
By the way, the reason is that I have c…
labra updated
3 years ago
-
Thanks for this great tool!
I would be interested in generating a dump of all Wikidata items, except those which have P31:Q13442814. It's not clear to me whether this is doable yet.
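Whether wdsub supports this filter natively is not shown in the snippet, so here is only a plain-Python sketch of the same idea over a Wikidata JSON dump line: keep an entity unless one of its P31 statements points at Q13442814. The field paths follow Wikidata's entity JSON; the function name is invented.

```python
# Filter sketch: drop entities with P31 = Q13442814 (scholarly article).
import json

def keep_entity(line):
    # dump lines end with "," inside the enclosing JSON array
    entity = json.loads(line.rstrip().rstrip(","))
    for claim in entity.get("claims", {}).get("P31", []):
        value = claim.get("mainsnak", {}).get("datavalue", {}).get("value", {})
        if value.get("id") == "Q13442814":
            return False
    return True
```

Applied line by line over the decompressed dump, this keeps everything except the (very numerous) scholarly-article items.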
-
It could aid with search and rendering.
https://qrank.wmcloud.org/
The deliverable is a .csv.gz file currently at 100 MB.
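Consuming a `.csv.gz` deliverable like the one above needs nothing beyond the standard library. This is a generic sketch; the `Entity`/`QRank` column names are an assumption about the file's header, not confirmed from the source.

```python
# Read a gzipped CSV and return the n highest-ranked rows.
# Column names "Entity" and "QRank" are assumed, not verified.
import csv
import gzip

def top_ranked(path, n=10):
    with gzip.open(path, "rt", encoding="utf-8") as fh:
        rows = list(csv.DictReader(fh))
    rows.sort(key=lambda r: int(r["QRank"]), reverse=True)  # descending rank
    return rows[:n]
```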
-
2024-08-28 14:47:35.804 - INFO: QLever IndexBuilder, compiled on Tue Aug 27 16:54:45 UTC 2024 using git hash ac257c
2024-08-28 14:47:35.804 - INFO: You specified the input format: TTL
2024-08-28 14:…
-
Available here:
https://codeberg.org/elevont/lov-dump
I did this, because for software relying on this, it comes in handy to be able to include it as a git sub-module, instead of having to downloa…