-
At a [hackathon](https://www.wikidata.org/wiki/Wikidata_talk:Scholia/Events/Hackathon_October_2024) this weekend, @WolfgangFahl has set up a local QLever instance based on the current Wikidata dump. I…
-
When downloading the latest Wikidata dumps, we find that the download speed is very slow. After analyzing this aspect in detail and even running parallel downloads, we see that the dumps are downloaded …
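A minimal sketch of one workaround, assuming the standard dumps.wikimedia.org URL and a server that honours HTTP Range requests: resume an interrupted download instead of restarting it, so a slow or dropped connection does not cost the bytes already fetched.

```
# Hedged sketch: resume a partial download of the Wikidata JSON dump via an
# HTTP Range request. URL and chunk size are assumptions, not project settings.
import os
import requests

DUMP_URL = "https://dumps.wikimedia.org/wikidatawiki/entities/latest-all.json.bz2"
LOCAL_FILE = "latest-all.json.bz2"

def resume_download(url: str, path: str, chunk_size: int = 1 << 20) -> None:
    # Start from however many bytes are already on disk.
    start = os.path.getsize(path) if os.path.exists(path) else 0
    headers = {"Range": f"bytes={start}-"} if start else {}
    with requests.get(url, headers=headers, stream=True, timeout=60) as resp:
        resp.raise_for_status()
        # 206 means the server honoured the Range header; otherwise start over.
        mode = "ab" if start and resp.status_code == 206 else "wb"
        with open(path, mode) as out:
            for chunk in resp.iter_content(chunk_size=chunk_size):
                out.write(chunk)

if __name__ == "__main__":
    resume_download(DUMP_URL, LOCAL_FILE)
```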
-
@TallTed
Tim Holzheim has successfully imported Wikidata into a Virtuoso instance; see https://cr.bitplan.com/index.php/Wikidata_import_2024-10-28_Virtuoso and
https://wiki.bitplan.com/index.php/W…
-
Hello @debayan, I am trying to reproduce your result on LC-QuAD 2.0 on a Wikidata dump dated 13 October 2021. However, I'm finding it difficult to set up the Virtuoso endpoint and am facing similar p…
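A quick sanity check once the endpoint is up could look like the sketch below; it assumes Virtuoso's usual default SPARQL endpoint at http://localhost:8890/sparql and uses SPARQLWrapper, neither of which is specific to @debayan's setup.

```
# Hedged sketch: sanity-check a local Virtuoso load by counting triples.
# The endpoint URL is Virtuoso's common default; adjust to your instance.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "http://localhost:8890/sparql"

sparql = SPARQLWrapper(ENDPOINT)
sparql.setReturnFormat(JSON)
sparql.setQuery("SELECT (COUNT(*) AS ?n) WHERE { ?s ?p ?o }")

result = sparql.query().convert()
print("triples loaded:", result["results"]["bindings"][0]["n"]["value"])
```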
-
Have you considered supporting SDC (Structured Data on Commons) yet? It is the Wikibase instance for metadata about images on Wikimedia Commons, and it uses Wikidata items and properties as its vocabula…
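For context, reading the SDC statements of a single file through the Commons API looks roughly like this sketch; the file title is a placeholder, and it assumes wbgetentities accepts a site/title pair on Commons the way it does on Wikidata.

```
# Hedged sketch: fetch the Structured Data on Commons (SDC) statements for one
# file via the Commons API. The file title below is only a placeholder.
import requests

API = "https://commons.wikimedia.org/w/api.php"
TITLE = "File:Example.jpg"  # hypothetical file title

params = {
    "action": "wbgetentities",
    "sites": "commonswiki",
    "titles": TITLE,
    "format": "json",
}
data = requests.get(API, params=params, timeout=30).json()
for entity_id, entity in data.get("entities", {}).items():
    # SDC entities use M-ids; their statements point at Wikidata Q-items/P-properties.
    statements = entity.get("statements") or entity.get("claims") or {}
    for prop, values in statements.items():
        print(entity_id, prop, len(values))
```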
-
It might be useful to know how many taxa have an image change when comparing an older (filtered?) Wikidata JSON dump to a new one. This would give us some idea of the "normal" amount of extra harvesti…
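As a rough illustration of what that comparison could look like, assuming two pre-filtered, taxa-only JSON dumps in the usual one-entity-per-line layout (the file names are placeholders):

```
# Rough sketch: count taxa whose P18 (image) value set differs between two
# pre-filtered Wikidata JSON dumps. Paths are placeholders.
import bz2
import json

def p18_index(dump_path: str) -> dict:
    """Map QID -> set of P18 image filenames for every entity in the dump."""
    index = {}
    with bz2.open(dump_path, "rt", encoding="utf-8") as fh:
        for line in fh:
            line = line.strip().rstrip(",")
            if line in ("[", "]", ""):
                continue
            entity = json.loads(line)
            images = set()
            for claim in entity.get("claims", {}).get("P18", []):
                snak = claim.get("mainsnak", {})
                if snak.get("snaktype") == "value":
                    images.add(snak["datavalue"]["value"])
            index[entity["id"]] = images
    return index

old = p18_index("taxa-old.json.bz2")   # placeholder paths
new = p18_index("taxa-new.json.bz2")
changed = sum(1 for qid in old.keys() & new.keys() if old[qid] != new[qid])
print(f"{changed} taxa with a changed P18 value")
```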
-
The Wikidata dump has become very large, with 1.2 billion statements, which makes the initial loading of the bz2 dump into LMDB particularly slow.
To speed up this step, we could try (a rough sketch of one option follows this list):
- instead of havi…
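One generic lever, sketched below under assumed paths and batch sizes, is to group puts into large write transactions rather than committing per entity; this is not necessarily what the truncated bullet above proposes.

```
# Minimal sketch of a batched bz2 -> LMDB load; key = entity id, value = raw JSON.
# Batch size, map_size, and paths are assumptions, not the project's settings.
import bz2
import json
import lmdb

DUMP = "latest-all.json.bz2"       # placeholder path
DB_PATH = "wikidata.lmdb"          # placeholder path
BATCH = 50_000                     # entities per write transaction

env = lmdb.open(DB_PATH, map_size=1 << 40, writemap=True)  # 1 TiB virtual map
batch = []
with bz2.open(DUMP, "rt", encoding="utf-8") as fh:
    for line in fh:
        line = line.strip().rstrip(",")
        if line in ("[", "]", ""):
            continue
        entity = json.loads(line)
        batch.append((entity["id"].encode(), line.encode()))
        if len(batch) >= BATCH:
            # Commit one large transaction instead of one per entity.
            with env.begin(write=True) as txn:
                for key, value in batch:
                    txn.put(key, value)
            batch.clear()
# Flush the final partial batch.
with env.begin(write=True) as txn:
    for key, value in batch:
        txn.put(key, value)
env.close()
```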
-
Dear author, I tried running the `build_wikidated_v1_0.py` script, but I encountered the following error. Could you help me figure out what is going wrong?
```
$ python build_wikidated_v1_0.py
2023-09…
```
-
Hi,
Thank you for the useful GitHub code.
When I ran the code in preprocess_dump.py to process the latest Wikidata dump (as of April 16) with 28 processes, I got the following error with proces…
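For comparison, a generic way to fan a dump out over a worker pool is sketched below; this is not the repository's preprocess_dump.py, and the path, batch size, and per-entity work are placeholders.

```
# Generic sketch: process a Wikidata JSON dump with a multiprocessing pool,
# one batch of raw lines per task. Not the repository's actual preprocessing.
import bz2
import json
from itertools import islice
from multiprocessing import Pool

DUMP = "latest-all.json.bz2"   # placeholder path
WORKERS = 28
BATCH = 10_000

def handle_batch(lines):
    # Replace this body with the real per-entity preprocessing.
    count = 0
    for line in lines:
        line = line.strip().rstrip(",")
        if line in ("[", "]", ""):
            continue
        json.loads(line)
        count += 1
    return count

def batches(fh, size):
    while True:
        chunk = list(islice(fh, size))
        if not chunk:
            return
        yield chunk

if __name__ == "__main__":
    with bz2.open(DUMP, "rt", encoding="utf-8") as fh, Pool(WORKERS) as pool:
        total = sum(pool.imap_unordered(handle_batch, batches(fh, BATCH)))
    print(f"processed {total} entities")
```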
-
The Wikidata dataset is too large. Could the existing official API meet the reproduction requirements of this work instead?
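For what it's worth, fetching individual entities from the official API looks like the sketch below (Q42 is just an example id; whether API-based access is practical at the scale this work needs is exactly the open question).

```
# Hedged sketch: fetch entities from the official Wikidata API instead of the dump.
import requests

API = "https://www.wikidata.org/w/api.php"

def get_entities(ids):
    """Fetch a batch of entity ids (the API typically caps this around 50 per request)."""
    params = {
        "action": "wbgetentities",
        "ids": "|".join(ids),
        "format": "json",
    }
    return requests.get(API, params=params, timeout=30).json()["entities"]

print(get_entities(["Q42"])["Q42"]["labels"]["en"]["value"])
```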