-
Digital competence must exist among those who draft motions/laws. Today, Riksdag documents are formulated in a way that, when they are later realized, turns into nothing but junk data... there is no vision of foll…
-
The images and vernacular names on OneZoom come from EoL. The harvester that pings the EoL API needs a rewrite, but we may also want to get images and vernacular names from Wikimedia Commons. In https…
-
This is half a note to myself about making this codebase, or at least the CLI portion, work for at least one non-English Wikipedia (Hebrew). Currently the CLI does not crash, but extracts zero co…
-
This issue is a quick follow-up to a discussion about reconciliation with @wetneb, prompted by a closer look here: [#2083](https://github.com/OpenRefine/OpenRefine/issues/2083). Reconciling for external…
-
Sometimes OpenRefine generates Wikibase issues of this form:
> **[male population (P1540)](http://www.wikidata.org/entity/P1540) is missing a [P585](http://www.wikidata.org/entity/P585) qualifier.*…
-
# Archipelago 2020 Roadmap
See also #5
This is our working enumeration of concrete tasks for this year, per Component and Service, open for public evaluation (critiques and comments welcome).
All …
-
It would be great to be able to get entities with a SPARQL query e.g.
```
SELECT ?station WHERE {
  # P954 = Internationale Bahnhofsnummer IBNR
  ?station wdt:P954 ?ibnr.
  # filter to show only …
}
```
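A minimal Python sketch of what consuming such a query could look like, assuming the standard SPARQL 1.1 JSON results format returned by the Wikidata Query Service; the sample response and the helper name `entity_ids` are invented for illustration:

```python
import json

# The query from the issue, minus the truncated filter clause.
QUERY = """
SELECT ?station WHERE {
  # P954 = Internationale Bahnhofsnummer IBNR
  ?station wdt:P954 ?ibnr.
}
"""

def entity_ids(wdqs_json: dict) -> list:
    """Extract Q-ids from a WDQS SPARQL JSON response (hypothetical helper)."""
    return [
        binding["station"]["value"].rsplit("/", 1)[-1]
        for binding in wdqs_json["results"]["bindings"]
    ]

# Hand-written stand-in for a real WDQS response, for illustration only.
sample = json.loads("""
{"results": {"bindings": [
  {"station": {"type": "uri", "value": "http://www.wikidata.org/entity/Q1731"}}
]}}
""")

print(entity_ids(sample))  # ['Q1731']
```

In a real client the query would be POSTed to the WDQS endpoint with `Accept: application/sparql-results+json` and the same extraction applied to the live response.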
-
Hi! I have this error for several entities that I am trying to import from Wikidata:
`wikibaseintegrator.wbi_exceptions.MWApiError: 'The supplied language code was not recognized.'`
Or the followi…
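A possible client-side workaround is to filter out unrecognized language codes before attempting the import. The sketch below assumes a hand-maintained whitelist (a real one could be fetched from the target wiki's `wbcontentlanguages` API module); the helper name `filter_labels` is hypothetical:

```python
# Stand-in set of accepted codes; in practice, query the target Wikibase
# (action=query&meta=wbcontentlanguages) for the authoritative list.
KNOWN_CODES = {"en", "de", "fr", "he"}

def filter_labels(labels: dict) -> tuple:
    """Split a {language_code: label} dict into (accepted, rejected)."""
    accepted = {k: v for k, v in labels.items() if k in KNOWN_CODES}
    rejected = {k: v for k, v in labels.items() if k not in KNOWN_CODES}
    return accepted, rejected

ok, bad = filter_labels({"en": "station", "xx-old": "???"})
print(ok)   # {'en': 'station'}
print(bad)  # {'xx-old': '???'}
```

Logging the `rejected` dict makes it easy to see which entities carried the offending codes instead of failing the whole batch.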
-
Wikidata is a structured database, and it is therefore easier to parse than Wikipedia infoboxes.
Have a look, for example, at the list of entities including a SMILES code:
https://www.wikidata.org/wiki/Specia…
-
[Lingua Libre](https://lingualibre.org/wiki/LinguaLibre:Main_Page) is an online collaborative project and tool by the Wikimedia France association, which aims to build a collaborative, multilingual, …