Random scripts used for doing mass imports to Wikidata. The emphasis is on sourcing claims, whether new or pre-existing. As such, any imported statement follows the decision tree below (any added statement includes the source):
For details on how qualifiers are handled, see `wikidatastuff.wikidata_stuff.WikidataStuff.match_claim()`.

For details on how sources are compared, see `wikidatastuff.reference`.
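As a minimal sketch, the sourced-import behaviour described above amounts to the following (illustrative only: the `Claim` class and `add_sourced_statement` function are invented here for the example, and the authoritative matching logic lives in `WikidataStuff.match_claim()` and `wikidatastuff.reference`):

```python
from dataclasses import dataclass, field


@dataclass
class Claim:
    """Toy stand-in for a Wikidata claim; not the pywikibot class."""
    prop: str
    target: str
    sources: list = field(default_factory=list)


def add_sourced_statement(claims, prop, value, source):
    """Add a statement so that the source is always attached.

    If an equal statement already exists, only the source is added
    (and only if it is not already present); otherwise a new, sourced
    statement is created.
    """
    for claim in claims.get(prop, []):
        if claim.target == value:
            if source not in claim.sources:
                claim.sources.append(source)
            return claim
    new_claim = Claim(prop, value, [source])
    claims.setdefault(prop, []).append(new_claim)
    return new_claim
```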
* `wikidata_stuff.py`:
* `reference.py`: A class representing the source claims.
* `qualifier.py`: A class representing qualifier claims.
* `statement.py`: A class representing a statement (i.e. value, qualifiers and references).
* `wikidata_string_search.py`: A database hookup (to be run from Toolforge) for doing text string searches (SQL `LIKE` style) in labels, aliases and descriptions of items.
* `wdqs_lookup.py`: A module for doing WDQS look-ups and for converting (some) WDQ queries to WDQS queries.
* `preview_item.py`: Allows for the visualisation of a prepared new/updated Wikidata item candidate. An item candidate consists of a dict of labels/aliases (per language code), a dict of descriptions (per language code), a dict of `Statement`s (per P-prefixed property-id), an optional `Reference` (used whenever one is not included in a `Statement`) and the `ItemPage` to which the information should be written (if not a new item). The visualisation takes the form of a wikitext table (example).

For usage examples see lokal-profil/wikidata_batches. Note that these may use older versions of this repo.
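The pieces of such an item candidate can be sketched as plain Python data (the dict layout and example values below are assumptions for illustration, not the exact `preview_item.py` interface):

```python
# Illustrative shape of an item candidate; the keys and example values are
# assumptions for illustration, not the exact preview_item.py interface.
item_candidate = {
    # labels/aliases per language code
    'labels': {'en': ['Mona Lisa', 'La Gioconda']},
    # descriptions per language code
    'descriptions': {'en': 'painting by Leonardo da Vinci'},
    # one Statement (value, qualifiers, references) per P-prefixed property-id
    'statements': {'P31': 'Q3305213'},  # instance of: painting
    # optional fallback Reference, used when a Statement carries none itself
    'reference': 'hypothetical default source',
    # the target ItemPage; None when a new item should be created
    'item_page': None,
}
```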
To run as a different user to your standard pywikibot user, simply place a modified `user-config.py` file in the top directory.

To use a different user for a particular mass import, place the `user-config.py` in that sub-directory and run the script with `-dir:<sub-directory>`.
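As a sketch, the layout for running one batch under a separate account could look like this (all paths and file names here are hypothetical):

```shell
# Hypothetical layout:
#
#   user-config.py              <- default pywikibot account
#   my_batch/user-config.py     <- alternate account for this import
#   my_batch/import_script.py   <- the mass-import script
#
# Run the import with the sub-directory's credentials:
python my_batch/import_script.py -dir:my_batch
```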
Deprecated functions, classes and arguments may be dropped at any major version change. To display the deprecation warnings, run your script using the pywikibot `-debug` flag (you must be making use of `pywikibot.handleArgs()`) or add the `-Wd` option to your python call.
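For example (the script name below is hypothetical):

```shell
# Show deprecation warnings through pywikibot's own flag
# (requires the script to call pywikibot.handleArgs()):
python my_import_script.py -debug

# Or let Python itself surface DeprecationWarnings:
python -Wd my_import_script.py
```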