The write function is a wrapper around the wbeditentity API call in Wikibase. As such, it is not possible to tweak it to split the query into manageable parts.
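For context, a `wbeditentity` edit is essentially one POST request carrying the complete item as a single JSON blob. Here is a minimal sketch (the endpoint URL, session and token handling are simplified placeholders, not the library's actual code):

```python
import json
import requests

# Sketch only: one wbeditentity call submits the whole item as a single JSON blob.
# Endpoint, session and token handling here are illustrative assumptions.
API_URL = "https://my.wikibase.example/w/api.php"

def edit_entity(session: requests.Session, item_id: str, item_json: dict, csrf_token: str) -> dict:
    """Submit an entire item (all statements at once) in a single API call."""
    response = session.post(API_URL, data={
        "action": "wbeditentity",
        "id": item_id,                  # e.g. "Q42"
        "data": json.dumps(item_json),  # the complete item as one JSON blob
        "token": csrf_token,
        "format": "json",
    })
    return response.json()
```

This is why the payload size scales with the whole item: everything goes through that single `data` parameter.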
There are some hacks that could apply here, though.
Initially, I was actually building bots this way, i.e. statement by statement, qualifier by qualifier, reference by reference. Not long after I started doing so, I got pushback for the large number of API calls needed to edit a single item. It was in that interaction that I learned about the wbeditentity call, which submits a whole item in a single upload, in contrast to one call per statement. In your case, the 2000+ quantities would lead to 2000+ API calls, and if those included references and qualifiers, even more.
Long story short, cutting big queries into smaller ones can lead to a huge increase in API load.
I also don't know whether the allowed size still acts as a threshold when the edit is split across multiple API calls. If I am not mistaken, a Wikibase item is a JSON blob stored in a relational database. If the limit applies to the full size of that JSON blob, your only options are to increase the limit or to restructure the semantic model (i.e. split the item into multiple items).
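If you still want to try the chunking route despite the extra API load, a rough sketch with wikidataintegrator could look like the following (item ID, property, credentials and chunk size are placeholders; I'm assuming `append_value` is passed so each batch adds to the existing statements instead of replacing them):

```python
from wikidataintegrator import wdi_core, wdi_login

# Placeholders: adapt to your own Wikibase instance, item and property.
MEDIAWIKI_API_URL = "https://my.wikibase.example/w/api.php"
ITEM_ID = "Q1234"
PROP_NR = "P42"
CHUNK_SIZE = 200  # arbitrary; pick something that stays under the server's size limit

login = wdi_login.WDLogin(user="BotUser", pwd="secret",
                          mediawiki_api_url=MEDIAWIKI_API_URL)

values = list(range(2000))  # stand-in for the 2000+ quantity values

for start in range(0, len(values), CHUNK_SIZE):
    chunk = values[start:start + CHUNK_SIZE]
    statements = [wdi_core.WDQuantity(value=v, prop_nr=PROP_NR) for v in chunk]

    # append_value keeps the statements already on the item, so each chunk
    # only adds its own batch of quantities instead of overwriting the rest.
    item = wdi_core.WDItemEngine(wd_item_id=ITEM_ID,
                                 data=statements,
                                 append_value=[PROP_NR],
                                 mediawiki_api_url=MEDIAWIKI_API_URL)
    item.write(login)  # one wbeditentity call per chunk: more calls overall
```

Whether this actually avoids the error depends on where the limit is enforced: if it applies to the full stored JSON blob rather than to the request payload, chunking won't help.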
Thanks @andrawaag for this explanation and these suggestions.
I use `wikidataintegrator` 0.8.7.

Trying to add 2000 `WDQuantity` statements (same property, only the value differs) to an existing item, I run into a problem on `item.write()`: I understand that the query hits my Wikibase limits. I could increase the maximum allowed size on the server side, but that won't solve the problem if the amount of data to send keeps growing...

I wonder whether the `wikidataintegrator` library provides a way to deal with big insertion queries like this, e.g. iterating over the data and cutting big queries into smaller ones to ensure they stay within Wikibase's limits?
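For reference, the failing pattern is roughly the following; a minimal sketch with placeholder item, property and credentials, not my exact code:

```python
from wikidataintegrator import wdi_core, wdi_login

# Placeholders for my own Wikibase instance.
MEDIAWIKI_API_URL = "https://my.wikibase.example/w/api.php"
login = wdi_login.WDLogin(user="BotUser", pwd="secret",
                          mediawiki_api_url=MEDIAWIKI_API_URL)

# ~2000 quantity statements on the same property, only the value differs.
statements = [wdi_core.WDQuantity(value=v, prop_nr="P42") for v in range(2000)]

item = wdi_core.WDItemEngine(wd_item_id="Q1234",
                             data=statements,
                             mediawiki_api_url=MEDIAWIKI_API_URL)
item.write(login)  # fails: the single wbeditentity payload exceeds the server's size limit
```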