Closed: shashank-iitbhu closed this issue 5 months ago
This can be reproduced by running:
python3 src/scribe_data/extract_transform/wikidata/update_data.py '["German"]' '["nouns", "verbs"]'
@andrewtavis Are you able to reproduce this issue?
If so, I can open a PR with the proposed changes.
I can confirm on my end, @shashank-iitbhu:
json.decoder.JSONDecodeError: Invalid control character at: line 320797 column 115 (char 6713171)
Two questions to decide on this: do we switch to `LIMIT` and `OFFSET`, and would that even work? I'm a bit worried that it might not, as we'd need to compute the full result in order to then offset for the ones we want.
All in all great that you figured this out and suggested solutions! As you can see by the verbs queries, this is not the first time that this has happened 😅
Confirmed from the Wikidata team that splitting based on nouns and proper nouns would be the initial path forward, but `LIMIT` and `OFFSET` could work if it continues to be problematic :)
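For illustration, a minimal sketch of what that split could look like — one request per lexical category — using a simplified, hypothetical lexeme query (Scribe's real queries select far more forms) and the `SPARQLWrapper` client that the `sparql.query().convert()` call implies is in use:

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Hypothetical, simplified lexeme query; {category} is filled in per request.
# Q188 = German; Q1084 = noun; Q147276 = proper noun.
QUERY_TEMPLATE = """
SELECT ?lexeme ?lemma WHERE {{
  ?lexeme dct:language wd:Q188 ;
          wikibase:lexicalCategory wd:{category} ;
          wikibase:lemma ?lemma .
}}
"""

sparql = SPARQLWrapper("https://query.wikidata.org/sparql")
sparql.setReturnFormat(JSON)

bindings = []
for category in ("Q1084", "Q147276"):  # nouns, then proper nouns
    sparql.setQuery(QUERY_TEMPLATE.format(category=category))
    bindings += sparql.query().convert()["results"]["bindings"]
```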
Note that I just tried to query only the singulars of the nouns alone (not the proper nouns), and it's still failing. At this point it might make sense to use `LIMIT` and `OFFSET`.
CC @daveads, do you want to write in here so I can assign this issue to you?
yup @andrewtavis
Lots of commits above, and after a discussion with a Wikidata admin today I was able to get it working with changes to the query itself. This issue has been great for Scribe, as it really unveiled a lot of parts of the queries that weren't necessary and were slowing things down. A note on this as well: if a query stops, it may be worth removing the labeling service if it's been used, as there is a lot of overhead to run it over the results of a large query.
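For anyone hitting this later, the labeling service being referred to is the standard WDQS clause below; a sketch of it as it would appear in a query string (removing it avoids per-row label resolution on large result sets):

```python
# Standard Wikidata Query Service labeling service clause. The endpoint
# resolves labels for every result row, which adds a lot of overhead on
# large queries — dropping it (when labels aren't needed) can help a
# query that would otherwise time out.
LABEL_SERVICE_CLAUSE = (
    'SERVICE wikibase:label '
    '{ bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }'
)
```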
With the above being said, there have been lots of improvements, and I'm super grateful to @shashank-iitbhu for opening this and to @daveads for all the conversations that led us to these solutions! 😊 Thanks so much!
Terms
Behavior
Description
Query builder Link
The query time limit is being reached, which is why `results = sparql.query().convert()` in `update_data.py` throws a `json.decoder.JSONDecodeError` (`Invalid control character at: line 320797 column 115 (char 6713171)`): `sparql.query().response` contains the endpoint's timeout error logs rather than valid JSON.
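To make the failure mode concrete: on a timeout the endpoint returns its error logs as the response body, so JSON parsing fails inside `convert()`. A minimal sketch of catching this, assuming the `SPARQLWrapper` setup implied above (hypothetical query and error handling, not Scribe's actual code):

```python
import json

from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://query.wikidata.org/sparql")
sparql.setReturnFormat(JSON)
# Hypothetical stand-in for the real German noun/verb query.
sparql.setQuery("SELECT ?lexeme WHERE { ?lexeme dct:language wd:Q188 . } LIMIT 10")

try:
    results = sparql.query().convert()
except json.decoder.JSONDecodeError as err:
    # The response body held the endpoint's timeout logs, not JSON.
    raise RuntimeError("Query likely timed out; response was not valid JSON.") from err
```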
Suggested Changes
The `Query timeout limit reached` error occurs because the total number of nouns and proper nouns for German is 165869 (verified here). Use `LIMIT` and `OFFSET` to split the request into multiple queries.
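For reference, a minimal sketch of the suggested `LIMIT`/`OFFSET` pagination, assuming the same simplified, hypothetical query and `SPARQLWrapper` client as above (the page size is an arbitrary placeholder):

```python
from SPARQLWrapper import SPARQLWrapper, JSON

PAGE_SIZE = 50000  # hypothetical; tune so each page finishes within the timeout

# ORDER BY makes the paging deterministic; without it, OFFSET over an
# unordered result can skip or repeat rows between requests.
QUERY_TEMPLATE = """
SELECT ?lexeme ?lemma WHERE {{
  ?lexeme dct:language wd:Q188 ;
          wikibase:lexicalCategory wd:Q1084 ;
          wikibase:lemma ?lemma .
}}
ORDER BY ?lexeme
LIMIT {limit}
OFFSET {offset}
"""

sparql = SPARQLWrapper("https://query.wikidata.org/sparql")
sparql.setReturnFormat(JSON)

bindings, offset = [], 0
while True:
    sparql.setQuery(QUERY_TEMPLATE.format(limit=PAGE_SIZE, offset=offset))
    page = sparql.query().convert()["results"]["bindings"]
    bindings += page
    if len(page) < PAGE_SIZE:  # a short page means the last page was reached
        break
    offset += PAGE_SIZE
```

Note that the concern raised earlier in the thread still applies: the endpoint has to evaluate (and here order) the full result before applying each `OFFSET`, so paging bounds the response size but not necessarily the server-side work per request.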