Open brett--anderson opened 1 year ago
I tried running again after upgrading KGTK to 1.5.2. I used exactly the same command, except this time I set --procs to 1.
I had several KGTK processes, one holding 96% or so of the memory and occasionally flaring up to 100%. Then, rather than the silence of before, it actually threw an error:
2850000 lines processed by processor 0
Traceback (most recent call last):
File "/home/ubuntu/anaconda3/envs/kgtk-env/lib/python3.9/site-packages/kgtk/cli/import_wikidata.py", line 2636, in run
pp.add_task(line)
File "/home/ubuntu/anaconda3/envs/kgtk-env/lib/python3.9/site-packages/pyrallel/parallel_processor.py", line 383, in add_task
self._add_task(self.batch_data)
File "/home/ubuntu/anaconda3/envs/kgtk-env/lib/python3.9/site-packages/pyrallel/parallel_processor.py", line 388, in _add_task
self.mapper_queues[0].put((ParallelProcessor.CMD_DATA, batched_args))
File "/home/ubuntu/anaconda3/envs/kgtk-env/lib/python3.9/site-packages/pyrallel/queue.py", line 671, in put
raise ValueError("DEADLOCK IMMANENT: qid=%d src_pid=%d: total_chunks=%d > maxsize=%d" % (self.qid, src_pid, total_chunks, self.maxsize))
ValueError: DEADLOCK IMMANENT: qid=3 src_pid=3146: total_chunks=4 > maxsize=3
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/ubuntu/anaconda3/envs/kgtk-env/lib/python3.9/site-packages/kgtk/exceptions.py", line 70, in __call__
return_code = func(*args, **kwargs) or 0
File "/home/ubuntu/anaconda3/envs/kgtk-env/lib/python3.9/site-packages/kgtk/cli/import_wikidata.py", line 2732, in run
raise KGTKException(str(e))
kgtk.exceptions.KGTKException: DEADLOCK IMMANENT: qid=3 src_pid=3146: total_chunks=4 > maxsize=3
DEADLOCK IMMANENT: qid=3 src_pid=3146: total_chunks=4 > maxsize=3
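For what it's worth, the pyrallel error above fires when a producer is about to enqueue more chunks than the bounded inter-process queue can hold, which happens when the consumer workers have stopped draining it. A minimal stdlib sketch of the same pattern (this is not KGTK/pyrallel code; the `put_or_fail` guard is a hypothetical stand-in for pyrallel's check):

```python
# Sketch of the failure pattern behind "DEADLOCK IMMANENT": a producer
# keeps enqueuing on a bounded queue that nothing is draining. pyrallel
# raises ValueError instead of blocking forever; here the stdlib
# queue.Full exception plays the same role.
import queue

MAXSIZE = 3  # analogous to maxsize=3 in the traceback

q = queue.Queue(maxsize=MAXSIZE)

def put_or_fail(q, item):
    """Refuse to enqueue when the queue is already full, instead of
    blocking forever on a queue no consumer will ever drain."""
    try:
        q.put(item, block=False)
    except queue.Full:
        raise ValueError(
            "DEADLOCK IMMINENT: pending items %d >= maxsize %d"
            % (q.qsize(), MAXSIZE)
        )

for chunk in range(MAXSIZE):
    put_or_fail(q, chunk)   # fills the queue; no consumer is running

try:
    put_or_fail(q, MAXSIZE) # the 4th chunk trips the guard, as in the log
except ValueError as e:
    print(e)
```

In the real run, the workers holding the consumer end had died (see the zombie processes below), so the parent's queue filled up and the guard fired.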
I tried again with two processes. Same result as my first post. I did notice that the two processes spawned to do the work eventually became zombies, from the top command:
3263 ubuntu 20 0 0 0 0 Z 0.0 0.0 19:19.40 kgtk
3264 ubuntu 20 0 0 0 0 Z 0.0 0.0 29:54.03 kgtk
Hi Brett, while I am trying to see who can help with this, we made the imported data for Wikidata 2022-11 available here: https://kgtk.isi.edu/#/data Perhaps that helps you, for now, to proceed with your work?
Hi! Thanks for that link, I'll try using the pre-processed version and that should get me unstuck for now. Thanks!
Hello everyone!
I have a similar problem. After about one hour of execution, my Mac shut down unexpectedly. This happens when the maximum number of processors is used. If, instead, I keep this number at 6, I see a problem similar to what was reported.
My question is the following: I am taking a look at the Wikidata KGTK files provided by @filievski. Where can I find the node.tsv, edge.tsv, and qualifier.tsv files? I can see many files, and this is a bit unintuitive.
I am coming back to this issue.
After several tries, I launched the import on a cluster at my institution, with 32 cores and 250 GB of RAM. I ran the same command as in this issue, but with --procs set to 32.
At some point, the program stops producing output. The first time, I set a time limit of one day; the job got stuck before that time. On the second try, I set a time limit of one week. As in the first trial, the program got stuck. After it had been stuck for one day, the node in the cluster crashed, probably because it ran out of memory.
Could you kindly explain how one can import this Wikidata dump successfully? Please also reply to my previous comment.
Hi @filievski, the link you provided is no longer available. It was available one month ago, and now everything has been deleted.
Hi @tommasocarraro, unfortunately, I am no longer at ISI (I left 14 months ago), and I am afraid that the KGTK codebase is no longer maintained. This also means that the website is now offline.
If this helps you, here is a dump from Wikidata from 2022-11-02, which I think is the same version that was previously on our website: https://drive.google.com/drive/folders/1F2u5RAFJuPCEZ7gFRuMVnig42qi-Pi2j?usp=drive_link
I'm trying to import the full Wikidata dump, possibly just the English attributes, into the KGTK format for further analysis. The process runs for a few hours. I can see from the terminal that some of the processes have got up to 1.4 million lines processed (I am not sure out of how many). While it runs, I watch the system resources and see several kgtk processes using most of the machine's memory between them. The number of kgtk processes drops over time. Now there are only two kgtk processes, neither using even 1% of memory, and with no CPU activity. It seems to have effectively stopped, yet the process still displays the last "lines processed" output. So it seems it's still running, but it has ceased to do anything. It has been in this state for at least an hour.
To Reproduce
- Installed KGTK under Python 3.9.15 in a local conda env
- Downloaded a zip of Wikidata, ~70 GB compressed, within the last 12 months
- Activated the conda env
- Ran the command:
Expected behavior
The process to continue running, using system resources in a way that indicates it's doing something, until all the Wikidata has been converted to the TSV format, or some useful error is thrown.
Additional context
Not sure if the problem is caused by memory leaks or a deadlock issue 🤷‍♂️. After manually killing the process (Ctrl+C), the output throws an error, three times:
Followed by this:
/home/ubuntu/anaconda3/envs/kgtk-env/lib/python3.9/multiprocessing/resource_tracker.py:216: UserWarning: resource_tracker: There appear to be 76 leaked shared_memory objects to clean up at shutdown
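That resource_tracker warning means shared-memory segments were created but never unlinked before shutdown. For reference, this is the stdlib cleanup discipline that avoids the warning (generic `multiprocessing.shared_memory` usage, not KGTK's internals):

```python
# Create, use, and properly tear down a shared-memory segment so the
# resource_tracker has nothing left to complain about at shutdown.
from multiprocessing import shared_memory

shm = shared_memory.SharedMemory(create=True, size=1024)
try:
    shm.buf[:5] = b"hello"        # a producer writes into the segment
    data = bytes(shm.buf[:5])     # a consumer reads it back
    print(data)
finally:
    shm.close()    # every process detaches its own mapping...
    shm.unlink()   # ...and exactly one process destroys the segment
```

When worker processes are killed or crash before reaching the `unlink()` step, the parent's resource tracker reports the orphaned segments, as in the warning above.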