Open greeksharifa opened 1 year ago
Hi, @greeksharifa did you find a solution for this ?
No, I didn't.
My solution was to switch from JSON to pickle; it reduced the size from 29 MB to 12 MB for 4 images. I think that is still big, but it is due to the large number of objects stored. Try playing with maskrcnn_benchmark/engine/inference.py and reducing the bounding boxes or deleting structures you don't need.
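A minimal sketch of the JSON-to-pickle swap. The dictionary below is a synthetic stand-in for the real `custom_prediction.json` contents (per-image boxes and scores are assumptions about its shape); the point is only that pickle's binary float encoding is much more compact than JSON's decimal text:

```python
import json
import pickle
import random

# Synthetic stand-in for custom_prediction.json: per-image bounding
# boxes and scores. Floats dominate the file size in practice.
predictions = {
    f"img_{i}": {
        "bbox": [[random.random() * 100 for _ in range(4)] for _ in range(80)],
        "scores": [random.random() for _ in range(80)],
    }
    for i in range(4)
}

# JSON writes each float as ~17-19 characters of text;
# pickle stores each as a 9-byte binary opcode.
json_bytes = json.dumps(predictions).encode()
pickle_bytes = pickle.dumps(predictions, protocol=pickle.HIGHEST_PROTOCOL)

print(f"json: {len(json_bytes)} bytes, pickle: {len(pickle_bytes)} bytes")
```

To write the real predictions, replace `json.dump(...)` in your save path with `pickle.dump(predictions, f, protocol=pickle.HIGHEST_PROTOCOL)` and open the file in binary mode (`"wb"`).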
consider setting `MODEL.ROI_HEADS.DETECTIONS_PER_IMG` to a smaller value in your command
(the default is 80 in the `configs/e2e_relation_X_101_32_8_FPN_1x.yaml` file)
it will considerably shrink the size of the output `custom_prediction.json` file
but be aware that the size of `custom_prediction.json` depends on the number of images in your `DETECTED_SGG_DIR` directory, so you might need to move some of your images into another directory and generate scene graphs for them with a separate command!
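A sketch of the override, assuming the usual maskrcnn_benchmark convention of passing trailing `KEY VALUE` pairs on the command line to override config values; the paths are placeholders for your own setup, and 20 is just an illustrative value:

```shell
# Sketch only: cap detections per image at 20 instead of the default 80.
# Keep whatever other flags and overrides your usual command already has.
python tools/relation_test_net.py \
    --config-file configs/e2e_relation_X_101_32_8_FPN_1x.yaml \
    MODEL.ROI_HEADS.DETECTIONS_PER_IMG 20
```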
❓ Questions and Help
While running on a custom dataset, I got this error:
It died with <Signals.SIGKILL: 9>.
So I ran
tools/relation_test_net.py
on only 10 images. It executed successfully, but the output file is too big: 6 MB per image...
What's the problem? Should I edit some config files? If so, where or what?
Thanks.