FYI this also happens when trying to import the Switchboard corpus.
Just to double check something, this happened when running the non-dockerized version?
@james-tanner ok, so I've updated the Neo4j version in iscan-spade-server to the latest version, which had some performance improvements that may be related to this error. Do you think you could try running the import again on Oka and see if the same thing happens? Be sure to run the reset_database script after pulling the new changes from iscan-spade-server.
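Roughly, the update steps would be something like the following (a sketch only; the exact name and invocation of the reset_database script may differ in your checkout):

```bash
# Pull the latest iscan-spade-server changes, which include the Neo4j update
cd iscan-spade-server
git pull
# Reset the database so the import starts from a clean state; adjust the
# script name/extension to match the reset_database script in the repo
./reset_database.sh
```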
Ok, I think I've figured out a solution. You don't need to test Oka; it's still an issue even with the updated Neo4j. I'm revising some Cypher statements in a way that seems to get around the issue.
@james-tanner This issue should be resolved now with the latest version of PolyglotDB. For iscan-spade-server, update polyglotdb via `pip install -r requirements.txt -U` and it should fetch the newest version. Double-check whether it's fixed when you try Switchboard.
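Concretely, something like this from the iscan-spade-server checkout (a sketch; adjust paths and any environment activation to your setup):

```bash
cd iscan-spade-server
git pull
# -U upgrades already-installed packages (including polyglotdb) to the
# newest versions allowed by requirements.txt
pip install -r requirements.txt -U
```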
Still getting the same error on Oka after following these instructions for both SOTC and Switchboard.
This is after pulling the latest changes to the repo and updating with `pip install -r requirements.txt -U`.
Did you restart the Celery instance after updating? If not, try restarting it and then running the import again.
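For reference, assuming the worker is run by hand, a restart would look roughly like this (the app module name below is a placeholder, not the exact iscan-spade-server invocation):

```bash
# Stop the running worker (Ctrl+C or kill the process), then start it again
# so it picks up the freshly upgraded polyglotdb
celery -A <your_app_module> worker --loglevel=info
```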
@mmcauliffe Just tried this and still fails for both Switchboard and SOTC.
This now works on a non-Docker machine with 8 GB of memory. @MichaelGoodale is this still an issue on your machine, or can this be closed?
@MichaelGoodale? Pinging him on Slack too.
Whoops, didn't see this notification, I guess. I haven't tried re-importing on my laptop yet, but I'll try today and see if it has an effect. I only have 4 GB of RAM on it, though, so if it doesn't work, I don't know if it's that big a deal.
So, when I tried to import the Spade-ICE-Can corpus, I got an out-of-memory error when I had about half a gig of RAM left.