Closed · pmelchior closed this 3 years ago

I've come across a weird runtime error. When I'm executing this command multiple times in quick succession, it will almost always fail:

However, the polygon number where it fails is different every time. Is this because the file gets locked during opening, and I'm executing two effectively concurrent reads?

I noticed the problem in a jupyter notebook, where I have to wait for, say, 1 minute between these calls. I tested it with a plain python interpreter, and the problem exists there too, but I have to trigger the calls much faster.
Just curious, is this on a local disk or remote (gpfs/nfs/etc.)? But I don't know...
Local disk. I have the suspicion that it comes from files where the polygon numbering/labeling is screwed up. I've been using this file.
You can see that the polygon ids jump from 19 to 159. The total number of polygons is correct, but I have also found examples where even that isn't right.
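For reference, this is a quick sketch of how to check the numbering (it assumes the standard mangle polygon-format lines of the form "polygon 12 ( 4 caps, ..."):

import sys

# list the polygon ids in a mangle .ply file;
# assumes lines of the form "polygon 12 ( 4 caps, ...":
ids = []
with open('round17-poly_tidy.ply') as fp:
    for line in fp:
        if line.startswith('polygon'):
            ids.append(int(line.split()[1]))
print(len(ids), min(ids), max(ids))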
We should put in more fine-grained error messages to see where this is coming from, e.g. during opening or reading.
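At the Python level the idea would be something like this (just a sketch of the kind of message granularity I mean, not actual pymangle code; the exception raised on a failed read may differ):

import os
import pymangle

def load_mask(fname):
    # distinguish "cannot open" from "cannot read/parse"
    if not os.access(fname, os.R_OK):
        raise IOError('cannot open %s' % fname)
    try:
        return pymangle.Mangle(fname)
    except Exception as err:
        raise RuntimeError('error reading %s: %s' % (fname, err))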
That would help, but this one has a runtime aspect: it works initially, then not. It's worse in jupyter than in plain python...
I'm not seeing it on my local disk.
import pymangle
f = 'round17-poly_tidy.ply'
for i in range(1000):
    p = pymangle.Mangle(f)
I also don't see any failures on gpfs.
are you py3 or py2?
py3
I believe that Liz Buckley-Geer was seeing this on her Mac (but possibly not on an SL6 system). I'm not sure if she is on GitHub to @-mention her.
Mac for me, too. I'm satisfied (sort of) if this is exclusively a Mac issue. It would be nice to have it fixed, though.
Yes, Liz told me about it also.
I don't have a Mac.
what happens if you insert an artificial delay between reads?
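Something like this, reusing the loop from above (the delay length is just a guess; tune as needed):

import time
import pymangle

f = 'round17-poly_tidy.ply'
for i in range(1000):
    p = pymangle.Mangle(f)
    time.sleep(0.1)  # artificial delay between reads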
After some time, I can read the file just fine. On the console, this time is of order 1 s; in jupyter it's quite a bit longer, maybe 15-30 s. It also grows with larger files (more polygons or more edges per polygon, I'm not sure which).
Actually, in jupyter I can't seem to load larger mangle files at all, while the same files work on the console.
I did fix a minor bug with closing the file in the PR for this issue, #16. I doubt it will matter, but can you give it a try when you get a chance?
nope, still doesn't work
I managed to work around it on my Mac by putting the read in a try/except block and repeating the read, up to 10 times, if I get the error. But that only worked for smaller masks: when I had one of 1550 pixels, it failed to read even after 10 tries. So I have given up and moved to one of our linux cluster machines, where it works without a problem. I think it is definitely a Mac problem of some sort.
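The workaround looks roughly like this (a sketch; I catch a broad Exception because I don't recall the exact exception type pymangle raises on a failed read):

import time
import pymangle

def read_mask(fname, tries=10, wait=1.0):
    # retry the read, waiting a bit between attempts
    for attempt in range(tries):
        try:
            return pymangle.Mangle(fname)
        except Exception:
            # the exact exception type raised on failure may differ
            time.sleep(wait)
    raise IOError('failed to read %s after %d tries' % (fname, tries))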
I wonder if a simple retry of the fopen call, after some wait period, would do it.
Can you point me to the code line, please?
I just had a failure on our Linux machine.
So I definitely see failures on Linux as well as on my Mac.
need to reopen this...
Hi, just wanted to say that I have been having this issue too. I've been using pymangle to open .ply files, which I then use in a for loop to create a large number of randoms for a mock astronomy catalogue (in a jupyter notebook). I'm encountering this issue at random points in the for loop. I think I have managed to work around / avoid it, but yeah, it can be a bit annoying if it kills the run halfway through!
Are you using the latest master? I thought we put in a fix for this.
Actually, just noticed: I'm using the current version on pip, which is maybe one behind? Is there any chance of getting the pip version up to the latest? I'm running this on an HPC, so it's managed by their admins.
Yes, I'll bring pip up to date. Would it be possible to try out master to confirm it works for you before I cut the new version?
I'm happy to try this :) but excuse my inexperience, how would I go about pointing a jupyter notebook to the 0.9.2 version if I download it from this GitHub?
I updated the PyPI version to 0.9.2. Please try upgrading with pip.
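Something like this should do it; the second command also answers the earlier question about installing straight from GitHub (repository address assumed here):

# upgrade to the new release from PyPI
pip install --upgrade pymangle

# or install the development version directly from GitHub
# (assuming the repository address):
pip install git+https://github.com/esheldon/pymangle.git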
Thanks, works perfectly now!
great, I'm going to close. Thanks for the report.