Closed boriskaus closed 1 year ago
It seems like this PR contains the changes from #17 as well. I will thus hold off on a review until #17 is merged into `main` and `main` is then merged into this PR. Feel free to ping me once it's ready for review 🙂
@sloede: It's a bit mysterious why the last CI run failed (it seems to have nothing to do with the package). Anyway, I have now merged `main`, so it is ready for you to have a look at.
I've implemented it all, and it seems to run fine. I see that you only test on Julia 1.6; perhaps it would be an idea to also test a more recent Julia version there (I know it should be backwards compatible, but you never know).
Also, the version number should probably be bumped. With this commit, I have all the functionality we need from ReadVTK (unless bugs are found later).
Ah, there seems to be an error with at least one of the tests...
Hmm, I had the same error a while ago. It started when I introduced `local x, y, z` in `structuredgrid.jl`, but it seems to trigger an error in a completely different test. I have no idea why that would be.
So this error sometimes occurs in CI and sometimes not, even when we clean up the files altogether. I don't like this at all...
Can you try fixing it? It seems to me like a very specific error, not related to floating point truncation.
I have no clue where to start. It seems that the files written to disk sometimes have a different size. Not having a local machine that reproduces this error does not help...
Note that the main branch is also failing now with what feels to be a fairly similar problem.
@sloede: While CI is fine now, the error remains weird. It occurs when reading compressed data back from a file here:
```julia
raw = data_array.vtk_file.appended_data
HeaderType = header_type(data_array.vtk_file)
if is_compressed(data_array)
    # If data is stored compressed, the first four integers of type
    # `HeaderType` form the header, and the fourth value contains the
    # number of bytes to read
    first = data_array.offset + 1
    last = data_array.offset + 4 * sizeof(HeaderType)
    header = Int.(reinterpret(HeaderType, raw[first:last]))
    n_bytes = header[4]
    first = data_array.offset + 4 * sizeof(HeaderType) + 1
    last = first + n_bytes - 1
    uncompressed = transcode(ZlibDecompressor, raw[first:last])
else
    ...
end
```
In some cases, the length of the `raw` vector is shorter than `last`, as indicated in the header. The CI failure on the `main` branch seems to be caused by the same issue.
In my last commit, I simply added an error message for when this occurs, but that doesn't cure the cause of it.
Are there perhaps changes in the upstream packages that could cause this?
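For context, the data layout the snippet above assumes can be sketched in Python (a hedged illustration, not ReadVTK code; the four-value header layout follows the comment in the snippet, and `zlib` stands in for `ZlibDecompressor`):

```python
# Sketch of a zlib-compressed VTK appended-data block: a header of four
# UInt64 values [n_blocks, block_size, last_block_size, compressed_size]
# followed by the compressed payload. Shows the length check that exposes
# a truncated buffer.
import struct
import zlib

HEADER_FMT = "<4Q"  # four little-endian UInt64 values
HEADER_SIZE = struct.calcsize(HEADER_FMT)

def write_block(payload: bytes) -> bytes:
    """Compress a single block and prepend the four-value header."""
    compressed = zlib.compress(payload)
    header = struct.pack(HEADER_FMT, 1, len(payload), 0, len(compressed))
    return header + compressed

def read_block(raw: bytes, offset: int = 0) -> bytes:
    """Read one block back; fail loudly if `raw` is shorter than the header claims."""
    *_, n_bytes = struct.unpack_from(HEADER_FMT, raw, offset)
    first = offset + HEADER_SIZE
    last = first + n_bytes
    if last > len(raw):
        # This is the situation observed in CI: the stored buffer is
        # shorter than the byte count promised by its own header.
        raise ValueError(f"truncated appended data: need {last} bytes, have {len(raw)}")
    return zlib.decompress(raw[first:last])

payload = bytes(range(256)) * 8
assert read_block(write_block(payload)) == payload
```

With this layout, a `raw` vector that ends before `last` can only mean the writer and reader disagree about the buffer, e.g. a file truncated during zip/unzip or read before it was fully flushed.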
Thanks for investigating the issue. Do you know why the error now disappeared? And did it only occur in cases where the file in question was generated on-the-fly or also in cases where an existing vtk file is used?
> Are there perhaps changes in the upstream packages that could cause this?
I believe we had a similar issue before - @andrewwinters5000 do you still remember where and when this occurred? I do not remember what we did back then to resolve it, but I recall that we had a similar issue of files being "truncated" by the zip/unzip mechanism.
I also vaguely remember some spurious issues like these. If I remember correctly, they had some stochastic part, i.e., they were not triggered every time - maybe depending on the GitHub runners etc.? Some parts may have been specific to Windows or so...
> Do you know why the error now disappeared?
No; if you look at the various CI runs from yesterday, it sometimes appears and sometimes not (on different systems).
> And did it only occur in cases where the file in question was generated on-the-fly or also in cases where an existing vtk file is used?
I believe it only occurs in cases where the file was generated on-the-fly.
OK. Since the error seems to have disappeared, once the last remaining issues are resolved, we can merge this.
So we now have 1 failed CI run and 2 that passed, with the only changes being the addition of

```julia
@show data_array data_array.vtk_file.xml_file
```

and the removal of

```julia
# Start with a clean environment: remove example file directory if it exists
isdir(TEST_EXAMPLES_DIR) && rm(TEST_EXAMPLES_DIR, recursive=true)
mkpath(TEST_EXAMPLES_DIR)
```
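For reference, the effect of the removed setup can be sketched in Python (the directory name here is hypothetical): it guarantees each run starts from an empty directory, which would rule out stale files from a previous run as the cause of a failure.

```python
# Sketch of the clean-environment idiom from the removed test setup:
# wipe the examples directory if it exists, then recreate it empty.
import shutil
from pathlib import Path

TEST_EXAMPLES_DIR = Path("examples_tmp")  # hypothetical directory name

def clean_test_dir(path: Path) -> None:
    """Remove the directory tree if it exists, then recreate it empty."""
    if path.is_dir():
        shutil.rmtree(path)
    path.mkdir(parents=True, exist_ok=True)

clean_test_dir(TEST_EXAMPLES_DIR)
assert TEST_EXAMPLES_DIR.is_dir() and not any(TEST_EXAMPLES_DIR.iterdir())
```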
It seems to be rather stochastic and I have no idea where to look. I guess we can probably merge (and increment the version number) and see if it keeps occurring.
> It seems to be rather stochastic and I have no idea where to look.
That's also what I remember from similar issues earlier - it's stochastic and when it fails, it's on Windows.
> I guess we can probably merge (and increment the version number) and see if it keeps occurring.
Sounds good to me. Thanks a lot for your contributions!
This PR adds support for `StructuredGrid` (`*.vts`) files as well as their parallel counterparts, `PStructuredGrid` (`*.pvts`).