AcademySoftwareFoundation / OpenImageIO

Reading, writing, and processing images in a wide variety of file formats, using a format-agnostic API, aimed at VFX applications.
https://openimageio.readthedocs.org
Apache License 2.0

Running the testsuite on Windows #2591

Closed nasefbasdf closed 1 year ago

nasefbasdf commented 4 years ago

Hi!

I've compiled OIIO on Windows with MSC 1900 (Visual Studio 2015). I specifically wanted support for FreeType, PNG, JPEG, TIFF, and OpenEXR. Every dependency and OIIO itself compiled successfully, but the final step, oiio-tests, has many issues.

First, when I start the tests, they all fail because they cannot call "ls", the Unix tool for listing a directory; on Windows this obviously cannot work. I found that this call is informational only and can be ignored, so I commented out the line in testsuite/runtest.py.
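A portable workaround, instead of commenting the line out, might be to fall back to Python's own directory listing when "ls" is unavailable. This is just a sketch of the idea, not the actual runtest.py code:

```python
import os
import platform
import subprocess

def list_directory(path="."):
    """Portable stand-in for shelling out to 'ls', which Windows lacks."""
    if platform.system() != "Windows":
        # Keep the original informational behavior where 'ls' exists.
        subprocess.run(["ls", path], check=False)
    # os.listdir works on every platform and serves the same purpose.
    return sorted(os.listdir(path))
```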

Second, the very first test (gpsread) fails because the SHA-1 hashes do not match. Here is the diff:

--- out.txt Mon May 25 11:15:02 2020
+++ ref\out-alt.txt Sun May 10 20:43:52 2020
@@ -1,6 +1,6 @@
-Reading ../../../../oiio-images/tahoe-gps.jpg
-../../../../oiio-images/tahoe-gps.jpg : 2048 x 1536, 3 channel, uint8 jpeg
-    SHA-1: B5183E383D55949944FE40DC55A67844739E0B7B
+Reading ../../../../../../../../../oiio-images/tahoe-gps.jpg
+../../../../../../../../../oiio-images/tahoe-gps.jpg : 2048 x 1536, 3 channel, uint8 jpeg
+    SHA-1: 71EBEC73B8E8B3533780B15147E61D895C80E8B1
     channel list: R, G, B
     Make: "HTC"
     Model: "T-Mobile G1"
@@ -32,7 +32,7 @@
     oiio:ColorSpace: "sRGB"
 Reading tahoe-gps.jpg
 tahoe-gps.jpg        : 2048 x 1536, 3 channel, uint8 jpeg
-    SHA-1: A74C7DF2B01825DCB6881407AE77C11DC56AB741
+    SHA-1: 2623446988E34395C6B0A4AA4FC75107C708BF18
     channel list: R, G, B
     Make: "HTC"
     Model: "T-Mobile G1"

This could very well mean that something is not compiled/built correctly. However, we use MTOA, which bundles OIIO, and its oiiotool generates the same mismatching hash that our build does. I would hope that the bundled oiiotool is correct, at the least.

So the next explanation I can think of is that some internal text-encoding or line-ending (CRLF/LF) difference between the Unix and Windows platforms makes the hash differ, but I don't know how to check this theory. There are other errors that emerge from the test suite, though most assertions seem to pass correctly.
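One way I could imagine testing the line-ending theory (my own sketch, not part of the testsuite): hash the same content with LF and with CRLF endings and confirm that the SHA-1 changes completely unless the endings are normalized first:

```python
import hashlib

def sha1_hex(data: bytes) -> str:
    return hashlib.sha1(data).hexdigest().upper()

lf_text = b"Make: HTC\nModel: T-Mobile G1\n"
crlf_text = lf_text.replace(b"\n", b"\r\n")

# Any byte-level difference, including line endings, flips the whole digest.
print(sha1_hex(lf_text))
print(sha1_hex(crlf_text))

# Normalizing CRLF back to LF before hashing removes the difference.
normalized = crlf_text.replace(b"\r\n", b"\n")
print(sha1_hex(normalized) == sha1_hex(lf_text))  # True
```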

I ultimately would like to have a build that passes its tests. To this end I have multiple questions:

  1. Is this issue known? (Since the MTOA-bundled oiiotool would hit the same hash conflict in this test, I can't believe I'm the first one to find it.)
  2. Am I supposed to use an MSYS-like environment for building? (that would solve the call to ls problem)
  3. Could this be something related to a dependency I used? Here are all the dependencies I've built and used:
    • ZLib (1.2.11)
    • OpenEXR (2.4.1)
    • LibJPEG (9c)
    • LibLZMA (5.2.5)
    • LibTiff (4.1.0)
    • Boost (1.73.0)
    • LibPNG (1.6.37)
    • Freetype (2.10.2)

Thanks,

lgritz commented 4 years ago

I wouldn't worry about it (in this test) if the only difference is the SHA.

For "lossless" formats, it's important to preserve the exact bit pattern.

JPEG compression is lossy -- by design, it does not preserve the exact data values, but it's close enough that it looks the same to the eye. I've found that the exact reconstruction of values after going through the original -> jpeg compressed -> uncompressed round trip can vary ever so slightly depending on the version of libjpeg, the platform it's running on, compiler version and flags.
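To illustrate the point: since the hash is computed over the decoded pixel values, even a single value that decodes one step differently produces an entirely different SHA-1. A toy sketch (not OIIO's actual hashing code):

```python
import hashlib

# Pretend these are two decodes of the same JPEG that differ by one code
# value in one channel -- the kind of variation different libjpeg builds
# can legitimately produce.
pixels_a = bytes(range(256)) * 16
pixels_b = bytearray(pixels_a)
pixels_b[0] ^= 1  # a one-bit difference in a single pixel value

digest_a = hashlib.sha1(pixels_a).hexdigest().upper()
digest_b = hashlib.sha1(bytes(pixels_b)).hexdigest().upper()
print(digest_a)
print(digest_b)  # completely different, though the images look identical
```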

In fact, in that test, we have two reference outputs in testsuite/gpsread/ref, reflecting two different SHAs we've gotten on the various test machines we use to verify releases. You seem to have stumbled upon a third. It's not anything to worry about.

lgritz commented 4 years ago

You also seem to have something funny going on with your paths (../../../../oiio-images/tahoe-gps.jpg vs ../../../../../../../../../oiio-images/tahoe-gps.jpg). I suspect that is the consequence of some symbolic links in your path, or something like that.

If you need full tests run on a regular basis for production deployment, I'll work with you to iron out all these little details so you get a cleanly passing testsuite with none of these little meaningless failures. But if this is just a one-off check that you built it properly, I wouldn't advise spending much time on differences like the path mangling.

nasefbasdf commented 4 years ago

Alright, there are other failures though. The example I posted was just the first one in order.

Here is a different error, from the oiiotool suite. It seems to be the only failure among the ~90 test cases in that suite.

1>EXEC : oiiotool error : -cut : Unrecognized geometry "'{TOP.width-20*"
1>  Full command line was:
1>  > ..\\..\\bin\\oiiotool -colorconfig ../../../testsuite\\common\\OpenColorIO\\nuke-default\\config.ocio src/tahoe-small.tif -cut '{TOP.width-20* 2}x{TOP.height-40+(4*2- 2 ) /6-1}+{TOP.x+100.5-80.5 }+{TOP.y+20}' -d uint8 -o exprcropped.tif
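The leading ' in the reported geometry and the truncation at the first space suggest a shell-quoting mismatch: cmd.exe does not treat single quotes as quoting characters, so the expression is passed with the literal quote and gets split at whitespace. A small illustration, assuming that is what happened (shlex.split models POSIX shell rules):

```python
import shlex

arg = "'{TOP.width-20* 2}x{TOP.height-40+(4*2- 2 ) /6-1}+{TOP.x+100.5-80.5 }+{TOP.y+20}'"

# A POSIX shell strips the single quotes and delivers one argument.
posix_tokens = shlex.split(arg)
print(len(posix_tokens))   # 1

# cmd.exe gives ' no special meaning: the text splits at spaces and the
# first token keeps the literal quote -- matching the reported error.
cmd_like_tokens = arg.split()
print(cmd_like_tokens[0])  # '{TOP.width-20*
```

If so, using double quotes around the geometry expression on Windows would likely avoid the error.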

The oiiotool-attribs suite fails with hash conflicts similarly to the first one.

During the oiiotool-readerror test the output differs only by the path separators. This is clearly an issue with the test itself:

...
1>  Diff out.err.txt vs ref\out.err-alt.txt was:
1>  -------
1>  --- out.err.txt Wed May 27 08:44:24 2020
1>  +++ ref\out.err-alt.txt Sun May 10 20:43:52 2020
1>  @@ -1,3 +1,3 @@
1>EXEC : oiiotool error : read src/incomplete.exr : Failed OpenEXR read: Error reading pixel data from image file "src/incomplete.exr". Unexpected end of file.
1>   Full command line was:
1>  -> ..\\..\\bin\\oiiotool -colorconfig ../../../testsuite\\common\\OpenColorIO\\nuke-default\\config.ocio src/incomplete.exr -o out.exr
1>  +> ../../bin/oiiotool -colorconfig ../../../testsuite/common/OpenColorIO/nuke-default/config.ocio src/incomplete.exr -o out.exr
...
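Since the two command lines differ only in separator style (escaped backslashes vs forward slashes), normalizing separators before diffing would mask this platform noise. A sketch of the idea (not something runtest.py does today, as far as I can tell):

```python
def normalize_separators(text: str) -> str:
    """Fold escaped and plain backslashes to forward slashes."""
    return text.replace("\\\\", "/").replace("\\", "/")

win_line = r"..\\..\\bin\\oiiotool -colorconfig ../../../testsuite\\common\\OpenColorIO\\nuke-default\\config.ocio src/incomplete.exr -o out.exr"
unix_line = "../../bin/oiiotool -colorconfig ../../../testsuite/common/OpenColorIO/nuke-default/config.ocio src/incomplete.exr -o out.exr"

print(normalize_separators(win_line) == unix_line)  # True
```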

There seems to be some issue with calling idiff.exe as well. When diffing the output txt files, there are many snippets like:

1>  Diff out.txt vs ref/out.txt was:
1>  -------
1>  --- out.txt Wed May 27 08:44:31 2020
1>  +++ ref/out.txt Sun May 10 20:43:52 2020
1>  @@ -1,620 +1,62 @@
1>  -idiff -- compare two images
1>  -OpenImageIO 2.1.15 http://www.openimageio.org
1>  -Usage:  idiff [options] image1 image2
1>  -    --help           Print help message
1>  -    -v               Verbose status messages
1>  -    -q               Quiet (minimal messages)
1>  -    -a               Compare all subimages/miplevels
1>  -Thresholding and comparison options
1>  -    -fail %g         Failure threshold difference (0.000001)
1>  -    -failpercent %g  Allow this percentage of failures (0)
1>  -    -hardfail %g     Fail if any one pixel exceeds this error (infinity)
1>  -    -warn %g         Warning threshold difference (0.00001)
1>  -    -warnpercent %g  Allow this percentage of warnings (0)
1>  -    -hardwarn %g     Warn if any one pixel exceeds this error (infinity)
1>  -    -p               Perform perceptual (rather than numeric) comparison
1>  -Difference image options
1>  -    -o %s            Output difference image
1>  -    -od              Output image only if nonzero difference
1>  -    -abs             Output image of absolute value, not signed difference
1>  -    -scale %g        Scale the output image by this factor
...

Later, in the oiiotool-maketx test, there are floating-point differences that could be considered meaningful, but I'm not sure:

...
1>       MIP 5 of 7 (2 x 2):
1>  -      Stats Min: 0.497070 -0.001317 -0.000958 0.001850 0.001672 -0.000017 (float)
1>  +      Stats Min: 0.497070 -0.001317 -0.000958 0.001850 0.001672 -0.000000 (float)
1>         Stats Max: 0.501953 0.000820 0.000466 0.002174 0.001898 0.000156 (float)
1>  -      Stats Avg: 0.499756 -0.000458 -0.000042 0.001971 0.001789 0.000035 (float)
1>  -      Stats StdDev: 0.002028 0.000785 0.000576 0.000123 0.000084 0.000071 (float)
1>  +      Stats Avg: 0.499756 -0.000458 -0.000042 0.001971 0.001789 0.000039 (float)
1>  +      Stats StdDev: 0.002028 0.000785 0.000576 0.000123 0.000084 0.000068 (float)
1>         Stats NanCount: 0 0 0 0 0 0
1>         Stats InfCount: 0 0 0 0 0 0
1>         Stats FiniteCount: 4 4 4 4 4 4
1>         Constant: No
1>         Monochrome: No
...
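For what it's worth, differences of this size (-0.000017 vs -0.000000, or 0.000071 vs 0.000068 in the StdDev) look like ordinary floating-point noise across compilers and platforms. A tolerance-based comparison, sketched below with math.isclose and an absolute tolerance I picked arbitrarily, would treat them as equal:

```python
import math

# Stats Min rows from the two outputs above.
got = [0.497070, -0.001317, -0.000958, 0.001850, 0.001672, -0.000017]
ref = [0.497070, -0.001317, -0.000958, 0.001850, 0.001672, -0.000000]

# 1e-4 absorbs last-digit noise but would still flag genuinely wrong stats.
TOL = 1e-4
all_close = all(math.isclose(g, r, abs_tol=TOL) for g, r in zip(got, ref))
print(all_close)  # True
```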

The error log is a little hard to read. I only ran 10 of the 132 tests before cancelling, and the log was already over 8k lines.

Some of these errors may be interesting/relevant. What do you think of them?

lgritz commented 1 year ago

Closing this old issue. Lots has changed since then. If it's still a problem, please re-open or file a new issue.