mikedh / trimesh

Python library for loading and using triangular meshes.
https://trimesh.org
MIT License

Tests for cascadio fail on s390x, wrong endianness #2251

Open musicinmybrain opened 3 months ago

musicinmybrain commented 3 months ago

I just got https://github.com/trimesh/cascadio packaged for Fedora, and the test suite is failing on s390x (Fedora’s sole big-endian architecture). This is clearly an endianness mixup; I’m filing the issue here rather than on https://github.com/trimesh/cascadio because trimesh is involved and it’s not clear exactly where the problem lies.

=================================== FAILURES ===================================
_________________________________ test_convert _________________________________
    def test_convert():
        infile = os.path.join(cwd, "models", "featuretype.STEP")

        with tempfile.TemporaryDirectory() as D:
            outfile = os.path.join(D, "outfile.glb")
            # do the conversion
            cascadio.step_to_glb(infile, outfile, 0.1, 0.5)
>           scene = trimesh.load(outfile, merge_primitives=True)
tests/test_load.py:16: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/site-packages/trimesh/exchange/load.py:126: in load
    loaded = load_mesh(file_obj, file_type=file_type, resolver=resolver, **kwargs)
/usr/lib/python3.13/site-packages/trimesh/exchange/load.py:202: in load_mesh
    results = loader(file_obj, file_type=file_type, resolver=resolver, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
file_obj = <_io.BufferedReader name='/tmp/tmpjim9fbdf/outfile.glb'>
resolver = <trimesh.resolvers.FilePathResolver object at 0x3ff2df8c6e0>
ignore_broken = False, merge_primitives = True, skip_materials = False
mesh_kwargs = {'file_type': 'glb'}
head_data = b'glTF\x00\x00\x00\x02\x00\x01?\xb8\x00\x00\x04$NOSJ'
head = array([1179937895,   33554432, 3091136768,  604241920, 1246973774],
      dtype='<u4')
    def load_glb(
        file_obj: Stream,
        resolver: Optional[ResolverLike] = None,
        ignore_broken: bool = False,
        merge_primitives: bool = False,
        skip_materials: bool = False,
        **mesh_kwargs,
    ):
        """
        Load a GLTF file in the binary GLB format into a trimesh.Scene.

        Implemented from specification:
        https://github.com/KhronosGroup/glTF/tree/master/specification/2.0

        Parameters
        ------------
        file_obj : file- like object
          Containing GLB data
        resolver : trimesh.visual.Resolver
          Object which can be used to load other files by name
        ignore_broken : bool
          If there is a mesh we can't load and this
          is True don't raise an exception but return
          a partial result
        merge_primitives : bool
          If True, each GLTF 'mesh' will correspond to a
          single Trimesh object.
        skip_materials : bool
          If true, will not load materials (if present).

        Returns
        ------------
        kwargs : dict
          Kwargs to instantiate a trimesh.Scene
        """
        # read the first 20 bytes which contain section lengths
        head_data = file_obj.read(20)
        head = np.frombuffer(head_data, dtype="<u4")

        # check to make sure first index is gltf magic header
        if head[0] != _magic["gltf"]:
            raise ValueError("incorrect header on GLB file")

        # and second value is version: should be 2 for GLTF 2.0
        if head[1] != 2:
>           raise NotImplementedError(f"only GLTF 2 is supported not `{head[1]}`")
E           NotImplementedError: only GLTF 2 is supported not `33554432`
/usr/lib/python3.13/site-packages/trimesh/exchange/gltf.py:388: NotImplementedError
=========================== short test summary info ============================
FAILED tests/test_load.py::test_convert - NotImplementedError: only GLTF 2 is...
============================== 1 failed in 0.27s ===============================
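For what it’s worth, the `head_data` bytes captured in the traceback decode to sane values once byte-swapped. A minimal sketch, mirroring how trimesh parses the header:

```python
import numpy as np

# The 20-byte GLB header captured in the traceback above.
head_data = b"glTF\x00\x00\x00\x02\x00\x01?\xb8\x00\x00\x04$NOSJ"

# trimesh reads the header as little-endian uint32, per the GLB spec.
head = np.frombuffer(head_data, dtype="<u4")
print(head[1])  # 33554432 (0x02000000): the version field, byte-swapped

# Reinterpreting the same bytes as big-endian recovers the expected value.
head_be = np.frombuffer(head_data, dtype=">u4")
print(head_be[1])  # 2: GLTF 2.0, as expected
```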

If I do the following in an emulated s390x chroot

>>> import cascadio
>>> infile = 'models/featuretype.STEP'
>>> outfile = 'outfile.glb'
>>> cascadio.step_to_glb(infile, outfile, 0.1, 0.5)

then I get an outfile.glb that starts as follows:

hexdump -C outfile.glb | head -n 20
00000000  67 6c 54 46 00 00 00 02  00 01 3f b8 00 00 04 24  |glTF......?....$|
00000010  4e 4f 53 4a 7b 22 61 63  63 65 73 73 6f 72 73 22  |NOSJ{"accessors"|
00000020  3a 5b 7b 22 62 75 66 66  65 72 56 69 65 77 22 3a  |:[{"bufferView":|
00000030  30 2c 22 62 79 74 65 4f  66 66 73 65 74 22 3a 30  |0,"byteOffset":0|
00000040  2c 22 63 6f 6d 70 6f 6e  65 6e 74 54 79 70 65 22  |,"componentType"|
00000050  3a 35 31 32 36 2c 22 63  6f 75 6e 74 22 3a 32 37  |:5126,"count":27|
00000060  30 34 2c 22 6d 61 78 22  3a 5b 30 2e 30 36 33 35  |04,"max":[0.0635|
00000070  2c 30 2e 30 33 31 37 35  2c 30 2e 30 33 34 39 32  |,0.03175,0.03492|
00000080  35 5d 2c 22 6d 69 6e 22  3a 5b 2d 30 2e 30 36 33  |5],"min":[-0.063|
00000090  35 2c 2d 30 2e 30 33 31  37 35 2c 2d 36 2e 39 33  |5,-0.03175,-6.93|
000000a0  38 38 39 33 39 30 33 39  30 37 32 32 39 65 2d 31  |8893903907229e-1|
000000b0  38 5d 2c 22 74 79 70 65  22 3a 22 56 45 43 33 22  |8],"type":"VEC3"|
000000c0  7d 2c 7b 22 62 75 66 66  65 72 56 69 65 77 22 3a  |},{"bufferView":|
000000d0  31 2c 22 62 79 74 65 4f  66 66 73 65 74 22 3a 30  |1,"byteOffset":0|
000000e0  2c 22 63 6f 6d 70 6f 6e  65 6e 74 54 79 70 65 22  |,"componentType"|
000000f0  3a 35 31 32 36 2c 22 63  6f 75 6e 74 22 3a 32 37  |:5126,"count":27|
00000100  30 34 2c 22 74 79 70 65  22 3a 22 56 45 43 33 22  |04,"type":"VEC3"|
00000110  7d 2c 7b 22 62 75 66 66  65 72 56 69 65 77 22 3a  |},{"bufferView":|
00000120  32 2c 22 62 79 74 65 4f  66 66 73 65 74 22 3a 30  |2,"byteOffset":0|
00000130  2c 22 63 6f 6d 70 6f 6e  65 6e 74 54 79 70 65 22  |,"componentType"|

and ends with

00013e90  0a 4e 0a 5a 0a 4e 0a 5b  0a 59 0a 50 0a 4f 0a 59  |.N.Z.N.[.Y.P.O.Y|
00013ea0  0a 4f 0a 5a 0a 58 0a 51  0a 50 0a 58 0a 50 0a 59  |.O.Z.X.Q.P.X.P.Y|
00013eb0  0a 57 0a 52 0a 51 0a 57  0a 51 0a 58 0a 56 0a 53  |.W.R.Q.W.Q.X.V.S|
00013ec0  0a 52 0a 56 0a 52 0a 57  0a 55 0a 54 0a 53 0a 55  |.R.V.R.W.U.T.S.U|
00013ed0  0a 53 0a 56 0a 6b 0a 5d  0a 5c 0a 6b 0a 5e 0a 5d  |.S.V.k.].\.k.^.]|
00013ee0  0a 6a 0a 5f 0a 5e 0a 6a  0a 5e 0a 6b 0a 69 0a 60  |.j._.^.j.^.k.i.`|
00013ef0  0a 5f 0a 69 0a 61 0a 60  0a 69 0a 5f 0a 6a 0a 68  |._.i.a.`.i._.j.h|
00013f00  0a 61 0a 69 0a 67 0a 62  0a 61 0a 67 0a 61 0a 68  |.a.i.g.b.a.g.a.h|
00013f10  0a 66 0a 63 0a 62 0a 66  0a 62 0a 67 0a 65 0a 64  |.f.c.b.f.b.g.e.d|
00013f20  0a 63 0a 65 0a 63 0a 66  0a 7b 0a 6d 0a 6c 0a 7b  |.c.e.c.f.{.m.l.{|
00013f30  0a 6e 0a 6d 0a 7a 0a 6f  0a 6e 0a 7a 0a 6e 0a 7b  |.n.m.z.o.n.z.n.{|
00013f40  0a 79 0a 70 0a 6f 0a 79  0a 71 0a 70 0a 79 0a 6f  |.y.p.o.y.q.p.y.o|
00013f50  0a 7a 0a 78 0a 71 0a 79  0a 77 0a 72 0a 71 0a 77  |.z.x.q.y.w.r.q.w|
00013f60  0a 71 0a 78 0a 76 0a 73  0a 72 0a 76 0a 72 0a 77  |.q.x.v.s.r.v.r.w|
00013f70  0a 75 0a 74 0a 73 0a 75  0a 73 0a 76 0a 7d 0a 7e  |.u.t.s.u.s.v.}.~|
00013f80  0a 7c 0a 7d 0a 7f 0a 7e  0a 83 0a 80 0a 81 0a 83  |.|.}...~........|
00013f90  0a 82 0a 80 0a 86 0a 84  0a 85 0a 86 0a 85 0a 87  |................|
00013fa0  0a 8a 0a 88 0a 89 0a 8a  0a 89 0a 8b 0a 8d 0a 8e  |................|
00013fb0  0a 8c 0a 8d 0a 8f 0a 8e                           |........|
00013fb8

I can see that the version field is packed in big-endian order. Per the glTF 2.0 specification (https://github.com/KhronosGroup/glTF/blob/082d5a98f479c37dff18767982d1431fc6c014fd/specification/2.0/Specification.adoc), all GLB header fields are required to be little-endian uint32 values, so this should be considered a bug – but perhaps it’s in OCCT rather than in cascadio itself? I’m curious to see what you think.
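To spell that out, here is a field-by-field decode of the first 20 bytes from the hexdump (a sketch; offsets follow the GLB container layout in the spec):

```python
import struct

# Bytes 0x00-0x13 of outfile.glb, transcribed from the hexdump above.
head = bytes.fromhex("676c5446 00000002 00013fb8 00000424 4e4f534a")

# The spec requires magic, version, length, chunkLength, and chunkType
# to all be little-endian uint32.
magic, version, length, chunk_len, chunk_type = struct.unpack("<5I", head)
print(hex(magic))  # 0x46546c67, i.e. b"glTF": the magic bytes are correct
print(version)     # 33554432 rather than 2
print(length)      # 3091136768, nonsense as a file size

# Re-decoding the same fields as big-endian gives consistent values:
version_be, length_be = struct.unpack(">2I", head[4:12])
print(version_be)  # 2
print(length_be)   # 81848 == 0x13fb8, the final offset shown in the hexdump

# The first chunk type is also byte-swapped: b"NOSJ" where the spec's
# JSON chunk would read b"JSON".
print(head[16:20])
```

Note that the magic bytes themselves are correct ASCII `glTF` (which is why trimesh’s magic check passes), while the uint32 fields after it appear to have been written in native big-endian order.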

For now, I plan to build the new python-cascadio package on little-endian platforms only.
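Concretely, that means something along these lines in the spec file (a hypothetical snippet; the actual spec file may phrase it differently):

```
# cascadio emits byte-swapped GLB header fields on big-endian targets
# (see this issue), so skip the sole big-endian architecture for now.
ExcludeArch:    s390x
```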

I’m mentioning https://github.com/mikedh/trimesh/issues/1351 and https://github.com/mikedh/trimesh/issues/249 not because they necessarily share a root cause, but just to cross-link issues about big-endian-related test failures.

musicinmybrain commented 3 months ago

Please do let me know if there is anything I can add or investigate to make this bug report more useful.