I saw in the PSX tech docs many GTE opcodes for polygon calculation, including depth sorting, since this coprocessor handles that task and passes the results to the GPU for onscreen drawing.
So, theoretically, it should be possible to reconstruct a non-flat, textured 3D mesh by taking three raw-data dumps from a given frame capture, coding the emulator/GPU plugin to fwrite() at the exact spots (see the sketch after this list):
1. GTE polygon projection / 3D-to-2D maths. This dump requires reversing back from 2D to 3D, though.
2. GPU projected polygons, for the actual geometry.
3. VRAM texture/palette/UV data.
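
For illustration, here is a minimal C sketch of what such hooks could look like inside an emulator or GPU plugin. Every name here (gte_rtpt_hook, gpu_poly_hook, vram_snapshot) and the record layouts are assumptions for the sake of the example, not any real plugin API; the point is only where the fwrite() calls would sit. Note that hooking the GTE directly lets you record the 3D input vertex alongside its projected 2D output, which would sidestep the 2D-to-3D reversing problem from point 1:

```c
/* Hypothetical dump hooks -- all function names and record layouts
   are assumptions for illustration, not a real emulator/plugin API. */
#include <stdio.h>
#include <stdint.h>

static FILE *gte_log, *gpu_log, *vram_log;

void dump_open(void) {
    gte_log  = fopen("gte_vertices.bin", "wb");
    gpu_log  = fopen("gpu_polys.bin",    "wb");
    vram_log = fopen("vram.bin",         "wb");
}

/* 1) Called whenever the GTE finishes an RTPS/RTPT projection:
      record the 3D input vertex next to the projected 2D screen
      coordinates, so no 2D-to-3D reversing is needed later. */
void gte_rtpt_hook(int16_t vx, int16_t vy, int16_t vz,
                   int16_t sx, int16_t sy) {
    int16_t rec[5] = { vx, vy, vz, sx, sy };
    fwrite(rec, sizeof rec, 1, gte_log);
}

/* 2) Called when the GPU receives a textured-polygon packet:
      record the raw packet words (screen vertices, UVs, CLUT). */
void gpu_poly_hook(const uint32_t *packet, size_t words) {
    fwrite(&words, sizeof words, 1, gpu_log);
    fwrite(packet, sizeof *packet, words, gpu_log);
}

/* 3) Called once per captured frame: raw 1 MB VRAM snapshot
      (1024x512 halfwords) holding textures and palettes. */
void vram_snapshot(const uint16_t *vram) {
    fwrite(vram, sizeof(uint16_t), 1024 * 512, vram_log);
}
```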
The resulting three chunks could then be merged into a common 3D format. This would probably require an external converter tool, as it has nothing to do with accuracy or performance, but it would be a neat, long-dreamed-of feature.
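
As a rough idea of what that converter could do, the sketch below reads the hypothetical gte_vertices.bin dump from above and writes a Wavefront OBJ. The record layout, the fixed-point scale, and the assumption that consecutive vertices form triangles are all mine; a real tool would also have to correlate the GPU packets with the GTE records and unpack the textures/CLUTs from the VRAM dump:

```c
/* Minimal converter sketch: gte_vertices.bin -> frame.obj.
   Assumes the 5-halfword record layout used by the dumper above. */
#include <stdio.h>
#include <stdint.h>

int main(void) {
    FILE *in  = fopen("gte_vertices.bin", "rb");
    FILE *out = fopen("frame.obj", "wb");
    if (!in || !out) return 1;

    int16_t rec[5];  /* vx, vy, vz, sx, sy */
    long count = 0;
    while (fread(rec, sizeof rec, 1, in) == 1) {
        /* GTE vertices are signed 16-bit; divide by an arbitrary
           scale so the mesh comes out a sensible size. */
        fprintf(out, "v %f %f %f\n",
                rec[0] / 1024.0, rec[1] / 1024.0, rec[2] / 1024.0);
        count++;
    }
    /* Assumption: vertices were dumped in triangle order, so every
       three consecutive ones form a face (OBJ indices are 1-based). */
    for (long i = 1; i + 2 <= count; i += 3)
        fprintf(out, "f %ld %ld %ld\n", i, i + 1, i + 2);

    fclose(in);
    fclose(out);
    return 0;
}
```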