Implements a more accurate model for Z interpolation which closely replicates the hardware's precision loss.
The amount of precision lost is directly correlated to how wide a polygon is.
This behavior seems to only apply to interpolation along x?
One theory I have is that it takes the difference between the left and right depth values and right-shifts it by 1; then, in the case where z0 > z1, it adds the remainder of (z0 - z1) / xdiff towards the end of the process.
Note: there are a few alternative ways to do this while still getting an equivalent result, such as left-shifting xdiff by 1 before the division.
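For illustration, the theorized behavior can be sketched roughly like this (all function and variable names here are mine, not melonDS's; this is just a model of the hypothesis, not the actual implementation):

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical model of the theorized Z interpolation along x.
// z0/z1 are the left/right depth values, x is the current span offset,
// xdiff is the span width. Names and structure are assumptions.
int32_t InterpZ(int32_t z0, int32_t z1, int32_t x, int32_t xdiff)
{
    // take the difference between the left and right depth values
    // and right-shift it by 1, halving the slope's precision
    // (the loss grows with xdiff, i.e. with polygon width)
    int32_t diff = (z1 - z0) >> 1;

    // interpolate with the halved difference, then scale back up
    int32_t z = z0 + (int32_t)((int64_t)diff * x / xdiff) * 2;

    // in the z0 > z1 case, add the remainder of (z0 - z1) / xdiff
    // towards the end of the process
    if (z0 > z1)
        z += (z0 - z1) % xdiff;

    return z;
}
```

An equivalent formulation (as noted above) would keep the full difference and left-shift xdiff by 1 before the division instead.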
Some samples (before > after > console):
![image](https://github.com/melonDS-emu/melonDS/assets/102590697/d28016c5-0152-467f-8501-a4aa4848b363)