RibShark closed this issue 7 years ago.
Can you try with the following versions to see if they make any difference?
Thanks for looking into this! Neither DLL acts any differently, unfortunately. With both your DLL and the normal d3d8to9, the log contains `Reference count for 'IDirect3DDevice8' object 065BA288 (5) is inconsistent.`, which is written as the game exits.
I purchased the game on GOG and was able to see that some of the textures are missing, though it did not look exactly like your screenshots so I am not completely sure it is the same issue.
There is an issue with the reference count for 'IDirect3DDevice8', but I have not been able to figure that one out and I don't think it is the core issue (which is why I have not spent much time looking at it). There also appears to be an issue with the reference count of 'IDirect3DSurface8', which does seem to be the core issue. I tried a quick hack to fix the 'IDirect3DSurface8' reference count but that did not seem to help.
I could be wrong here (since I am not the original author), but it seems that when the game requests a surface (for example by calling `GetSurfaceLevel` or `GetCubeMapSurface`), d3d8to9 creates a new `Direct3DSurface8` wrapper rather than returning the existing one (which is what d3d9 does). This is certainly leading to the `IDirect3DSurface8` reference count issue, because each time `GetSurfaceLevel` or `GetCubeMapSurface` is called, d3d9 increases the reference count but the internal reference counter `_ref` is not increased.
This problem seems to be somewhat prevalent in the code, as I see it in the following functions: `GetBackBuffer`, `GetCubeMapSurface`, `GetSurfaceLevel`, `GetTexture`, `GetIndices`, `GetStreamSource` and `GetVolumeLevel`.
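One common way to fix this class of bug is to keep a lookup table from each underlying d3d9 object to its single d3d8 wrapper, so that repeated `GetSurfaceLevel`/`GetCubeMapSurface` calls hand back the same wrapper and the reference counts stay in sync. A minimal sketch of the idea only; the `Surface9`/`Surface8` types here are stand-ins for `IDirect3DSurface9` and the real `Direct3DSurface8` wrapper, and `GetOrCreateWrapper` is a hypothetical helper, not actual d3d8to9 code:

```cpp
#include <unordered_map>

// Stand-ins for the real interfaces; in d3d8to9 these would be
// IDirect3DSurface9 and the Direct3DSurface8 wrapper class.
struct Surface9 { };
struct Surface8
{
    explicit Surface8(Surface9 *proxy) : Proxy(proxy) {}
    Surface9 *Proxy;
    unsigned long Ref = 1;
};

// Map each d3d9 object to its (single) d3d8 wrapper, so that every
// lookup for the same underlying surface returns the same wrapper
// instead of constructing a new one each time.
static std::unordered_map<Surface9 *, Surface8 *> LookupTable;

Surface8 *GetOrCreateWrapper(Surface9 *proxy)
{
    auto it = LookupTable.find(proxy);
    if (it != LookupTable.end())
    {
        it->second->Ref++;   // mirror the AddRef d3d9 performs internally
        return it->second;
    }

    Surface8 *wrapper = new Surface8(proxy);
    LookupTable[proxy] = wrapper;
    return wrapper;
}
```

With this scheme, two consecutive lookups for the same surface yield the same wrapper pointer with its reference count bumped, matching what d3d9 itself does when it returns the same `IDirect3DSurface9` twice.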
> I purchased the game on GOG and was able to see that some of the textures are missing, though it did not look exactly like your screenshots so I am not completely sure it is the same issue.
The effects seem to be different each time a level is loaded. The image above is one of the more dramatic-looking versions of the bug that I've seen.
I don't know if this is useful information, but running the game through PIX for Windows seemed to correct most of the issue, despite it still using d3d8to9. There were still some graphical errors, such as some of the particle effects that appear when you grab health in the first level not appearing, and random orange flashes appearing when an object moves (both of these issues are strangely gone in PIX's frame capture).
> I don't know if this is useful information, but running the game through PIX for Windows seemed to correct most of the issue, despite it still using d3d8to9.
That sounds like an issue with Scarface: The World is Yours - in that game, terrible artifacts start showing if it runs on more than one core (a vertex buffer being populated from another thread without proper synchronization, maybe? or d3d calls being issued from multiple threads?), and running the game from PIX is said to help with those artifacts. Far-fetched, but it could be related.
> That sounds like an issue with Scarface: The World is Yours - in this case, the game starts showing terrible artifacts if running on more than one core
That sounds amazingly similar to this issue. When I used msconfig to boot the system with one core, the issues went away even when I was using d3d8to9. Setting the game's affinity to a single core did not fix the issue, however. It is strange that the game is only affected by this issue when d3d8to9 is used (at least on my computer; apparently on some Intel cards there are missing textures with plain d3d8).
Hi guys,
Can you post a nightly build with the latest commit (0c5e710) here? I do not have Visual Studio, so I am not able to compile it myself.
Thanks.
There you go, a build with logging enabled.
I tried other D8-to-D9 converters, like the one here, and they have the same problem. This seems like a larger issue and is beyond my expertise.
I wouldn't be surprised if this was a game issue exposed by d3d9. To me, this really looks like invalid reuse of vertex buffers (I had similar things happen in production code purely because of wrong VB locking flags, and it would expose itself only under heavy CPU load).
If I create the IDirect3D9 device with the `D3DDEVTYPE_REF` device type, where the Direct3D features are implemented in software, then the previously missing textures can be seen. The textures only seem to be missing when using `D3DDEVTYPE_HAL`.
The issue with Rayman 3 not showing some textures has to do with where the vertex buffers are stored. Rayman 3 uses `D3DUSAGE_DYNAMIC` vertex buffers, and it seems that the way Direct3D8 and Direct3D9 handle dynamic buffers has changed. Disabling dynamic buffers solves this issue.

You can disable dynamic buffers by adding this line to the beginning of the `Direct3DDevice8::CreateVertexBuffer` function in d3d8to9:

```cpp
Usage &= (D3DUSAGE_DYNAMIC ^ 0xFFFF);
```

I have also attached a working version of the d3d8to9 DLL, which you can download here. Let me know if this works for you.
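The `Usage &= (D3DUSAGE_DYNAMIC ^ 0xFFFF)` trick simply clears the dynamic bit from the usage mask. A minimal, self-contained sketch of what it does; the flag values are copied from the Direct3D headers so it compiles without the SDK, and `DisableDynamicUsage` is a hypothetical helper used only for illustration:

```cpp
#include <cassert>

// Flag values copied from the Direct3D headers so the sketch compiles
// without the SDK installed.
constexpr unsigned int D3DUSAGE_WRITEONLY = 0x00000008;
constexpr unsigned int D3DUSAGE_DYNAMIC   = 0x00000200;

// XOR-ing the flag with 0xFFFF inverts the low 16 bits of the mask, so
// the AND clears D3DUSAGE_DYNAMIC while leaving the other low usage
// bits untouched.
unsigned int DisableDynamicUsage(unsigned int Usage)
{
    Usage &= (D3DUSAGE_DYNAMIC ^ 0xFFFF);
    return Usage;
}
```

Note that `Usage &= ~D3DUSAGE_DYNAMIC` would be the more idiomatic equivalent; the `^ 0xFFFF` form also zeroes any bits above bit 15, which is worth being aware of.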
It seems that disabling `D3DLOCK_DISCARD` in `Direct3DVertexBuffer8::Lock` also solves the problem. This can be done by adding this line to the beginning of the `Direct3DVertexBuffer8::Lock` function:

```cpp
Flags &= (D3DLOCK_DISCARD ^ 0xFFFF);
```
I am not really sure which change is better.
I can confirm this is the issue. I've narrowed down the main texture issue to one of the vertex buffers created by the function located at 0x486A30 in Rayman3.exe, and a secondary issue (that caused missing/delayed/miscoloured effects on NVIDIA cards and a crash on WineD3D) to the function located at 0x472430 in Rayman3.exe. Perhaps these could be analysed by someone smarter than me to see exactly what the game is doing that isn't supported by Direct3D9?
I think I figured out the issue here. The way `D3DLOCK_DISCARD` works in Direct3D8 is very different from the way it works in Direct3D9.

Here is the `D3DLOCK_DISCARD` documentation from Direct3D8:

> The application overwrites the entire vertex buffer with a write-only operation. This enables Direct3D to return a pointer to a new memory area so that the dynamic memory access (DMA) and rendering from the old area do not stall.

Here is the `D3DLOCK_DISCARD` documentation from Direct3D9:

> The application discards all memory within the locked region. For vertex and index buffers, the entire buffer will be discarded. This option is only valid when the resource is created with dynamic usage.

Notice how in Direct3D9 the "entire buffer will be discarded", whereas in Direct3D8 the application "overwrites the entire vertex buffer with a write-only operation" and Direct3D can "return a pointer to a new memory area". The missing textures are happening because Direct3D9 discards the buffer rather than performing a "write-only" operation.
With Direct3D9 you need to use the `D3DUSAGE_WRITEONLY` flag if you want the same functionality, and I think (though I cannot prove) that even with Direct3D8 you are supposed to use the `D3DUSAGE_WRITEONLY` flag for this. In Rayman 3 a handful of vertex buffers call the `Lock` function with `D3DLOCK_DISCARD` enabled and `D3DUSAGE_WRITEONLY` disabled, and it is exactly these buffers that exhibit the issue.

Therefore the right fix is to remove the `D3DLOCK_DISCARD` flag from all lock operations where the `D3DUSAGE_WRITEONLY` flag is not being used, because such an operation will certainly not do what the application expects.

This can be done by modifying the check in the `Direct3DVertexBuffer8::Lock` function, adding `|| (desc.Usage & D3DUSAGE_WRITEONLY) == 0` so it looks like this:
```cpp
if ((Flags & D3DLOCK_DISCARD) != 0)
{
	D3DVERTEXBUFFER_DESC desc;
	ProxyInterface->GetDesc(&desc);

	if ((desc.Usage & D3DUSAGE_DYNAMIC) == 0 || (desc.Usage & D3DUSAGE_WRITEONLY) == 0)
	{
		Flags ^= D3DLOCK_DISCARD;
	}
}
```
Also, I recommend adding the same check to the `Direct3DIndexBuffer8::Lock` function, since the same problem can happen with index buffers:

```cpp
if ((Flags & D3DLOCK_DISCARD) != 0)
{
	D3DINDEXBUFFER_DESC desc;
	ProxyInterface->GetDesc(&desc);

	if ((desc.Usage & D3DUSAGE_DYNAMIC) == 0 || (desc.Usage & D3DUSAGE_WRITEONLY) == 0)
	{
		Flags ^= D3DLOCK_DISCARD;
	}
}
```
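The guard used in both `Lock` functions can be exercised in isolation. A minimal, self-contained sketch under stated assumptions: the D3D constants are copied from the d3d9 headers, the buffer description is reduced to a plain usage value, and `FilterLockFlags` is a hypothetical helper rather than part of d3d8to9:

```cpp
#include <cassert>

// Flag values copied from the d3d9 headers so the sketch compiles
// without the SDK installed.
constexpr unsigned int D3DUSAGE_WRITEONLY = 0x00000008;
constexpr unsigned int D3DUSAGE_DYNAMIC   = 0x00000200;
constexpr unsigned int D3DLOCK_DISCARD    = 0x00002000;

// Mirrors the logic added to Direct3DVertexBuffer8::Lock and
// Direct3DIndexBuffer8::Lock: D3DLOCK_DISCARD is only safe to pass
// through to d3d9 when the buffer was created with both
// D3DUSAGE_DYNAMIC and D3DUSAGE_WRITEONLY.
unsigned int FilterLockFlags(unsigned int BufferUsage, unsigned int Flags)
{
    if ((Flags & D3DLOCK_DISCARD) != 0)
    {
        if ((BufferUsage & D3DUSAGE_DYNAMIC) == 0 ||
            (BufferUsage & D3DUSAGE_WRITEONLY) == 0)
        {
            // The bit is known to be set here, so XOR clears it.
            Flags ^= D3DLOCK_DISCARD;
        }
    }
    return Flags;
}
```

So a lock on a dynamic write-only buffer keeps `D3DLOCK_DISCARD`, while a lock on any other buffer (the case Rayman 3 hits) has the flag stripped before the call reaches d3d9.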
Note: there are still issues on NVIDIA cards with the vertex buffer created in function 0x472430 (specifically, the one with size 96000), where effects that use this buffer can be missing or have artifacts such as lingering orange flashes, incorrect models, and more. (This can be seen with the particles from the health powerups in the very first level, where the second to fourth health pickups will not show their particles.) This happens regardless of whether d3d8to9 is used, so I'm not reopening this, but I thought it worth documenting here.
I'm trying to play Rayman 3: Hoodlum Havoc with d3d8to9, and I've run into a bit of an issue. In any area of the game apart from the menu, various parts of the scenery can be highly deformed or appear completely invisible:

![image](https://cloud.githubusercontent.com/assets/1957489/23103518/27a56396-f6b4-11e6-96bc-0157f69e7f22.png)

For comparison, this scene is supposed to look like this:

![image](https://cloud.githubusercontent.com/assets/1957489/23103606/596d5536-f6b5-11e6-8674-72b833667cd9.png)

On some in-game objects the textures used are also incorrect (I'm not sure if it's using the wrong texture or mapping it incorrectly onto the model). Additionally, one in-game object can even appear to be using the model of another object (the gem in this picture is actually a crown):

![image](https://cloud.githubusercontent.com/assets/1957489/23103583/e16c173e-f6b4-11e6-8606-779174e0d927.png)

For comparison, this scene is supposed to look like this:

![image](https://cloud.githubusercontent.com/assets/1957489/23103617/8c5028b6-f6b5-11e6-9ca3-e2449864de95.png)
This can be fixed while still using d3d8to9 by opening the file C:\Windows\Ubisoft\Ubi.ini and changing the line `TnL=1` to `TnL=0`. After analysing the EXE, it seems that all this does is set Direct3D to use software vertex processing (but I could be wrong). The downside is that the framerate tanks to 20 fps or less.
I'm using a GTX 970 with the latest drivers, if that's any help.