Closed: DubbleClick closed this issue 8 months ago
For what it's worth:
Unchanged LoadWICTextureFromMemoryEx:
Changed LoadWICTextureFromMemoryEx:
D3DXCreateTextureFromFileInMemoryEx:
Notice that the tree texture is much clearer in the last image.
Sorry, digging a bit further into it, it seems the format is interpreted as _GUID{0x6fddc324, 0x4e03, 0x4bfe, {0xb1, 0x85, 0x3d, 0x77, 0x76, 0x8d, 0xc9, 0xe}}, like I mentioned above, which actually corresponds to GUID_WICPixelFormat32bppBGR.
Forcing it to be interpreted as RGBA instead (passing WIC_LOADER_FORCE_RGBA32 does not do the trick, since it then interprets the source as RGB and converts it to RGBA), I get the result above; adding WIC_LOADER_MIP_AUTOGEN as well makes them look as intended.
So the question is only how to detect when D3DX would interpret the source format differently than the WIC loader currently does via its call to frame->GetPixelFormat().
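For reference, here is a minimal sketch (not from the thread) of doing that check up front with plain WIC, before the bytes ever reach the loader. GetWicFrameFormat is a made-up helper name, and COM is assumed to already be initialized:

#include <cstdint>
#include <wincodec.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Returns the pixel format GUID that WIC reports for the first frame of an
// in-memory image, i.e. the same value frame->GetPixelFormat() yields inside
// the loader.
HRESULT GetWicFrameFormat(const uint8_t* data, size_t size, WICPixelFormatGUID& format)
{
    ComPtr<IWICImagingFactory> factory;
    HRESULT hr = CoCreateInstance(CLSID_WICImagingFactory, nullptr,
        CLSCTX_INPROC_SERVER, IID_PPV_ARGS(factory.GetAddressOf()));
    if (FAILED(hr)) return hr;

    ComPtr<IWICStream> stream;
    hr = factory->CreateStream(stream.GetAddressOf());
    if (FAILED(hr)) return hr;
    hr = stream->InitializeFromMemory(const_cast<uint8_t*>(data), static_cast<DWORD>(size));
    if (FAILED(hr)) return hr;

    ComPtr<IWICBitmapDecoder> decoder;
    hr = factory->CreateDecoderFromStream(stream.Get(), nullptr,
        WICDecodeMetadataCacheOnDemand, decoder.GetAddressOf());
    if (FAILED(hr)) return hr;

    ComPtr<IWICBitmapFrameDecode> frame;
    hr = decoder->GetFrame(0, frame.GetAddressOf());
    if (FAILED(hr)) return hr;

    // If this reports GUID_WICPixelFormat32bppBGR, the alpha channel will be
    // dropped by the loader, so that is the case where forcing BGRA is needed.
    return frame->GetPixelFormat(&format);
}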
The key thing to remember is that older D3DX9 doesn't use WIC at all, and is using a very old copy of libpng for that file. I believe the most recent D3DX9 is using WIC in a limited way, and from the code it always loads PNG, BMP, and JPG as D3DFMT_A8R8G8B8 by converting it to GUID_WICPixelFormat32bppBGRA no matter what the source format was.
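Based on that description, one way to reproduce the conversion outside the loader is WIC's own converter. This is only a sketch of that idea (ForceFrameToBGRA is a made-up name), and whether it matches D3DX's exact alpha handling for frames reported as 32bppBGR would still need to be verified:

#include <wincodec.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Convert whatever format the codec reported to 32bpp BGRA before copying
// pixels, i.e. the conversion the newer D3DX9 is described as doing for
// PNG/BMP/JPG sources. 'frame' is an already decoded IWICBitmapFrameDecode.
HRESULT ForceFrameToBGRA(IWICBitmapFrameDecode* frame, ComPtr<IWICBitmapSource>& bgraSource)
{
    // WICConvertBitmapSource passes the source through unchanged when it
    // already matches the requested format.
    return WICConvertBitmapSource(GUID_WICPixelFormat32bppBGRA, frame,
        bgraSource.ReleaseAndGetAddressOf());
    // CopyPixels on bgraSource then always yields BGRA with stride = width * 4.
}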
Why are the sample images different sizes?
The images posted in the second comment are screenshots of the textures being rendered in the game. The zip file in the first comment contains the texture files.
Thank you for the explanation of why it may differ! How would I go about forcing these images to be interpreted as ARGB even though the data indicates RGB?
Hook the WinApi GetPixelFormat method before calling CreateWICTextureFromMemory?
https://github.com/gwdevhub/gMod/blob/dev/cmake/dxtk.cmake#L15
This is the current workaround, but I'd rather use a solution in code without modifying the WIC loader source.
Okay, I've figured it out, mostly. I convert all user-submitted WIC textures to DDS during application load and just use LoadDDSTextureFromMemoryEx in the end. By loading the image data before saving it to DDS memory, I can check and override the format (albeit technically with undefined behaviour).
DirectX::Blob ConvertImageToDDS(TexEntry& entry)
{
    // Other files need to be converted to DDS
    DirectX::ScratchImage image;
    DirectX::ScratchImage bgra_image;
    DirectX::ScratchImage dds_image;
    DirectX::TexMetadata metadata{};
    HRESULT hr = 0;
    if (entry.ext == ".tga") {
        hr = DirectX::LoadFromTGAMemory(entry.data.data(), entry.data.size(), DirectX::TGA_FLAGS_BGR, &metadata, image);
    }
    else if (entry.ext == ".hdr") {
        hr = DirectX::LoadFromHDRMemory(entry.data.data(), entry.data.size(), &metadata, image);
    }
    else {
        hr = DirectX::LoadFromWICMemory(entry.data.data(), entry.data.size(), DirectX::WIC_FLAGS_NONE, &metadata, image);
        if (metadata.format == DXGI_FORMAT_B8G8R8X8_UNORM) {
            metadata.format = DXGI_FORMAT_B8G8R8A8_UNORM;
            // todo: this is undefined behaviour, but we must force them to be interpreted as BGRA instead of BGRX
            const auto images = image.GetImages();
            for (int i = 0; i < image.GetImageCount(); ++i) {
                const_cast<DXGI_FORMAT&>(images[i].format) = DXGI_FORMAT_B8G8R8A8_UNORM;
            }
            const_cast<DXGI_FORMAT&>(image.GetMetadata().format) = DXGI_FORMAT_B8G8R8A8_UNORM;
        }
    }
    if (FAILED(hr)) {
        Warning("LoadImageFromMemory (%#lX%s): FAILED\n", entry.crc_hash, entry.ext.c_str());
        return {};
    }
    if (metadata.format != DXGI_FORMAT_B8G8R8A8_UNORM) {
        hr = DirectX::Convert(
            image.GetImages(),
            image.GetImageCount(),
            image.GetMetadata(),
            DXGI_FORMAT_B8G8R8A8_UNORM,
            DirectX::TEX_FILTER_DEFAULT,
            DirectX::TEX_THRESHOLD_DEFAULT,
            bgra_image);
        if (FAILED(hr)) {
            Warning("ConvertToBGRA (%#lX%s): FAILED\n", entry.crc_hash, entry.ext.c_str());
            bgra_image = std::move(image);
        }
    }
    else {
        bgra_image = std::move(image);
    }
    hr = DirectX::GenerateMipMaps(
        bgra_image.GetImages(),
        bgra_image.GetImageCount(),
        bgra_image.GetMetadata(),
        DirectX::TEX_FILTER_DEFAULT,
        0,
        dds_image);
    if (FAILED(hr)) {
        Warning("GenerateMipMaps (%#lX%s): FAILED\n", entry.crc_hash, entry.ext.c_str());
        dds_image = std::move(bgra_image);
    }
    DirectX::Blob dds_blob;
    hr = DirectX::SaveToDDSMemory(
        dds_image.GetImages(),
        dds_image.GetImageCount(),
        dds_image.GetMetadata(),
        DirectX::DDS_FLAGS_NONE,
        dds_blob);
    if (FAILED(hr)) {
        Warning("SaveDDSImageToMemory (%#lX%s): FAILED\n", entry.crc_hash, entry.ext.c_str());
        return {};
    }
    return dds_blob;
}
// use
const auto dds_blob = ConvertImageToDDS(entry);
texture_file_struct->data.assign(static_cast<BYTE*>(dds_blob.GetBufferPointer()), static_cast<BYTE*>(dds_blob.GetBufferPointer()) + dds_blob.GetBufferSize());
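As an optional sanity check (not part of the original snippet), the blob can be inspected before it is handed to the DDS loader to confirm the forced format actually ended up in the container; this reuses the Warning helper from above:

#include <DirectXTex.h>

// Verify that the DDS blob produced above really carries B8G8R8A8 data.
// dds_blob is the Blob returned by ConvertImageToDDS.
DirectX::TexMetadata check{};
HRESULT check_hr = DirectX::GetMetadataFromDDSMemory(
    dds_blob.GetBufferPointer(), dds_blob.GetBufferSize(),
    DirectX::DDS_FLAGS_NONE, check);
if (SUCCEEDED(check_hr) && check.format != DXGI_FORMAT_B8G8R8A8_UNORM) {
    Warning("DDS blob has unexpected format %d\n", static_cast<int>(check.format));
}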
I do believe it would be great if there were a "simple" way to tell DirectX::CreateWICTextureFromMemory to behave like D3DX did, though. Either way, thank you for this amazing library and your work on DirectX tooling in general!
I'm a bit confused. Are you saying the image itself contains alpha and you have to force it to use BGRA instead of BGRX, -or- is there something about rendering with BGRX that is failing vs. BGRA?
Also, the code above could be simplified:
if (metadata.format == DXGI_FORMAT_B8G8R8X8_UNORM) {
    metadata.format = DXGI_FORMAT_B8G8R8A8_UNORM;
    // force the images to be interpreted as BGRA instead of BGRX
    image.OverrideFormat(DXGI_FORMAT_B8G8R8A8_UNORM);
}
Yes, the images contain alpha even though the metadata indicates otherwise. Loading some of the pictures and checking the metadata shows BGRX as the format. Creating a texture from that image renders without alpha, leading to the very blocky-looking grass (first image posted). Forcing the metadata format to BGRA instead and then creating a texture from it renders it correctly (second or third image posted).
So I guess it's a difference in how the pixel format is detected, or that D3DX always interprets it as BGRA no matter what the metadata says.
Also, the code above could be simplified:
Thank you!
I'm glad you have a workaround.
I have figured out the likely cause of the issue. The .BMP file you provided is written using the original BITMAPINFOHEADER header (BITMAPCOREHEADER.bcSize == 40 bytes). There is also a BITMAPV4HEADER (BITMAPCOREHEADER.bcSize == 108 bytes) and a BITMAPV5HEADER (BITMAPCOREHEADER.bcSize == 124 bytes).
Officially the BMP "standard" does not support 32-bit BGRA content unless you are using BITMAPV5HEADER, which is a rule the WIC reader enforces. Therefore, the WIC loader is always treating it as BGRX. This is documented here:
Note for 16-bit and 32-bit Windows BMP files, the BMP codec ignores any alpha channel, as many legacy image files contain invalid data in this extra channel. Starting with Windows 8, 32-bit Windows BMP files written using the BITMAPV5HEADER with valid alpha channel content are read as WICPixelFormat32bppBGRA
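That explanation also suggests a targeted heuristic for the conversion path above: only force BGRA for BMP files whose info header predates BITMAPV5HEADER. A rough sketch (IsPreV5Bmp is a made-up helper; header sizes as listed above):

#include <cstdint>
#include <cstring>

// Returns true when the buffer looks like a BMP whose info header is older
// than BITMAPV5HEADER (size < 124), i.e. the case where WIC reports the
// pixels as BGRX even if the fourth byte actually carries alpha.
bool IsPreV5Bmp(const uint8_t* data, size_t size)
{
    // BITMAPFILEHEADER is 14 bytes and starts with the "BM" signature;
    // the DWORD immediately after it is the info header size.
    if (size < 14 + 4 || data[0] != 'B' || data[1] != 'M')
        return false;
    uint32_t headerSize = 0;
    std::memcpy(&headerSize, data + 14, sizeof(headerSize));
    return headerSize < 124; // 40 = BITMAPINFOHEADER, 108 = V4, 124 = V5
}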
The two functions do not create equal textures in some cases. Tracking these down, it appears as if WICTextureLoader9.cpp, line 324 returns _GUID{0x6fddc324, 0x4e03, 0x4bfe, {0xb1, 0x85, 0x3d, 0x77, 0x76, 0x8d, 0xc9, 0xe}} on some images, which isn't a known GUID (although very close to GUID_WICPixelFormat32bppGrayFloat). These will be interpreted as B8G8R8X8_UNORM 2D by texconv, while loading them with D3DX and exporting them to DDS results in B8G8R8A8_UNORM 2D. Changing WICTextureLoader9.cpp to force the format makes them mostly load correctly, although anti-aliasing or something else is still off on some of the resulting textures.
I'll attach a few example images below. The _EXPORTED.dds is the result you get when exporting the texture loaded with D3DX to DDS; the _TEXCONV.dds is the result of .\texconv.exe -r 0x646CC3F0.bmp -dx9 -y -f BGRA.
d3dxout.zip
It wouldn't make sense to add extra flags to CreateWICTextureFromMemoryEx to handle such edge cases, but is there a recommended way to deal with cases where it doesn't produce the same results as D3DXCreateTextureFromFileInMemoryEx? I could imagine loading the WIC image and using DirectXTex in some way to correct mistakes and write the file contents back to memory, before passing it to CreateWICTextureFromMemory?
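A sketch of that imagined round trip, assuming DirectXTex is available next to the loader (RewriteWithAlpha and the parameter names are placeholders): load the bytes with DirectXTex, override BGRX to BGRA, and save the image back to an in-memory PNG so the unmodified CreateWICTextureFromMemory reads it with alpha.

#include <cstdint>
#include <DirectXTex.h>

// Rewrite the original image bytes as an in-memory PNG with the format
// overridden from BGRX to BGRA. Assumes COM is initialized, since
// LoadFromWICMemory/SaveToWICMemory use WIC internally.
HRESULT RewriteWithAlpha(const uint8_t* fileData, size_t fileSize, DirectX::Blob& pngBlob)
{
    DirectX::ScratchImage image;
    DirectX::TexMetadata metadata{};
    HRESULT hr = DirectX::LoadFromWICMemory(fileData, fileSize, DirectX::WIC_FLAGS_NONE, &metadata, image);
    if (FAILED(hr)) return hr;

    if (metadata.format == DXGI_FORMAT_B8G8R8X8_UNORM) {
        // Reinterpret the existing pixel data as BGRA (same layout, alpha kept).
        if (!image.OverrideFormat(DXGI_FORMAT_B8G8R8A8_UNORM))
            return E_FAIL;
    }

    // Encode back to a PNG blob; pass this buffer to CreateWICTextureFromMemory.
    return DirectX::SaveToWICMemory(*image.GetImage(0, 0, 0), DirectX::WIC_FLAGS_NONE,
        DirectX::GetWICCodec(DirectX::WIC_CODEC_PNG), pngBlob);
}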