Open julcst opened 1 month ago
I found a solution; I just changed the `write` function:

```rust
/// Write `data` to the texture.
///
/// - `data`: 32-bit RGBA bitmap data.
/// - `width`: The width of the source bitmap (`data`) in pixels.
/// - `height`: The height of the source bitmap (`data`) in pixels.
pub fn write(&self, queue: &Queue, data: &[u8], width: u32, height: u32) {
match &self.texture.format() {
TextureFormat::Rgba16Float => {
// Convert each byte in data from u8 (0-255) to f16 (0.0-1.0).
let data: Vec<u16> = data
.iter()
.map(|&byte| (half::f16::from_f32(byte as f32 / 255.0)).to_bits())
.collect();
queue.write_texture(
// destination (sub)texture
ImageCopyTexture {
texture: &self.texture,
mip_level: 0,
origin: Origin3d { x: 0, y: 0, z: 0 },
aspect: TextureAspect::All,
},
// source bitmap data
bytemuck::cast_slice(&data),
// layout of the source bitmap
ImageDataLayout {
offset: 0,
bytes_per_row: Some(width * 8),
rows_per_image: Some(height),
},
// size of the source bitmap
Extent3d {
width,
height,
depth_or_array_layers: 1,
},
);
}
_ => {
queue.write_texture(
// destination (sub)texture
ImageCopyTexture {
texture: &self.texture,
mip_level: 0,
origin: Origin3d { x: 0, y: 0, z: 0 },
aspect: TextureAspect::All,
},
// source bitmap data
data,
// layout of the source bitmap
ImageDataLayout {
offset: 0,
bytes_per_row: Some(width * 4),
rows_per_image: Some(height),
},
// size of the source bitmap
Extent3d {
width,
height,
depth_or_array_layers: 1,
},
);
}
}
}
```
I tested this on macOS and was able to successfully render ImGui to an HDR texture.
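For reference, here is a minimal sketch of just the conversion step (assuming the `half` crate, as in the snippet above); it also shows why `bytes_per_row` has to grow from `width * 4` to `width * 8` when targeting `Rgba16Float`:

```rust
/// Convert 8-bit RGBA channels to f16 bit patterns (mirrors the branch above).
fn rgba8_to_rgba16f_bits(data: &[u8]) -> Vec<u16> {
    data.iter()
        .map(|&byte| half::f16::from_f32(byte as f32 / 255.0).to_bits())
        .collect()
}

fn main() {
    let pixel = [0u8, 128, 255, 255]; // one RGBA8 pixel, 4 bytes
    let converted = rgba8_to_rgba16f_bits(&pixel);
    // Same number of channels, but each one is now 2 bytes wide, so a row of
    // `width` pixels takes `width * 8` bytes instead of `width * 4`.
    assert_eq!(converted.len(), 4);
    assert_eq!(std::mem::size_of_val(converted.as_slice()), 8);
    // 0 maps to 0.0 and 255 maps exactly to 1.0.
    assert_eq!(half::f16::from_bits(converted[0]).to_f32(), 0.0);
    assert_eq!(half::f16::from_bits(converted[3]).to_f32(), 1.0);
}
```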
OK, instead of just converting the data, I made the font atlas texture and the surface texture use different formats; this solution is way cleaner:

```diff
diff --git a/src/lib.rs b/src/lib.rs
index 992856c..1757e61 100644
--- a/src/lib.rs
+++ b/src/lib.rs
@@ -277,6 +277,7 @@ impl Texture {
pub struct RendererConfig<'s> {
pub texture_format: TextureFormat,
pub depth_format: Option<TextureFormat>,
+ pub font_atlas_format: Option<TextureFormat>,
pub sample_count: u32,
pub shader: Option<ShaderModuleDescriptor<'s>>,
pub vertex_shader_entry_point: Option<&'s str>,
@@ -289,6 +290,7 @@ impl<'s> RendererConfig<'s> {
RendererConfig {
texture_format: TextureFormat::Rgba8Unorm,
depth_format: None,
+ font_atlas_format: None,
sample_count: 1,
shader: Some(shader),
vertex_shader_entry_point: Some(VS_ENTRY_POINT),
@@ -362,6 +364,7 @@ impl Renderer {
let RendererConfig {
texture_format,
depth_format,
+ font_atlas_format,
sample_count,
shader,
vertex_shader_entry_point,
@@ -505,6 +508,7 @@ impl Renderer {
config: RendererConfig {
texture_format,
depth_format,
+ font_atlas_format,
sample_count,
shader: None,
vertex_shader_entry_point: None,
@@ -785,6 +789,7 @@ impl Renderer {
height: handle.height,
..Default::default()
},
+ format: self.config.font_atlas_format,
..Default::default()
};
```
I have a working version in this fork (which also updates wgpu to 22.1.0): https://github.com/julcst/imgui-wgpu-rs.git
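An untested usage sketch for the patched config (assuming `RendererConfig` still implements `Default` and `Renderer::new` keeps its upstream signature; `imgui_context`, `device`, and `queue` are placeholders): render to an HDR target while keeping the font atlas in a plain 8-bit format.

```rust
// Hypothetical usage of the new `font_atlas_format` field.
let renderer_config = RendererConfig {
    texture_format: wgpu::TextureFormat::Rgba16Float, // HDR render target
    font_atlas_format: Some(wgpu::TextureFormat::Rgba8Unorm), // 8-bit font atlas
    ..Default::default()
};
let mut renderer = Renderer::new(&mut imgui_context, &device, &queue, renderer_config);
```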
I wondered, though: currently the default for `Texture::format` is `renderer.config.texture_format`. Wouldn't it be better if the defaults were simply `wgpu::TextureFormat::Bgra8UnormSrgb` and `vec![wgpu::TextureFormat::Bgra8Unorm]`, because this is what almost everyone will want? The current implementation is:

```rust
/// Create a new GPU texture with the specified `config`.
pub fn new(device: &Device, renderer: &Renderer, config: TextureConfig) -> Self {
// Create the wgpu texture.
let texture = Arc::new(device.create_texture(&TextureDescriptor {
label: config.label,
size: config.size,
mip_level_count: config.mip_level_count,
sample_count: config.sample_count,
dimension: config.dimension,
format: config.format.unwrap_or(renderer.config.texture_format),
usage: config.usage,
view_formats: &[config.format.unwrap_or(renderer.config.texture_format)],
}));
...
}
```
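For comparison, here is a sketch of what those suggested defaults could look like inside `Texture::new` (the `Bgra8UnormSrgb`/`Bgra8Unorm` fallbacks are the proposal, not what the crate currently does):

```rust
// Proposal sketch: fall back to an sRGB texture format with a non-sRGB view
// format when `config.format` is not set; otherwise keep the current behaviour.
let format = config.format.unwrap_or(TextureFormat::Bgra8UnormSrgb);
let view_formats = vec![config.format.unwrap_or(TextureFormat::Bgra8Unorm)];
let texture = Arc::new(device.create_texture(&TextureDescriptor {
    label: config.label,
    size: config.size,
    mip_level_count: config.mip_level_count,
    sample_count: config.sample_count,
    dimension: config.dimension,
    format,
    usage: config.usage,
    view_formats: &view_formats,
}));
```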
Description

When building a `RendererConfig` with `texture_format: wgpu::TextureFormat::Rgba16Float` (to output to HDR), wgpu panics:

The panic occurs in this method:

I am using: