Riga2 / NSRD

[CVPR 2024] Neural Super-Resolution for Real-time Rendering with Radiance Demodulation
MIT License

Obtaining and saving the data #6

Open Sodiride123 opened 4 months ago

Sodiride123 commented 4 months ago

Hello Riga2,

I am quite interested in your model and want to reproduce it. I have a few questions about obtaining the dataset.

Question: I have downloaded your dataset and looked through it. For the training set, how did you obtain and save data such as the G-buffers (depth, normal) and motion vectors locally? Did you write custom shaders, or use an extension in UE to acquire this data?

Riga2 commented 4 months ago

Hi, thank you for your interest!

We wrote custom shaders and scripts to generate the data in Unity Engine.
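The capture code itself is not shown in this thread, but the saving side of such a pipeline can be sketched as follows, assuming each buffer has been read back from the GPU into a NumPy array and is dumped once per frame (the file layout, names, and shapes here are illustrative assumptions, not the authors' actual format):

```python
import numpy as np

def save_gbuffer_frame(path, depth, normal, motion):
    """Save one frame of G-buffer data (depth, normal, motion vectors).

    depth:  (H, W)    float32, linear view-space depth
    normal: (H, W, 3) float32, world-space normals in [-1, 1]
    motion: (H, W, 2) float32, screen-space motion vectors in pixels
    """
    np.savez_compressed(path, depth=depth, normal=normal, motion=motion)

# Illustrative 4x4 frame with constant depth and +Z normals
H = W = 4
depth = np.full((H, W), 10.0, dtype=np.float32)
normal = np.zeros((H, W, 3), dtype=np.float32)
normal[..., 2] = 1.0
motion = np.zeros((H, W, 2), dtype=np.float32)

save_gbuffer_frame("frame_0000.npz", depth, normal, motion)
loaded = np.load("frame_0000.npz")
```

In Unity, the buffers themselves would typically come from a replacement shader or from the camera's built-in depth and motion-vector textures before being read back like this.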

Sodiride123 commented 4 months ago

Thanks for your reply. I have one more question about the data-acquisition stage. I understand that the images are rendered at low resolution. What about the material components? 1) I saw that you have both LR and HR material components. Did you obtain the LR material components by downsampling the HR ones, or by calculating LR auxiliary buffers separately? 2) Did you obtain the HR material components by calculating HR auxiliary buffers?

Riga2 commented 4 months ago

Sorry for the late reply. 1) We calculated the LR auxiliary buffers separately. 2) Yes, we used the BRDF pre-computation described in the supplementary file to obtain the HR material components.
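The role these material components play can be illustrated with a toy demodulation example. This is a simplified, diffuse-only sketch of the radiance-demodulation idea in the paper's title, not the authors' full BRDF pre-computation: the lighting signal is obtained by dividing radiance by the material component, super-resolved, and the material is multiplied back in afterwards.

```python
import numpy as np

EPS = 1e-4  # guard against division by near-zero material values

def demodulate(radiance, material):
    # Lighting = radiance / material; the lighting signal is smoother
    # than radiance and therefore easier to super-resolve.
    return radiance / np.maximum(material, EPS)

def remodulate(lighting, material):
    # Reconstruct radiance by multiplying the material component back in.
    return lighting * material

material = np.array([0.8, 0.5, 0.2], dtype=np.float32)  # e.g. diffuse albedo
radiance = np.array([0.4, 0.25, 0.1], dtype=np.float32)

lighting = demodulate(radiance, material)  # constant here: uniform lighting
recon = remodulate(lighting, material)     # recovers the original radiance
```

In the actual pipeline the super-resolution network would operate on `lighting` at low resolution, and `remodulate` would use the HR material component.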

Sodiride123 commented 4 months ago

Thanks for your reply. So you obtained both LR and HR G-buffers. I understand the logic of doing this, but won't it consume quite a lot of bandwidth and computing power? Is that acceptable in real-time rendering?

Riga2 commented 3 months ago

Hi, compared with directly rendering full-resolution images, our method only adds bandwidth consumption for the LR G-buffers. Since our method primarily targets PC, the increased bandwidth consumption was acceptable in our experiments. However, our method may not be suitable for mobile devices, which are bandwidth-sensitive.
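For a rough sense of scale, here is a back-of-envelope estimate of the extra traffic the LR G-buffers add. The resolution and texel formats below are illustrative assumptions, not figures from the paper; on a desktop GPU with hundreds of GB/s of memory bandwidth, this is a small fraction of the budget:

```python
# Back-of-envelope estimate of the extra per-frame bandwidth for the LR
# G-buffers. All formats and sizes below are illustrative assumptions.
lr_w, lr_h = 1920, 1080  # assumed low-resolution render target

depth_bytes = 4    # 1x float32
normal_bytes = 6   # 3x float16
motion_bytes = 4   # 2x float16
bytes_per_pixel = depth_bytes + normal_bytes + motion_bytes  # 14 B

per_frame = lr_w * lr_h * bytes_per_pixel  # bytes written per frame
per_second = per_frame * 60                # sustained rate at 60 fps

print(f"extra G-buffer traffic: {per_frame / 2**20:.1f} MiB/frame, "
      f"{per_second / 10**9:.2f} GB/s at 60 fps")
```

Under these assumptions the LR G-buffers cost roughly 28 MiB per frame, i.e. under 2 GB/s at 60 fps, which supports the point that the overhead is tolerable on PC but significant on bandwidth-limited mobile hardware.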

Sodiride123 commented 3 months ago

I've been thinking about a question; sorry for the frequent questions, I'm new to this field. When using either UE or Unity, we need to obtain both high-resolution and low-resolution G-buffer information for the scene, which means adjusting the export resolution settings is compulsory. How can this be achieved during real-time rendering?