SSRSGJYD / NeuralTexture

Unofficial implementation of the paper "Deferred Neural Rendering: Image Synthesis using Neural Textures" in PyTorch.

Creating UV input data #11

Closed: davidvfx07 closed this issue 2 years ago

davidvfx07 commented 2 years ago

I've looked all over the internet for how to simply render a UV map or "texel-to-pixel mapping" and have found nothing. As mentioned in the README, OpenGL_NeuralTexture is an option, but I am hesitant to use that repo because it's kinda messy, most of it is hardcoded and inflexible/unusable, and I couldn't even compile it on Windows.

Is there another way to get the UV input data?

davidvfx07 commented 2 years ago

I've managed to get UV data from Blender by rendering a UV pass and converting the RG channels to NPY.
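For reference, the RG-to-NPY conversion described above can be sketched as follows. This is an illustrative snippet, not part of the repo: it assumes the UV pass was rendered to a float image in [0, 1] where the red channel encodes u and the green channel encodes v, and the function name is hypothetical.

```python
import numpy as np

def uv_from_pass(img):
    """Keep only the R (u) and G (v) channels of a rendered UV pass.

    img: H x W x C float array in [0, 1], e.g. loaded with
    matplotlib.image.imread. Blue/alpha carry no mapping information,
    so they are dropped.
    """
    return img[..., :2].astype(np.float32)

# Illustrative usage (file names are hypothetical):
# from matplotlib.image import imread
# np.save("0001.npy", uv_from_pass(imread("uv_pass_0001.png")))
```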

AnanthK1998 commented 1 year ago

hi, could you share a script for this? Thanks in advance. email: ananth1544@gmail.com

davidvfx07 commented 1 year ago

The following script should work for converting images to NPY files, though I recommend a fork by DanAmador which allows for the use of PNG files. As for rendering a UV pass, I will refer you to issue #6, where I posted a script for rendering DECA faces to UV maps.

import os
import glob
from numpy import save
from matplotlib.image import imread

in_dir = "Input"
out_dir = "Output"

os.makedirs(out_dir, exist_ok=True)

# Sort the results: glob's order is arbitrary, and the frames must be
# numbered in a stable sequence.
sequence = sorted(glob.glob(os.path.join(in_dir, "*")))

for i, path in enumerate(sequence):
    npy = imread(path)
    save(os.path.join(out_dir, f"{i:04d}.npy"), npy)

AnanthK1998 commented 1 year ago

Hey, thanks. What I'm looking for is: given a single OBJ file of an object, I want to render it from multiple views and generate UV images for those views as well. I am not using a sequence of OBJs. Thanks a lot in advance.
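One way to sketch the multi-view case, purely as an illustration and not part of this repo: given the mesh's vertices, faces, and per-vertex UVs (an OBJ's vt coordinates are really per-face-corner, so this assumes they have been flattened to per-vertex), interpolate the UVs across each triangle with a tiny software rasterizer, once per view matrix. The function name and the identity view matrix below are hypothetical; in practice you would load the OBJ with something like trimesh and use a hardware renderer for speed.

```python
import numpy as np

def rasterize_uv(verts, faces, uvs, mvp, size=(256, 256)):
    """Render a UV map (u, v per pixel) for one view.

    verts: (V, 3) positions, faces: (F, 3) int indices,
    uvs: (V, 2) per-vertex UVs, mvp: (4, 4) matrix to clip space.
    Returns an (H, W, 2) float32 image; background stays zero.
    """
    h, w = size
    img = np.zeros((h, w, 2), np.float32)
    zbuf = np.full((h, w), np.inf, np.float32)
    # Project to normalized device coordinates, then to pixel coordinates.
    p = np.c_[verts, np.ones(len(verts))] @ mvp.T
    p = p[:, :3] / p[:, 3:4]
    xs = (p[:, 0] * 0.5 + 0.5) * (w - 1)
    ys = (1.0 - (p[:, 1] * 0.5 + 0.5)) * (h - 1)
    zs = p[:, 2]
    for f in faces:
        x0, x1, x2 = xs[f]
        y0, y1, y2 = ys[f]
        area = (x1 - x0) * (y2 - y0) - (y1 - y0) * (x2 - x0)
        if abs(area) < 1e-12:  # degenerate triangle
            continue
        # Loop over the triangle's bounding box, clipped to the image.
        xmin = max(int(min(x0, x1, x2)), 0)
        xmax = min(int(max(x0, x1, x2)) + 1, w)
        ymin = max(int(min(y0, y1, y2)), 0)
        ymax = min(int(max(y0, y1, y2)) + 1, h)
        for y in range(ymin, ymax):
            for x in range(xmin, xmax):
                # Barycentric weights via edge functions.
                w0 = ((x1 - x) * (y2 - y) - (y1 - y) * (x2 - x)) / area
                w1 = ((x2 - x) * (y0 - y) - (y2 - y) * (x0 - x)) / area
                w2 = 1.0 - w0 - w1
                if w0 < 0 or w1 < 0 or w2 < 0:  # outside the triangle
                    continue
                z = w0 * zs[f[0]] + w1 * zs[f[1]] + w2 * zs[f[2]]
                if z < zbuf[y, x]:  # depth test: keep the nearest surface
                    zbuf[y, x] = z
                    img[y, x] = (w0 * uvs[f[0]] + w1 * uvs[f[1]]
                                 + w2 * uvs[f[2]])
    return img
```

For multiple views you would call this once per camera with a different mvp matrix and save each result with np.save, matching the NPY format the training script expects.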