shunsukesaito / PIFu

This repository contains the code for the paper "PIFu: Pixel-Aligned Implicit Function for High-Resolution Clothed Human Digitization"
https://shunsukesaito.github.io/PIFu/

Input Mesh with Vertex Colors #77

Closed tommycwh closed 3 years ago

tommycwh commented 4 years ago

Hi Shunsuke,

I am trying to use your code with my dataset, which has no UV textures but provides vertex colors with the mesh. From some other issues, I see that you have suggested that others with similar datasets modify your code to read vertex colors as input. However, I have studied your code for a while and still do not have a clear idea of how to make these modifications. May I ask for advice on how to load and render data when only vertex colors are provided?

In fact, I have a more specific question, about the use of UV mapping in your code. In lib/data/TrainDataset.py, I see the following code in the function get_color_sampling(...):

        surface_points = uv_pos[uv_mask]
        surface_colors = uv_render[uv_mask]
        surface_normal = uv_normal[uv_mask]

        if self.num_sample_color:
            sample_list = random.sample(range(0, surface_points.shape[0] - 1), self.num_sample_color)
            surface_points = surface_points[sample_list].T
            surface_colors = surface_colors[sample_list].T
            surface_normal = surface_normal[sample_list].T

        # Samples are around the true surface with an offset
        normal = torch.Tensor(surface_normal).float()
        samples = torch.Tensor(surface_points).float() \
                  + torch.normal(mean=torch.zeros((1, normal.size(1))), std=self.opt.sigma).expand_as(normal) * normal

        # Normalized to [-1, 1]
        rgbs_color = 2.0 * torch.Tensor(surface_colors).float() - 1.0

I see that surface_colors is sampled from uv_render, and I tracked uv_render (prepared in apps/render_data.py) down to the colors stored in the UV map. Does this mean that, if I have vertex colors, I can prepare my own surface colors for training, e.g. by interpolating the vertex colors or by some other method, instead of building surface_colors through the UV mapping? Or would some other part of your code be helpful in this situation?
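For what it's worth, here is a minimal sketch of the interpolation idea: sampling points uniformly over a vertex-colored mesh and interpolating color and normal barycentrically, so the outputs play the same role as surface_points / surface_colors / surface_normal above. The function name and signature are hypothetical, not part of the PIFu codebase; colors are assumed to already be floats in [0, 1].

```python
import numpy as np

def sample_colored_surface(vertices, faces, vertex_colors, vertex_normals,
                           num_samples, sigma=0.01):
    """Sample points near a vertex-colored mesh surface, interpolating
    color and normal barycentrically (a sketch of what could replace the
    UV-render based sampling; names are hypothetical)."""
    tri = vertices[faces]                                   # (F, 3, 3)
    # area-weighted face selection so sampling is uniform over the surface
    areas = 0.5 * np.linalg.norm(
        np.cross(tri[:, 1] - tri[:, 0], tri[:, 2] - tri[:, 0]), axis=1)
    face_idx = np.random.choice(len(faces), num_samples, p=areas / areas.sum())
    # uniform barycentric coordinates via the square-root trick
    r1, r2 = np.random.rand(2, num_samples)
    s = np.sqrt(r1)
    bary = np.stack([1 - s, s * (1 - r2), s * r2], axis=1)  # (N, 3)
    f = faces[face_idx]                                     # (N, 3) vertex ids
    interp = lambda attr: (bary[:, :, None] * attr[f]).sum(axis=1)
    points = interp(vertices)
    colors = interp(vertex_colors)                          # assumed in [0, 1]
    normals = interp(vertex_normals)
    normals = normals / np.linalg.norm(normals, axis=1, keepdims=True)
    # offset along the normal, mirroring the torch.normal(...) * normal step
    samples = points + np.random.normal(scale=sigma,
                                        size=(num_samples, 1)) * normals
    rgbs = 2.0 * colors - 1.0                               # normalize to [-1, 1]
    return samples, rgbs, normals
```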

Thank you very much for your amazing work, Shunsuke!

shunsukesaito commented 3 years ago

You have to modify the following code to change the inputs to the OpenGL renderer. For training the color module, yes, you can use per-vertex color, position, and surface normal instead. Unless your per-vertex colors are too sparse, you probably don't need additional barycentric interpolation.

https://github.com/shunsukesaito/PIFu/blob/master/lib/renderer/gl/prt_render.py
https://github.com/shunsukesaito/PIFu/blob/master/lib/renderer/gl/data/prt.vs
https://github.com/shunsukesaito/PIFu/blob/master/lib/renderer/gl/data/prt.fs
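A minimal sketch of the no-interpolation route suggested above: when the mesh is densely colored, the vertices themselves can serve as the surface samples, with their stored positions, colors, and normals. The function name and signature are hypothetical, not PIFu API; colors are assumed to be floats in [0, 1].

```python
import numpy as np

def sample_from_vertices(vertices, vertex_colors, vertex_normals,
                         num_sample_color, sigma=0.01):
    """Build color-training samples directly from per-vertex attributes,
    skipping barycentric interpolation (hypothetical helper)."""
    idx = np.random.choice(len(vertices), num_sample_color, replace=False)
    points = vertices[idx]                                  # (N, 3)
    normals = vertex_normals[idx]
    normals = normals / np.linalg.norm(normals, axis=1, keepdims=True)
    # jitter along the normal, like the torch.normal(...) * normal offset
    samples = points + np.random.normal(scale=sigma,
                                        size=(num_sample_color, 1)) * normals
    rgbs = 2.0 * vertex_colors[idx] - 1.0                   # [0, 1] -> [-1, 1]
    return samples, rgbs, normals
```

This only stays unbiased if the vertices cover the surface roughly uniformly; for very uneven tessellations, area-weighted face sampling with interpolation would be the safer choice.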