tijiang13 / InstantAvatar


textured mesh export #39

Closed: artprosvetov closed this issue 11 months ago

artprosvetov commented 11 months ago

Thank you for your project! Currently I am trying to export a textured mesh (see https://github.com/tijiang13/InstantAvatar/issues/15, https://github.com/tijiang13/InstantAvatar/issues/5). Is there an easy way to obtain the texture of the extracted mesh?

Thank you in advance for the clarification!

artprosvetov commented 11 months ago

I finally managed to extract the mesh, but there are many holes in it. Can you suggest a way to obtain a smoother mesh? [image attachment]

tijiang13 commented 11 months ago

Hi artprosvetov,

This is more or less expected, as NeRF is not renowned for great geometry. The main problem is that the level set for the surface is not clearly defined in the context of marching cubes. If a mesh with a smoother surface is desired, I'd suggest switching from NeRF to an SDF representation, as we did in V2A.
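
For intuition, with an SDF the surface is the zero level set by construction, so marching cubes has a well-defined target. A toy NumPy/skimage sketch (not code from this repo):

import numpy as np
from skimage import measure

# Analytic SDF of a unit sphere sampled on a 128^3 grid covering [-1.5, 1.5]^3
res = 128
grid = np.linspace(-1.5, 1.5, res)
x, y, z = np.meshgrid(grid, grid, grid, indexing="ij")
sdf = np.sqrt(x**2 + y**2 + z**2) - 1.0

# For an SDF the surface is exactly the zero level set -- no threshold tuning needed.
# The SDF is negative inside the object, hence gradient_direction="ascent".
verts, faces, normals, _ = measure.marching_cubes(sdf, 0.0, gradient_direction="ascent")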

Best, Tianjian

herybala commented 11 months ago

How did you extract it?

herybala commented 11 months ago

@artprosvetov Can you share how you achieved it? Thanks!

artprosvetov commented 11 months ago

The code is as follows:

from hydra import initialize, compose
from omegaconf import OmegaConf
from skimage import measure
import trimesh
from tqdm import tqdm
import torch
import hydra

# Build the model from the demo config and load the trained checkpoint
with initialize(config_path="confs"):
    cfg = compose(config_name='demo.yaml')

datamodule = hydra.utils.instantiate(cfg.dataset, _recursive_=False)
model = hydra.utils.instantiate(cfg.model, datamodule=datamodule, _recursive_=False)
model.eval()

checkpoint = torch.load('outputs/peoplesnapshot/demo/male-3-casual/checkpoints/last.ckpt')
model.load_state_dict(checkpoint["state_dict"])

resolution = 256
threshold = 42
gradient_direction = "ascent"

with torch.no_grad():
    # Dense resolution^3 grid of query points
    idx = torch.arange(0, resolution)
    coords = torch.meshgrid((idx, idx, idx), indexing="ij")
    coords = torch.stack(coords, dim=-1).cuda()
    coords = coords.reshape(-1, 3) / resolution

    bbox = [0, resolution]
    coords = coords * (bbox[1] - bbox[0]) + bbox[0]

    # Query the density (first channel of the encoder output) in chunks
    val = []
    for b in tqdm(coords.split(2**20)):
        val.append(model.net_coarse.encoder(b.cuda() / 256.0)[..., 0])

    val = torch.cat(val, dim=0)
    val = val.reshape(resolution, resolution, resolution)
    val = val.cpu().detach().numpy()

    # Extract the iso-surface of the density grid at the chosen threshold
    verts, faces, normals, _ = measure.marching_cubes(
        val.transpose(1, 0, 2), threshold, gradient_direction=gradient_direction)

    # Keep only the connected component with the largest surface area
    mesh = trimesh.Trimesh(verts, faces, vertex_normals=normals)
    connected_comp = mesh.split(only_watertight=False)
    max_area = 0
    max_comp = None
    for comp in connected_comp:
        if comp.area > max_area:
            max_area = comp.area
            max_comp = comp

    max_comp.export('export_mesh.ply')

In some cases the largest extracted component is just noise (a noisy watertight artefact with a large surface area). In that case we can simply take the next-largest mesh in connected_comp:

import numpy as np

# Sort components by surface area (ascending) and export the second-largest one
area_list = [comp.area for comp in connected_comp]
order = np.argsort(area_list)
connected_comp[order[-2]].export('export_mesh.ply')
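
It can also help to export the few largest components and inspect them before deciding which one to keep, for example:

# Export the few largest components (by surface area) for visual inspection
components = sorted(connected_comp, key=lambda m: m.area, reverse=True)
for i, comp in enumerate(components[:5]):
    comp.export(f"component_{i}.ply")
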
tijiang13 commented 11 months ago

> Thank you for your response. I have one last question: does the bbox parameter affect the results? With some data I only get one leg as the mesh file.

Hi Herybala,

As far as I can see, the main reason is that in artprosvetov's script the threshold for marching cubes is set to 42, which is usually far too high. A number between 0.1 and 1 is probably a better choice in general.
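
If you want to see the effect, you can sweep a few candidate level sets over the density grid (the val array in the script above) and compare the extracted meshes:

from skimage import measure
import trimesh

# Sweep a few candidate level sets over the density grid from the script above.
# Note: skimage raises a ValueError if a level lies outside the grid's value range.
for level in [0.1, 0.5, 1.0]:
    verts, faces, _, _ = measure.marching_cubes(val, level, gradient_direction="ascent")
    trimesh.Trimesh(verts, faces).export(f"mesh_level_{level}.ply")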

Best, Tianjian

tijiang13 commented 11 months ago

As mentioned earlier, the level set of a NeRF lacks a well-defined surface, and marching cubes is THE WRONG WAY to extract a mesh from a NeRF model. The reason is obvious: marching cubes ignores any region where density != threshold, disregarding the positive density contributions that are essential for the final volumetric rendering.
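
To make this concrete, here is a toy example in plain NumPy (not code from this repo): a ray whose samples never reach a given level set can still render as almost opaque.

import numpy as np

# Densities sampled along one camera ray: every sample is well below a
# (hypothetical) level set of 1.0, so marching cubes would treat this region as empty...
sigma = np.full(256, 0.5)
delta = 0.01                                  # step size between samples

# ...yet standard volume rendering still accumulates substantial opacity:
alpha = 1.0 - np.exp(-sigma * delta)          # per-sample opacity
T = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))   # transmittance
weights = T * alpha
print(weights.sum())                          # ~0.72, i.e. the ray is mostly opaque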

If you still insist on using marching cubes despite knowing it's not the correct approach, here is a snippet for that purpose:

import glob
import os
import torch
import pytorch_lightning as pl
import hydra
from instant_avatar.utils.marching_cubes import marching_cubes
from instant_avatar.deformers.snarf_deformer import SNARFDeformer

@hydra.main(config_path="./confs", config_name="SNARF_NGP")
def main(opt):
    pl.seed_everything(opt.seed)
    torch.set_printoptions(precision=6)
    print(f"Switch to {os.getcwd()}")

    datamodule = hydra.utils.instantiate(opt.dataset, _recursive_=False)
    model = hydra.utils.instantiate(opt.model, datamodule=datamodule, _recursive_=False)
    model = model.cuda()
    model.eval()

    # Resume from the latest checkpoint in the experiment folder
    checkpoints = sorted(glob.glob("checkpoints/*.ckpt"))
    print("Resume from", checkpoints[-1])
    checkpoint = torch.load(checkpoints[-1])
    model.load_state_dict(checkpoint["state_dict"])
    model.eval()

    # Build the query bounding box from the radiance field's centre and scale
    c = model.net_coarse.center.clone()
    s = model.net_coarse.scale.clone()
    if isinstance(model.deformer, SNARFDeformer):
        s[2] /= 4
    bbox = torch.stack([c - s / 2, c + s / 2], dim=0)

    # Run marching cubes on the density channel of the radiance field
    mesh = marching_cubes(
        lambda x: model.net_coarse(x, None)[-1],
        gradient_direction="ascent",
        bbox=bbox,
        level_set=0.01,
        resolution=512,
    )

    # Query the network again at the mesh vertices to get per-vertex colours
    vertices = mesh.vertices
    vertices = torch.from_numpy(vertices).cuda()
    color = model.net_coarse(vertices, None)[0]
    mesh.visual.vertex_colors = color.detach().cpu().numpy()[..., ::-1]  # reverse channel order
    mesh.export("mesh.ply")

if __name__ == "__main__":
    main()
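
If you need an OBJ instead of a PLY, trimesh can convert the exported file; keep in mind that per-vertex colour is a non-standard extension of the OBJ format, so whether the colours show up depends on the trimesh version and the viewer:

import trimesh

# Convert the vertex-coloured PLY written by the script above to OBJ.
mesh = trimesh.load("mesh.ply", process=False)
mesh.export("mesh.obj")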

@herybala @artprosvetov

artprosvetov commented 11 months ago

Thank you for the code! Unfortunately I can't reproduce your result: my colours are black. What could be the reason? [image attachment]

tijiang13 commented 11 months ago

> Thank you for the code! Unfortunately I can't reproduce your result: my colours are black. What could be the reason?

You need to pull the new commits, as there are some changes in the marching cubes implementation.

Best, Tianjian

artprosvetov commented 11 months ago

Thank you!

MilesTheProwler commented 11 months ago

Hi, can you show me an example of the final OBJ file?