zengyh1900 / 3D-Human-Body-Shape

[ICIMCS'2017] Official Code for 3D Human Body Reshaping with Anthropometric Modeling
https://doi.org/10.1007/978-981-10-8530-7_10
MIT License

How to generate the "normals.npy" #13

Closed xqyd closed 5 years ago

xqyd commented 5 years ago

Hi, great work! After playing with the demo, I want to train on several SPRING models for a better understanding. I also realized that generating most of the *.npy files requires manually extending the obj2npy function with calls to several pre-defined functions.

However, for instance, if I have 10 SPRING models in the obj/female folder, which one should I use to compute the normals? The vertex array in obj2npy has shape (len(file_list), V_NUM, 3), but the one required by compute_normals(vertex, facet) appears to have shape (V_NUM, 3).

Here's the code for the revised obj2npy:

```python
def obj2npy(label="female"):
    print(' [**] begin obj2npy about %s... ' % label)
    start = time.time()
    OBJ_DIR = os.path.join(DATA_DIR, "obj")
    obj_file_dir = os.path.join(OBJ_DIR, label)
    file_list = os.listdir(obj_file_dir)

    # load original data
    vertex = []
    for i, obj in enumerate(file_list):
        sys.stdout.write('\r>> Converting %s body %d' % (label, i))
        sys.stdout.flush()
        f = open(os.path.join(obj_file_dir, obj), 'r')
        j = 0
        for line in f:
            if line[0] == '#':
                continue
            elif "v " in line:
                line = line.replace('\n', ' ')
                tmp = list(map(float, line[1:].split()))
                vertex.append(tmp)
                j += 1
            else:
                break

    facet = convert_template()

    vertex = np.array(vertex).reshape(len(file_list), V_NUM, 3)

    # compute per-vertex normals from the first model and save them
    normals = compute_normals(vertex[0], facet)
    np.save(open(os.path.join(DATA_DIR, "normals.npy"), "wb"), normals)
    print("finish compute normals")

    # center each model at its own mean
    for i in range(len(file_list)):
        v_mean = np.mean(vertex[i, :, :], axis=0)
        vertex[i, :, :] -= v_mean
    mean_vertex = np.array(vertex.mean(axis=0)).reshape(V_NUM, 3)
    std_vertex = np.array(vertex.std(axis=0)).reshape(V_NUM, 3)

    facet = np.load(open(os.path.join(DATA_DIR, "facet.npy"), "rb"))
    # load deform-based data
    [d_inv_mean, deform, mean_deform, std_deform] = load_d_data(vertex, facet, label)
    # calculate deform-based representation (PCA)
    [d_basis, d_coeff, d_pca_mean, d_pca_std] = get_d_basis(deform, label)  # %s_d_coeff.npy

    np.save(open(os.path.join(DATA_DIR, "%s_vertex.npy" % label), "wb"), vertex)
    np.save(open(os.path.join(DATA_DIR, "%s_mean_vertex.npy" % label), "wb"), mean_vertex)
    np.save(open(os.path.join(DATA_DIR, "%s_std_vertex.npy" % label), "wb"), std_vertex)

    # adding extra stuff ...
    cp = convert_cp()
    [measure, mean_measure, std_measure, t_measure] = convert_measure(cp, vertex, facet, label)  # generates %s_measure.npy

    # calculate global mapping from measure to deformation PCA coeff
    get_m2d(d_coeff, t_measure, label)  # %s_m2d.npy

    # construct the related matrix A to change deformation into vertex
    get_d2v_matrix(d_inv_mean, facet, label)  # %s_d2v.npz

    # mask = np.load(open(os.path.join(DATA_DIR, "mask.npy"), "rb"))
    # get color dict & mask
    [PART, mask] = get_map(facet)
    # local map matrix: measure -> deform
    local_matrix(mask, deform, measure, label)  # %s_local.npy

    print('\n [**] finish obj2npy in %fs.' % (time.time() - start))
    return [vertex, mean_vertex, std_vertex, file_list]
```

However, the normals.npy generated this way causes an error in reshaper.py at line 182 (return [x, -self.normals, self.facets - 1]): TypeError: bad operand type for unary -: 'map'
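My guess is that compute_normals returns a lazy map object under Python 3, which ends up inside the saved .npy and then breaks on the unary minus. A minimal sketch of materializing it as an ndarray before saving (same compute_normals, vertex, facet, and DATA_DIR as in the code above; the dtype is an assumption):

```python
import os
import numpy as np

# In Python 3, map(...) is an iterator, and "-normals" on such an object
# raises "bad operand type for unary -: 'map'". Converting the result to a
# plain ndarray before saving avoids that (a sketch, not the repo's code).
normals = np.array(list(compute_normals(vertex[0], facet)), dtype=float)
np.save(open(os.path.join(DATA_DIR, "normals.npy"), "wb"), normals)
```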

If I use your original normals.npy, I can get past this and generate 100 new models with the code at line 217. However, even when using your normals.npy and replacing the rest of the *.npy files with my own training results, demo.py does not produce a correct result (see the attached screenshot).

As you can see, I am getting a meatball-like mesh when setting the female body to weight == 65 kg and height == 166 cm.

Any idea how to fix this? Tons of thanks in advance~~~!!!

xqyd commented 5 years ago

By "line 182 and line 217", I mean the reshaper.py

xqyd commented 5 years ago

Well, after clearing my mind: the array in the original normals.npy actually has shape (37500,), which means the value returned by compute_normals(...) has to be reshaped accordingly. I tried mean_vertex as the input, and the generated normals.npy can then be loaded by reshaper.py successfully.
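For reference, a minimal sketch of the version that worked for me (forcing the result into an ndarray and flattening it are my additions to match the (37500,) shape of the original file; compute_normals, mean_vertex, facet, and DATA_DIR are the names used above):

```python
import os
import numpy as np

# Compute normals from the mean mesh instead of a single training model,
# materialize the result as an ndarray (compute_normals may return a map
# object in Python 3), and flatten it so the saved array has shape
# (V_NUM * 3,) = (37500,), like the original normals.npy.
normals = np.array(list(compute_normals(mean_vertex, facet)), dtype=float).reshape(-1)
np.save(open(os.path.join(DATA_DIR, "normals.npy"), "wb"), normals)
```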

As for the meatball problem: it was simply because the training data was too sparse. Once I increased the number of models to 100, demo.py delivered correct results as depicted in the original paper.

Again, great work!