Hacetate opened this issue 3 weeks ago
Which `.data` file are you trying to load? Its size of 10000 seems a bit weird to me.
The raw `.data` files of the USC-HairSalon dataset all have a strand count of around 10,000.
Oh I see, so this is the original data you downloaded from their website, right? That data contains some irregular strands which have only a single point. You can download our processed data from this OneDrive link; we filtered out those strands and aligned the hair to our head mesh.
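For anyone working with the raw files directly, here is a minimal sketch of that filtering step. It assumes the usual binary layout (an int32 strand count, then per strand an int32 point count followed by that many float32 x/y/z triples); `load_usc_data` is just an illustrative name, not a function from this repo:

```python
import struct

import numpy as np

def load_usc_data(path):
    """Read a raw USC-HairSalon .data file and drop one-point strands.

    Assumed binary layout: int32 strand count, then for each strand an
    int32 point count followed by that many float32 (x, y, z) triples.
    """
    strands = []
    with open(path, "rb") as f:
        num_strands = struct.unpack("i", f.read(4))[0]
        for _ in range(num_strands):
            num_points = struct.unpack("i", f.read(4))[0]
            pts = np.frombuffer(f.read(12 * num_points), dtype=np.float32)
            # Skip the irregular strands that contain only a single point.
            if num_points > 1:
                strands.append(pts.reshape(num_points, 3))
    return strands  # still a ragged Python list, not one big array
```

Note that the result is still a ragged list, since strands can have different point counts, so it cannot be stacked into a single array without resampling or padding.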
Thanks for the resampled original data. Could you please provide the code for preprocessing the irregular data? I found that the original USC-HairSalon dataset is all 10000x300. However, the HairNet-interpolated data contains a large number of strands with more than 300 points, while others range from 100 to 200. To obtain a better training dataset, it should not be enough to simply truncate strands above 300 points and delete strands with fewer than 300.
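One possible alternative to truncating and deleting is to resample every strand to exactly 300 points by interpolating over its normalized arc length. A minimal sketch, assuming each strand is a plain (N, 3) array; `resample_strand` is a hypothetical helper, not code from this repo:

```python
import numpy as np

def resample_strand(strand, num_points=300):
    """Resample one (N, 3) strand to a fixed number of points.

    Each coordinate is interpolated over normalized cumulative arc length,
    so long strands are not truncated and short ones are not discarded.
    """
    strand = np.asarray(strand, dtype=np.float32)
    seg = np.linalg.norm(np.diff(strand, axis=0), axis=1)
    t = np.concatenate([[0.0], np.cumsum(seg)])
    if t[-1] > 0:
        t = t / t[-1]
    else:  # degenerate strand with zero length
        t = np.linspace(0.0, 1.0, len(strand))
    t_new = np.linspace(0.0, 1.0, num_points)
    return np.stack([np.interp(t_new, t, strand[:, k]) for k in range(3)], axis=1)
```

After resampling, `np.stack([resample_strand(s) for s in strands])` gives a regular (num_strands, 300, 3) array.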
When I load the `.data` using `io.py`, I encountered this error: `strands = np.array(strands).reshape((num_strands, -1, 3))` raises `ValueError: cannot reshape array of size 10000 into shape (10000,newaxis,3)`. The reason is that the file contains irregular strands, so the `strands` list is ragged and NumPy cannot convert it into a reshapable array. So should I zero-pad the strands, or do something else?
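If you do want to zero-pad rather than filter or resample, a sketch along these lines makes the ragged list stackable (again a hypothetical helper, not code from this repo), though resampling to a fixed point count is probably the cleaner fix, since padded zeros are not real geometry:

```python
import numpy as np

def pad_strands(strands, max_points=300):
    """Zero-pad (or truncate) a ragged list of strands into one array.

    Returns an array of shape (num_strands, max_points, 3); strands shorter
    than max_points are padded with zero rows, longer ones are truncated.
    """
    out = np.zeros((len(strands), max_points, 3), dtype=np.float32)
    for i, s in enumerate(strands):
        s = np.asarray(s, dtype=np.float32).reshape(-1, 3)
        n = min(len(s), max_points)
        out[i, :n] = s[:n]
    return out
```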