Closed ghost closed 3 years ago
Hi MaxC2, thanks for your interest in our work. Yes, you're right: the order in the call is incorrect, but functionally it works the same. I guess the right way would be to swap it before training the model, but to use the pre-trained model, you need to keep the order as it is.
In any case, I am working on a PyTorch version of the same code (since the version of TensorFlow used here is all but defunct now) and will be releasing it soon.
Yes, you're right; I only noticed it while reading the code today. Great news about the PyTorch version!
In `modules_tf.py`, the method signature is:

```python
def full_network(f0, phos, singer_label, is_train):
```

In `model.py`, the method call is:

```python
self.output = modules.full_network(self.phone_onehot_labels, self.f0_placeholder, self.singer_onehot_labels, self.is_train)
```

I think `self.phone_onehot_labels` (Pho_in) and `self.f0_placeholder` (F0_in) need to be swapped in the method call to be consistent with the signature (or the signature order should be changed). There is no problem with the model's functionality: the layers have a similar construction signature and are interchangeable in the concatenated input tensor.
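A minimal sketch of the mismatch, using a hypothetical stand-in for `full_network` (the real one builds and concatenates layer outputs; this one just records which argument lands in which parameter slot). It also shows how passing the arguments by keyword would make the call robust to the parameter order, as an alternative to swapping the positional arguments:

```python
# Hypothetical stand-in for modules_tf.full_network: it only records
# which value ended up bound to which parameter name.
def full_network(f0, phos, singer_label, is_train):
    return {"f0": f0, "phos": phos,
            "singer_label": singer_label, "is_train": is_train}

# Positional call with the order used in model.py: the phone labels
# end up in the f0 slot and vice versa.
swapped = full_network("phone_onehot_labels", "f0_placeholder",
                       "singer_onehot_labels", True)
print(swapped["f0"])    # -> "phone_onehot_labels" (mismatched)

# Keyword arguments sidestep the ordering question entirely.
fixed = full_network(phos="phone_onehot_labels", f0="f0_placeholder",
                     singer_label="singer_onehot_labels", is_train=True)
print(fixed["f0"])      # -> "f0_placeholder" (as intended)
```

As the thread notes, the mismatch is harmless here only because the two inputs feed structurally identical layers before concatenation; in general, a positional swap like this would silently corrupt the model's inputs.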