Closed · root-zheng closed this 4 years ago
I also have the same problem: the demo cannot separate even a little. I used the TIMIT training set.
May I ask what you did to solve this problem? I also have the same problem; the demo cannot separate even a little. I used the TIMIT training set.
Thanks very much for your code! It really helped me a lot. But when I use my own dataset, I find that the batch loss does not decrease, and the demo cannot separate even a little. I hope you can reply. Thanks very much for your time!
I remember I rewrote the whole model, especially the loss function. The loss function in this code is complicated, and it is hard to check it for mistakes.
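For what it's worth, the objective in the original DaNet paper is fairly compact, so it may be easier to re-derive it from the paper than to debug this repo's version. Below is a minimal sketch of that loss as I read the paper; the shapes and names are my own assumptions, not code from this repo:

```python
import tensorflow as tf

def danet_loss(embed, ideal_mask, mix_mag, src_mag, eps=1e-8):
    """A sketch of the DaNet training objective (Chen et al., 2017).
    embed:      [B, TF, D] embeddings from the network
    ideal_mask: [B, TF, C] ideal binary masks built from the references
    mix_mag:    [B, TF]    mixture magnitude spectrogram
    src_mag:    [B, TF, C] per-source magnitude spectrograms"""
    # Attractors: mask-weighted mean embedding per source -> [B, C, D]
    num = tf.matmul(ideal_mask, embed, transpose_a=True)
    den = tf.reduce_sum(ideal_mask, axis=1)[:, :, tf.newaxis]  # [B, C, 1]
    attractors = num / (den + eps)
    # Estimated masks: similarity of each T-F bin to each attractor,
    # normalized across sources with a softmax -> [B, TF, C]
    logits = tf.matmul(embed, attractors, transpose_b=True)
    est_mask = tf.nn.softmax(logits, axis=-1)
    # Squared error between the masked mixture and the reference sources
    est_src = mix_mag[:, :, tf.newaxis] * est_mask
    return tf.reduce_mean(tf.square(est_src - src_mag))
```

Comparing this repo's loss against a simple reference like this on a single toy batch is a quick way to localize where the two disagree.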
Thank you for your reply! I am new to deep learning, so rewriting the model is difficult for me. Could you please show me your code? Thank you very much! And how were the results of your rewritten code?
Sorry, I have signed a confidentiality agreement, so I cannot send the code to you. And the results of my code are also not very good.
Thank you again!