Closed mz24cn closed 7 years ago
Sorry, I am unable to resolve conflicts.
Hi @mz24cn, you need to merge locally in order to resolve the conflict. (BTW, the pull request is updated automatically when you push new commits, so you don't need to create a new pull request if you modify the code.)
https://github.com/dmlc/mxnet/pull/3485 breaks all examples in MXNet.CPP except mlp: alexnet.cpp, googlenet.cpp, charRNN.cpp, and others are INVALID now. MXOptimizerFree, MXOptimizerFindCreator, and two other C APIs were removed. So how should the changes be ported to MXNet.CPP to keep backward compatibility?
@lx75249 @piiswrong
@mz24cn I'm porting Optimizer to mxnet.cpp, I guess it will be done in 2 days.
It's not only that Optimizer was removed: operators were renamed too, and "_Plus" disappeared. NNVM breaks a lot of production code and introduces an unclear future. MXNet.cpp needs a thorough investigation of NNVM's impact, if it is not to be abandoned. @lx75249 @piiswrong
The AppVeyor error message indicates the libmxnet.lib file should be upgraded.
Hi @lx75249, I have done the Windows verification work. Besides the "#include
Now charRNN.cpp runs properly on Windows and Linux. Please consider merging this pull request. Thank you.
It seems the encoding and line endings of many files changed. Please clean them up.
Two problems remain: 1. Ugly warnings occur. 2. Performance is slightly degraded, from 240s/epoch to 260s/epoch.
Hi @lx75249, I cleaned up the files and made data default to kNullOp. Now only 9 files are updated.
Okay, it's done.
There are still lint errors; please fix them by checking https://travis-ci.org/dmlc/MXNet.cpp/jobs/191323175#L465
Now only the charRNN.cpp tabs issue is left, which you said is allowed. Thanks.
The lint check still complains (https://travis-ci.org/dmlc/MXNet.cpp/jobs/191593609#L898) about things like overlong lines and brace issues. I think you could also change tabs to spaces while you're at it, so that I don't need to add special configs for your example. (You can see the checks at the end of this page after pushing changes; once all the checks succeed, I will merge this PR.)
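Converting tabs to spaces can be done with the POSIX `expand` utility; a minimal sketch (the file name and the 2-space tab stop are assumptions, matching the indentation the lint check appears to expect):

```shell
# Create a demo source file containing a tab, then replace tabs with
# 2-space stops using expand (POSIX) and verify none remain.
printf 'int main() {\n\treturn 0;\n}\n' > demo.cpp
expand -t 2 demo.cpp > demo.fixed.cpp
grep -c "$(printf '\t')" demo.fixed.cpp || true   # counts remaining tabs
```

The temporary-output-file pattern matters: redirecting `expand` back onto its own input would truncate the file before it is read.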
you see?
An official team member doesn't agree with making data default to kNullOp (https://github.com/dmlc/MXNet.cpp/pull/54), so I have to change: Executor exe = RNN.SimpleBind(device, args_map); to: Executor exe = RNN.SimpleBind(device, args_map, {}, {{"data", kNullOp}});
So SimpleBind cannot be so simple for the majority of ML tasks.