Closed by ghost 1 year ago
Running with MPS
I can take a look tomorrow. One way to help: you can use `DynamicGraph.logLevel = .verbose`
to see the output of each layer. This will help you locate at which layer the divergence happens.
Also, it might be that your input is too irregular and some internal normalization layer is unhappy with it (try `x.randn()` rather than `x.full(1)`).
Also, you can use `debugPrint(out)` to pretty-print the tensor to the terminal.
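Putting the two suggestions above together, a minimal sketch of the debugging setup (the input shape and the `model` being run are placeholders for your own setup, not from this thread):

```swift
import NNC

// Enable per-layer logging so you can see at which layer the outputs
// start to diverge between runs.
DynamicGraph.logLevel = .verbose

let graph = DynamicGraph()
// Hypothetical shape; substitute your own. Using randn() instead of
// full(1) avoids feeding a degenerate constant input to any internal
// normalization layers.
let x = graph.variable(.CPU, .NHWC(1, 8, 8, 3), of: Float32.self)
x.randn()

// `model` stands in for the model under test.
let out = model(inputs: x)[0].as(of: Float32.self)
debugPrint(out)  // pretty-prints the tensor to the terminal
```

Running this several times and diffing the verbose logs should narrow down the first layer whose output differs.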
Thanks a lot! I tried random input as well; it's still non-deterministic.
Also, this bug is in the NCHW branch.
You surely meant NHWC? Otherwise the shape doesn't make sense.
Sorry, I meant NHWC, but I can reproduce this on master as well.
You can repro using this patch.
Run the binary 5-6 times and you will see the inconsistency.
Any idea what could be causing this issue?
Run the process 5-6 times; you should see at least one run where the output is something like