Loss evaluated with a separate function:
let lossFn = mi.Func<single> (loss) |> arg2 input target
differs from the loss evaluated during training:
let trainFn () = Train.train trainable fullDataset trainCfg
When this option is set:
SymTensor.Debug.DisableCombineIntoElementsOptimization <- true
the results are consistent again!
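For reference, here is the workaround combined into a single sketch (not a self-contained repro; `mi`, `loss`, `input`, `target`, `trainable`, `fullDataset` and `trainCfg` are the identifiers from the affected model setup quoted above):

```fsharp
// Workaround sketch: the optimization must be disabled BEFORE any
// functions are compiled from the expression tree.
SymTensor.Debug.DisableCombineIntoElementsOptimization <- true

// Loss evaluated with a separate function...
let lossFn = mi.Func<single> (loss) |> arg2 input target

// ...now agrees with the loss computed during training.
let trainFn () = Train.train trainable fullDataset trainCfg
```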
Tested via DeepPrivate:
Z:\DeepPrivate\****Models\bin\Release\****Models.exe Z:\DEVELOP\DeepPrivate\****Models\cfgs\COPYING\LSTM\80reluOutputTest\Config.fsx
Could it be that the code used during training gets optimized differently than the separately compiled function?
Tested with [band 2277e5e]