I'm running into an issue when calling trainer.PreviousMinibatchLossAverage() after a TestMinibatch run:
var minibatchDataTest = minibatchTest.GetNextMinibatch(10, device);
var mbData = new UnorderedMapVariableMinibatchData();
mbData.Add(features, minibatchDataTest[featureTestStreamInfo]);
mbData.Add(labels, minibatchDataTest[labelTestStreamInfo]);
var testRes = trainer.TestMinibatch(mbData, device);
Console.WriteLine($"test loss {trainer.PreviousMinibatchLossAverage()}"); // this raises exception
The printed stack is:
[CALL STACK]
> CNTK::Internal:: UseSparseGradientAggregationInDataParallelSGD
- CNTK::Value:: Create
- CNTK::Internal:: UseSparseGradientAggregationInDataParallelSGD
- CNTK::TrainingParameterSchedule<unsigned __int64>:: Transform
- CNTK::Trainer:: PreviousMinibatchLossAverage
- CSharp_CNTK_Trainer_PreviousMinibatchLossAverage
- 00007FFDE9FB695B (SymFromAddr() error: The specified module could not be found.)
I'm not even sure which Value object the exception refers to, since both minibatchDataTest[featureTestStreamInfo] and minibatchDataTest[labelTestStreamInfo] are valid both before and after the TestMinibatch call (as reported by data.IsValid).
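For reference, this is the validity check I mentioned (a sketch following the snippet above; MinibatchData.data is the underlying CNTK Value object):

```csharp
// Both stream values report IsValid == true immediately before
// and immediately after the trainer.TestMinibatch(...) call:
Console.WriteLine(minibatchDataTest[featureTestStreamInfo].data.IsValid);
Console.WriteLine(minibatchDataTest[labelTestStreamInfo].data.IsValid);
```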