This code crashes without any output in Colab (s4tf v0.10), though it compiles and runs locally on macOS:
```swift
import TensorFlow

public struct Network: Layer {
  var blocks: [Conv2D<Float>] = []

  public init() {
    self.blocks += [Conv2D(filterShape: (3, 3, 16, 16))]
  }

  @differentiable
  public func callAsFunction(_ input: Tensor<Float>) -> Tensor<Float> {
    blocks.differentiableReduce(input) { $1($0) }
  }
}
```
If I don't append with `+=` (just assign `blocks` an array directly, or use `blocks.append`), there is no crash. There is also no crash if I just call `blocks[0](input)` in `callAsFunction`.
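For reference, these are the two workaround variants that avoid the crash for me — a minimal sketch, assuming the same `Conv2D` filter shape as above, with only the initializer body changed:

```swift
// Workaround 1: assign the array directly instead of appending with +=.
public init() {
  self.blocks = [Conv2D(filterShape: (3, 3, 16, 16))]
}

// Workaround 2: use append(_:) instead of +=.
public init() {
  self.blocks.append(Conv2D(filterShape: (3, 3, 16, 16)))
}
```

Both build the identical `blocks` array; only the `+=` path triggers the crash.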
The Colab runtime log message before the restart is a warning:
I reduced this from a larger block of code that worked in v0.9 but is now crashing, so appending with `+=` and calling `differentiableReduce` did not crash in that version.