fukatani opened this issue 2 years ago
I found that this code works fine for me:
```python
import coremltools
import torch

# `model` is the PyTorch decoder module being converted
with torch.no_grad():
    tgt = torch.randint(3, 113, (1, 10, 5), dtype=torch.int32)
    traced_model = torch.jit.trace(model, tgt)

# Allow the sequence dimension to vary between 1 and 15
tgt_shape = coremltools.Shape(shape=(1, coremltools.RangeDim(1, 15), 5))
coreml_model = coremltools.convert(
    traced_model,
    inputs=[coremltools.TensorType(shape=tgt_shape, dtype=int)]
)
coreml_model.save("decoder.mlmodel")
```
Is this expected behavior?
❓ Question
I tried this code, but I got an error.
When I removed `nn.LayerNorm(256)`, the code passed. How can I fix this? Or does `LayerNorm` not support flexible sizes?

Environment
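For what it's worth, `nn.LayerNorm` itself accepts a flexible sequence length in eager PyTorch, since it normalizes only over the trailing dimensions given at construction. A minimal sketch (assuming the decoder's LayerNorm is applied over a 256-wide feature dimension, as in the snippet above), which suggests the limitation would be in tracing/conversion rather than in the layer:

```python
import torch
import torch.nn as nn

# LayerNorm(256) normalizes over the last dimension only,
# so the batch and sequence dimensions are free to vary.
ln = nn.LayerNorm(256)

for seq_len in (1, 10, 15):
    x = torch.randn(1, seq_len, 256)
    y = ln(x)
    assert y.shape == x.shape  # output shape matches input for every length
```

If this runs but the converted model still fails on a `RangeDim` input, the problem is likely in how the traced graph or the Core ML converter specializes the LayerNorm shape, not in PyTorch itself.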