Closed disktnk closed 5 years ago
Merging #202 into master will increase coverage by 0.18%. The diff coverage is 97.05%.
```diff
@@            Coverage Diff             @@
##           master     #202      +/-   ##
==========================================
+ Coverage   89.69%   89.88%   +0.18%
==========================================
  Files          24       24
  Lines        1417     1473      +56
==========================================
+ Hits         1271     1324      +53
- Misses        146      149       +3
```
| Impacted Files | Coverage Δ | |
|---|---|---|
| onnx_chainer/graph.py | 95.65% <100%> (ø) | :arrow_up: |
| onnx_chainer/export.py | 91.47% <97.01%> (+1.47%) | :arrow_up: |
| onnx_chainer/functions/normalization.py | 86.44% <0%> (-6.78%) | :arrow_down: |
| onnx_chainer/functions/math.py | 85.71% <0%> (-0.54%) | :arrow_down: |
| onnx_chainer/functions/__init__.py | 100% <0%> (ø) | :arrow_up: |
| onnx_chainer/mapping.py | 90% <0%> (ø) | :arrow_up: |
| onnx_chainer/functions/pooling.py | 88.73% <0%> (+5.59%) | :arrow_up: |
Continue to review full report at Codecov.
Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 88994fb...8148f6e.
/test
Successfully created a job for commit 0a2d8ab:
Sorry if I'm confused, but does this not work for the following case?
```python
def test_implicit_input_value(self):
    class Model(chainer.Chain):
        def __init__(self):
            super(Model, self).__init__()
            self.frac = np.array(4, dtype=np.float32)

        def forward(self, x):
            return x / self.frac

    x = chainer.Variable(np.array(1, dtype=np.float32))
    self.expect(Model(), x, name='implicit_input_value')
```
Good catch. If inputs are of ndarray type, the function node converts them to chainer.Variable internally, so the original references are lost outside the function node (`apply`). I fixed it.
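The failure mode described above can be sketched with a minimal pure-Python mock. `Variable` and `FunctionNode` here are illustrative stand-ins, not Chainer's real classes: once `apply` wraps a raw array into a fresh `Variable`, the caller's original reference is unrecoverable from the wrapped output alone.

```python
# Minimal mock of the issue; Variable/FunctionNode are stand-ins,
# not Chainer's actual classes.
class Variable:
    def __init__(self, data):
        self.data = data

class FunctionNode:
    def apply(self, inputs):
        # Raw (non-Variable) inputs are wrapped into *new* Variable
        # objects inside apply, so the caller's reference is lost.
        wrapped = [x if isinstance(x, Variable) else Variable(x)
                   for x in inputs]
        self.inputs = wrapped
        return wrapped

raw = 4.0  # plays the role of a model's ndarray attribute, e.g. self.frac
node = FunctionNode()
(out,) = node.apply([raw])

# The wrapped input is a brand-new object: identity with `raw` is gone,
# so an exporter walking the graph cannot tell which attribute it came from.
print(out is raw)                  # False
print(isinstance(out, Variable))   # True
```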
/test
Successfully created a job for commit 8148f6e:
fixes #201
Then, the second value's reference is lost when converting to the graph, so the converter cannot set the value into the ONNX graph.
To keep such a value's reference, the exporter retains input values during forwarding, using `LinkHook` and a patched `apply`. By introducing `LinkHook`, the hook function retains only temporary values, which reduces memory usage, but the target model is required to use the `forward` function, not the `__call__` function. The `export_implicit_inputs_public` option is needed because the converter cannot distinguish whether a temporary value is covered by the converter or not; for example, `beta` of batch normalization is sometimes `None` but is handled by the converter.
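The retention mechanism described above can be sketched as follows. This is a hedged, pure-Python mock of the idea (a patched `apply` records the pre-wrapping raw inputs for the duration of a forward pass, then the store is cleared), not onnx-chainer's actual implementation; all names here are illustrative.

```python
# Illustrative mock: a patched apply() records raw inputs so an exporter
# can later recover which values were implicitly fed into the graph.
class Variable:
    def __init__(self, data):
        self.data = data

class FunctionNode:
    def apply(self, inputs):
        # Original behavior: wrap raw inputs into new Variables.
        return [x if isinstance(x, Variable) else Variable(x)
                for x in inputs]

retained = {}  # id(wrapped Variable) -> original raw input

def patched_apply(node, inputs):
    """Wrap like the original apply, but remember raw inputs first."""
    wrapped = FunctionNode.apply(node, inputs)
    for raw, var in zip(inputs, wrapped):
        if not isinstance(raw, Variable):
            retained[id(var)] = raw  # keep the reference alive temporarily
    return wrapped

# "Forward pass" under the hook:
node = FunctionNode()
(out,) = patched_apply(node, [4.0])

# The exporter can now map the wrapped Variable back to its raw value...
recovered = retained[id(out)]

# ...and the store is cleared afterwards to limit memory usage.
retained.clear()
```

The key design point mirrors the PR description: the raw-value references live only inside the hook's temporary store during forwarding, rather than being attached permanently to the graph.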