Closed · shilpan97 closed this 3 years ago
Formula: f(x) = alpha * x for x < 0; f(x) = x for x >= 0
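The LeakyRelu formula above can be sketched directly; a minimal NumPy version (the `alpha=0.01` default is an illustrative choice, not tflite2onnx's actual code):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    """LeakyRelu: f(x) = alpha * x for x < 0, f(x) = x for x >= 0."""
    x = np.asarray(x, dtype=float)
    # np.where picks alpha * x for negative entries, x otherwise
    return np.where(x < 0, alpha * x, x)
```

For example, `leaky_relu([-2.0, 0.0, 3.0], alpha=0.1)` scales only the negative entry, giving `[-0.2, 0.0, 3.0]`.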
Merging #63 (17b1baa) into master (2ff7e3b) will decrease coverage by 0.20%. The diff coverage is 20.00%.
:exclamation: Current head 17b1baa differs from pull request most recent head 284427b. Consider uploading reports for the commit 284427b to get more accurate results.
@@            Coverage Diff             @@
##           master      #63      +/-   ##
==========================================
- Coverage   95.63%   95.42%   -0.21%
==========================================
  Files          30       30
  Lines        1808     1813       +5
==========================================
+ Hits         1729     1730       +1
- Misses         79       83       +4
Impacted Files | Coverage Δ |
---|---|
tflite2onnx/op/activation.py | 92.77% <20.00%> (-4.67%) :arrow_down: |
Continue to review the full report at Codecov.
Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 2ff7e3b...284427b. Read the comment docs.
@shilpan97 You can take a look at this PR https://github.com/jackwish/tflite2onnx/pull/25/files, which implements PRelu and should be very similar to your work. Note that the code in that PR may differ from what we have now.
Hey, I wasn't sure if my changes are correct (please treat this as an initial attempt). When trying to run on my machine I got the following error message: `AttributeError: type object 'ActivationFunctionType' has no attribute 'LEAKY_RELU'`. I'm not sure how to fix it.
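An `AttributeError` like this typically means the enum class in the installed schema simply doesn't define that member. A hedged sketch of a defensive lookup using `getattr` (the stand-in class below is illustrative, not the real generated tflite enum; whether a given tflite release defines `LEAKY_RELU` is an assumption to verify against the installed package):

```python
class ActivationFunctionType:
    """Stand-in for a generated flatbuffer enum from an older schema."""
    NONE = 0
    RELU = 1
    # LEAKY_RELU intentionally absent, mimicking the reported error

def lookup_leaky_relu(enum_cls):
    """Return the LEAKY_RELU enum value, or None if this schema lacks it."""
    # getattr with a default avoids the AttributeError at lookup time,
    # letting the caller decide how to handle the missing member
    return getattr(enum_cls, 'LEAKY_RELU', None)
```

If the lookup returns `None`, the practical fix is usually to upgrade the schema package so the member exists, rather than to catch the error at every call site.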
Please feel free to edit directly. Thanks!