fastmachinelearning / hls4ml

Machine learning on FPGAs using HLS
https://fastmachinelearning.org/hls4ml
Apache License 2.0

update keras activation parsing, especially leaky relu #1085

Closed · jmitrevs closed this 1 month ago

jmitrevs commented 1 month ago

Description

Issue #1076 showed that hls4ml did not properly parse leaky_relu when passed as a string parameter (activation='leaky_relu') in Keras. One could argue that this usage was not explicitly described in v2 of the API documentation, but it was supported in the code, and it is officially documented in v3 of the API, so we should support it.
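To illustrate the gap, here is a minimal sketch of the kind of dispatch involved: a Keras layer config carries the activation as a plain string, and the parser must map special names like 'leaky_relu' to their own layer class rather than a generic Activation. All function and class names below are illustrative assumptions, not hls4ml's real internals.

```python
# Hypothetical sketch: mapping a Keras activation string (as found in a
# layer config) to an hls4ml-style layer class name. Names are assumed.
def parse_activation(layer_config):
    """Return the layer class that should handle this activation."""
    act = layer_config.get('activation', 'linear')
    # Activations that need dedicated handling (parameters, special math).
    special = {'leaky_relu': 'LeakyReLU', 'elu': 'ELU', 'softmax': 'Softmax'}
    return special.get(act, 'Activation')

# A Dense layer with the activation passed as a parameter:
dense_cfg = {'name': 'fc1', 'units': 64, 'activation': 'leaky_relu'}
print(parse_activation(dense_cfg))  # LeakyReLU
```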

As an aside, I don't think we properly support relu with an alpha parameter; perhaps I should add a check that converts relu to leaky_relu in that case.
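The suggested check could look something like the following sketch: if relu arrives with a nonzero alpha, rename it to leaky_relu before dispatch. This is only an illustration of the idea floated above; the function name and parameter handling are assumptions.

```python
# Hypothetical normalization: relu with a nonzero alpha is mathematically
# a leaky relu, so rename it before handing it to the activation handler.
def normalize_activation(name, params):
    if name == 'relu' and params.get('alpha', 0.0) != 0.0:
        return 'leaky_relu', params
    return name, params

print(normalize_activation('relu', {'alpha': 0.1}))  # ('leaky_relu', {'alpha': 0.1})
print(normalize_activation('relu', {}))              # ('relu', {})
```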

I also noticed that the Activation layer handler was not called for activations passed as arguments, so special cases had to be handled in two places: once in the Activation layer handler, and once in the section for activations passed as arguments. I believe this is error-prone, and elu passed as an argument would break the oneAPI backend unless I duplicated code. I think it is better to unify the two paths and always call the Activation layer handler, so that is what I did instead.
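The unification described above can be sketched as follows: the layer handler for, say, Dense delegates to the single activation handler instead of re-implementing its special cases inline. All names here are illustrative stand-ins, not hls4ml's actual handler API.

```python
# Sketch of the unified path: one activation handler, reused by layer
# handlers for activations passed as arguments. Names are assumptions.
def parse_activation_layer(act_name, params):
    """The single place that knows about special activations."""
    special = {'leaky_relu': 'LeakyReLU', 'elu': 'ELU'}
    layer = {'class_name': special.get(act_name, 'Activation'),
             'activation': act_name}
    layer.update(params)
    return layer

def parse_dense_layer(cfg):
    """Dense handler delegating to parse_activation_layer instead of
    duplicating the special-case logic inline."""
    layers = [{'class_name': 'Dense', 'name': cfg['name']}]
    act = cfg.get('activation', 'linear')
    if act != 'linear':
        layers.append(parse_activation_layer(act, {'name': cfg['name'] + '_act'}))
    return layers

print([l['class_name'] for l in parse_dense_layer(
    {'name': 'fc1', 'activation': 'elu'})])  # ['Dense', 'ELU']
```

With this shape, a new special activation only has to be taught to one handler, which is the point of the refactor described in the PR.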

Type of change

Tests

I added a test for leaky_relu passed as an argument, and it is also important not to break the existing tests.
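In the same spirit as the added test, here is a pure-Python reference check of leaky_relu values (the real test presumably builds a Keras model with activation='leaky_relu' and runs it through hls4ml; the slope value below is illustrative, not a claimed default).

```python
# Reference leaky_relu used as a stand-in for the numerical check the
# real test performs against the hls4ml output.
def leaky_relu(x, negative_slope=0.2):  # slope chosen for illustration
    return x if x >= 0 else negative_slope * x

cases = [(2.0, 2.0), (-2.0, -0.4), (0.0, 0.0)]
for x, expected in cases:
    assert abs(leaky_relu(x) - expected) < 1e-9
print("all cases pass")
```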

Checklist

jmitrevs commented 1 month ago

I also added support for the new v3 API parameter names for leaky relu.
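Supporting both generations of parameter names might look like the sketch below: Keras v2 used alpha for LeakyReLU, while v3 uses negative_slope, so the parser accepts either key. The function name and default value are assumptions for illustration.

```python
# Hypothetical reader that accepts both the v2 ('alpha') and v3
# ('negative_slope') parameter names for LeakyReLU configs.
def get_leaky_slope(config, default=0.3):  # default is illustrative
    return config.get('negative_slope', config.get('alpha', default))

print(get_leaky_slope({'alpha': 0.1}))            # 0.1
print(get_leaky_slope({'negative_slope': 0.25}))  # 0.25
```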

jmitrevs commented 1 month ago

I want to understand the need for the assert change, so I am converting this to a draft for now.