deepjavalibrary / djl

An Engine-Agnostic Deep Learning Framework in Java
https://djl.ai
Apache License 2.0

Layer Normalization #1057

Closed enpasos closed 3 years ago

enpasos commented 3 years ago

Description

I would like to use Layer Normalization just the way I use Batch Normalization in DJL. What would be the best way to get there?

Layer Norm References

zachgk commented 3 years ago

Right now, DJL doesn't have an API for LayerNorm. However, the underlying engine you are using should have it implemented, and if you call the engine directly you can still access its implementation.

You can use BatchNorm as an example of what that looks like, or find some more self-contained examples in the engine's implementation of the NDArray interface. Which engine are you using?
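
For reference, here is a minimal sketch of a hand-rolled layer norm built only from DJL's NDArray ops (the class name, the `eps` value, and normalizing over the last axis are illustrative choices, and the learnable scale/shift of a full layer-norm layer are omitted):

```java
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDManager;
import ai.djl.ndarray.types.Shape;

public final class LayerNormExample {

    /**
     * Normalizes the last axis of the input: (x - mean) / sqrt(var + eps).
     * Built purely from NDArray operations; gamma/beta are left out for brevity.
     */
    static NDArray layerNorm(NDArray x, float eps) {
        int lastAxis = x.getShape().dimension() - 1;
        NDArray mean = x.mean(new int[] {lastAxis}, true);      // per-sample mean
        NDArray centered = x.sub(mean);                          // x - mean
        NDArray variance = centered.pow(2).mean(new int[] {lastAxis}, true);
        return centered.div(variance.add(eps).sqrt());           // normalize
    }

    public static void main(String[] args) {
        try (NDManager manager = NDManager.newBaseManager()) {
            NDArray x = manager.randomNormal(new Shape(2, 4));
            System.out.println(layerNorm(x, 1e-5f));
        }
    }
}
```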

enpasos commented 3 years ago

Right now I am using PyTorch. I will follow your advice and use BatchNorm as an example. Would it be an option to integrate it into DJL if it comes as a pull request?
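
As a rough illustration of how such a hand-rolled normalization could be dropped into a network the same way a BatchNorm block is added, one option is to wrap it in a LambdaBlock. This sketch reuses the `layerNorm` helper from the example above; the layer sizes are arbitrary:

```java
import ai.djl.ndarray.NDList;
import ai.djl.nn.Activation;
import ai.djl.nn.Block;
import ai.djl.nn.LambdaBlock;
import ai.djl.nn.SequentialBlock;
import ai.djl.nn.core.Linear;

public final class LayerNormInModel {

    // Builds a tiny MLP with the hand-rolled layer norm between the layers,
    // added to the SequentialBlock the same way a BatchNorm block would be.
    static Block buildNet() {
        return new SequentialBlock()
                .add(Linear.builder().setUnits(64).build())
                .add(new LambdaBlock(list ->
                        new NDList(LayerNormExample.layerNorm(list.singletonOrThrow(), 1e-5f))))
                .add(Activation::relu)
                .add(Linear.builder().setUnits(10).build());
    }
}
```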

zachgk commented 3 years ago

Would it be an option to integrate it into DJL if it comes as a pull request?

Absolutely!