Closed: bhavsarpratik closed this issue 3 years ago.
I can take care of this - I'm working on some other refactoring of transformer encoders anyways.
That would be very cool. What refactoring are you working on?
What I described in #1291, some minor changes in documentation to synchronise with the transformers package, and implementing reduce operations in native PyTorch/TF - this can lead to large speedups on GPU, same as in #399.
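For context, here is a minimal sketch of what such a native-PyTorch reduce could look like for mean pooling, assuming token embeddings of shape `(batch, seq_len, hidden)` and a 0/1 attention mask; the function name and shapes are illustrative, not the encoder's actual API:

```python
import torch

def mean_pool(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    """Mean-pool token embeddings with masking, entirely in native PyTorch.

    token_embeddings: (batch, seq_len, hidden)
    attention_mask:   (batch, seq_len), 1 for real tokens, 0 for padding
    """
    # Expand the mask so it broadcasts over the hidden dimension.
    mask = attention_mask.unsqueeze(-1).type_as(token_embeddings)
    # Sum only the unmasked positions, then divide by the token count.
    # Staying in tensor ops keeps the whole reduce on the GPU.
    summed = (token_embeddings * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1e-9)
    return summed / counts
```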
And I am also considering adding AMP, for PyTorch at least - see #1283.
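As a rough sketch of where AMP would slot into an encoder's forward pass via `torch.cuda.amp.autocast` - the model and batch below are placeholders, not the encoder's real internals:

```python
import torch

# Placeholder model and batch; the point is only where autocast goes.
model = torch.nn.Linear(768, 768).cuda()
inputs = torch.randn(8, 768, device="cuda")

# Inside the autocast context, eligible ops run in float16 while
# numerically sensitive ones stay in float32.
with torch.cuda.amp.autocast():
    with torch.no_grad():  # encoding is inference-only, so no gradients
        outputs = model(inputs)
```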
Closed by #1344.
This is not done in TFTransformerEncoder, right?
Ohh yes!
@tadej-redstone Would you be interested in doing a similar refactoring for TransformerTFEncoder? It shouldn't take much time.
Yes, I already have a branch ready that does this; I'm waiting until #1527 is fixed first.
Refactor the TransformerTorch and TransformerTF tests with a matrix of parameters by using `pytest.mark.parametrize`. The matrix should contain a few popular models worth testing with different `pooling` and `layer_index` values.
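A hypothetical sketch of what such a matrix could look like with stacked `pytest.mark.parametrize` decorators; the model list, pooling strategies, layer indices, constructor arguments, and the import path are assumptions for illustration, not the values the issue settles on:

```python
import pytest

# Import path assumed for illustration; adjust to the repo's actual layout.
from transformer_torch_encoder import TransformerTorchEncoder

# Hypothetical matrix; the models, pooling strategies and layer indices
# below are placeholders.
MODELS = ["bert-base-uncased", "distilbert-base-uncased"]
POOLING_STRATEGIES = ["mean", "max", "cls"]
LAYER_INDICES = [-1, -2]

# Stacked parametrize decorators produce the full cross-product matrix,
# so every model is tested with every pooling strategy and layer index.
@pytest.mark.parametrize("model_name", MODELS)
@pytest.mark.parametrize("pooling_strategy", POOLING_STRATEGIES)
@pytest.mark.parametrize("layer_index", LAYER_INDICES)
def test_transformer_torch_encoder(model_name, pooling_strategy, layer_index):
    # Constructor arguments are assumed to mirror the issue's wording.
    encoder = TransformerTorchEncoder(
        pretrained_model_name_or_path=model_name,
        pooling_strategy=pooling_strategy,
        layer_index=layer_index,
    )
    embeddings = encoder.encode(["hello world"])
    assert embeddings.shape[0] == 1
```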