Open-Deep-ML / DML-OpenProblem


New problem: Positional Encoding for Transformer inputs #165

Open · dsaha21 opened 5 days ago

dsaha21 commented 5 days ago

I want to add an easy-to-medium question in the NLP category on Transformer positional encoding applied to the input embeddings. The solution would use only TensorFlow as tf, relying on tf.cast(), tf.concat(), tf.math.sin(), and tf.math.cos(). If possible, I might open a PR.
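For context, a minimal sketch of what such a TensorFlow implementation could look like (the function name `positional_encoding` and the choice to concatenate the sin and cos halves along the feature axis instead of interleaving them are my assumptions, not part of the proposed problem):

```python
import tensorflow as tf

def positional_encoding(seq_len, d_model):
    # pos: (seq_len, 1) positions; i: (1, d_model/2) dimension indices
    pos = tf.cast(tf.range(seq_len)[:, tf.newaxis], tf.float32)
    i = tf.cast(tf.range(d_model // 2)[tf.newaxis, :], tf.float32)

    # angle rate per dimension: 1 / 10000^(2i / d_model), as in "Attention Is All You Need"
    angle_rates = 1.0 / tf.pow(10000.0, (2.0 * i) / float(d_model))
    angles = pos * angle_rates  # (seq_len, d_model/2)

    # sin on one half of the channels, cos on the other, concatenated to (seq_len, d_model)
    return tf.concat([tf.math.sin(angles), tf.math.cos(angles)], axis=-1)
```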

If that goes successfully, we might add more transformer blocks:

  1. Self attention
  2. Multihead attention
  3. Encoder
  4. Decoder
  5. Full Transformer -> N x Encoder and Decoder blocks (hard-level question; we can consider adding this once all the other segments are done)

Thank you

moe18 commented 5 days ago

Currently, our platform supports only NumPy. While we may expand to include other libraries in the future, for now, all our implementations are exclusively built with NumPy.

dsaha21 commented 5 days ago

Okay sure :+1:

Then I am closing the issue for now. Will try to add other problems with numpy. Thanks a lot @moe18

dsaha21 commented 4 days ago

Hi @moe18, I have converted the problem from TensorFlow to pure NumPy and will be able to contribute the positional encoding of Transformer inputs. Let me know if I can open a PR.
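Not the PR itself, but a rough NumPy sketch of the standard sinusoidal encoding this could be based on (function name, shapes, and the usage line are assumptions for illustration):

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    # pos: (seq_len, 1) positions; i: (1, d_model) feature indices
    pos = np.arange(seq_len)[:, np.newaxis].astype(np.float64)
    i = np.arange(d_model)[np.newaxis, :]

    # angle(pos, i) = pos / 10000^(2*floor(i/2) / d_model)
    angle_rads = pos / np.power(10000.0, (2 * (i // 2)) / float(d_model))

    # even feature indices use sin, odd feature indices use cos
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angle_rads[:, 0::2])
    pe[:, 1::2] = np.cos(angle_rads[:, 1::2])
    return pe

# usage: add to input embeddings of shape (batch, seq_len, d_model)
# embeddings = embeddings + positional_encoding(seq_len, d_model)[np.newaxis, ...]
```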