Open dsaha21 opened 5 days ago
Currently, our platform supports only NumPy. While we may expand to include other libraries in the future, for now, all our implementations are exclusively built with NumPy.
Okay sure :+1:
Then I am closing the issue for now. I will try to add other problems with NumPy. Thanks a lot @moe18
Hi @moe18, I have converted the problem from TensorFlow to pure NumPy and will be able to contribute the positional encoding of Transformer inputs.
Let me know if I can open a PR.
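For context, the standard sinusoidal positional encoding described in "Attention Is All You Need" can be sketched in pure NumPy roughly like this (the function name and shapes here are illustrative, not the repo's actual API; assumes an even `d_model`):

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    # PE(pos, 2i)   = sin(pos / 10000^(2i / d_model))
    # PE(pos, 2i+1) = cos(pos / 10000^(2i / d_model))
    positions = np.arange(seq_len)[:, np.newaxis]                       # (seq_len, 1)
    div_terms = np.power(10000.0, np.arange(0, d_model, 2) / d_model)   # (d_model / 2,)
    angles = positions / div_terms                                      # (seq_len, d_model / 2)

    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)  # even dimensions get sine
    pe[:, 1::2] = np.cos(angles)  # odd dimensions get cosine
    return pe

pe = positional_encoding(seq_len=4, d_model=8)
```

The resulting `(seq_len, d_model)` matrix is added elementwise to the input embeddings before the first attention layer.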
I want to add an easy-to-medium question in the NLP category on Transformer positional encoding applied to the input embeddings. The question will use only TensorFlow (`import tensorflow as tf`): `tf.cast()`, `tf.concat()`, `tf.math.sin()`, and `tf.math.cos()`. If possible, I might open a PR. If that goes successfully, we might add more Transformer blocks.
Thank you