Closed: shaansuraj closed this issue 1 month ago.
🙌 Thank you for bringing this issue to our attention! We appreciate your input and will investigate it as soon as possible.
Hey, I found this issue interesting. Could you assign it to me?
@Abhicoder03 I am glad that you found this topic interesting. I have assigned it to you. If you have any doubts, you can reach out to me through the email I provided above.
Hi @shaansuraj I’m interested in contributing to the Build a Generatively Pretrained Transformer (GPT) project. I plan to focus on:
Understanding self-attention mechanisms: I will thoroughly study the "Attention is All You Need" paper and the architecture of OpenAI's GPT-2 to grasp how self-attention enables efficient text generation.
Working with the Shakespeare dataset: I will use it to learn and emulate the playwright's unique language patterns, enhancing the model's text generation capabilities.
Implementing the GPT model: I will ensure it generates coherent and contextually relevant text based on the context of the previous words.
@JahnaviDhanaSri Okay sure, I will assign you to this issue as well. You can work collaboratively with @Abhicoder03 on this.
Thank You!
@JahnaviDhanaSri Hey, we can collaborate on this project.
Sure!
✅ This issue has been closed. Thank you for your contribution! If you have any further questions or issues, feel free to reach out!
:red_circle: Title : Build a Generatively Pretrained Transformer (GPT) referring to "Attention is All You Need", OpenAI GPT-2, and the Shakespeare dataset

:red_circle: Aim : Generate the next word based on the context of the previous ones.

:red_circle: Brief Explanation : Use self-attention mechanisms to process and generate text efficiently, without relying on recurrent neural networks. Refer to the "Attention is All You Need" paper and learn about GPT models such as OpenAI's GPT-2, which are pretrained on large text corpora to predict the next word in a sentence, allowing them to generate coherent and contextually relevant text. By starting with Shakespeare's dataset, the model learns the patterns specific to the playwright's language, enabling it to generate text in a similar style. This combination of large-scale pretraining and fine-tuning on domain-specific data makes GPT highly effective for creative text generation tasks. This will be your first step towards generative AI.
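To make the "self-attention without recurrence" idea concrete, here is a minimal single-head sketch of masked (causal) self-attention in NumPy. The causal mask is what makes next-word prediction possible: each position may only attend to itself and earlier positions. All variable names and shapes here are illustrative, not from any specific implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head causal self-attention over a sequence x of shape (T, d).

    Each position attends only to itself and earlier positions,
    which is what allows the model to be trained on next-token prediction.
    """
    T, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv            # queries, keys, values: (T, d)
    scores = q @ k.T / np.sqrt(d)               # scaled dot-product logits: (T, T)
    future = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores[future] = -np.inf                    # mask out attention to future tokens
    weights = softmax(scores, axis=-1)          # each row sums to 1
    return weights @ v, weights                 # weighted sum of values

# Toy example with random weights (in a real model these are learned)
rng = np.random.default_rng(0)
T, d = 4, 8
x = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out, w = causal_self_attention(x, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

In a full GPT block this head would be one of several run in parallel (multi-head attention), followed by a feed-forward layer, residual connections, and layer normalization, as described in the paper.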
Here are the links for your help:
Attention is All You Need: https://arxiv.org/abs/1706.03762
Shakespeare Plays Dataset: https://www.kaggle.com/datasets/kingburrito666/shakespeare-plays
For any issues, you can reach out to me via email: suraj@brandladder.co.in
Happy Contributing 🚀
All the best. Enjoy your open source journey ahead. 😎