Title
Introduction to Transformers
Describe your Talk
Transformers are the building blocks of the ongoing LLM wave. The purpose of this talk is to make the audience aware of the transformer architecture and the key ideas that were introduced with it (the core self-attention step is sketched after the list below):
Encoder-decoder architecture
Self-attention
Cross-attention
Multi-head attention
Positional encoding
Layer normalisation
Residual connections
Position-wise feed-forward networks
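For a concrete flavour of the self-attention idea listed above, here is a minimal NumPy sketch of scaled dot-product self-attention for a single head; the shapes, variable names, and toy data are illustrative assumptions only, not material taken from the talk itself.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the chosen axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention for one sequence (single head).

    X          : (seq_len, d_model) token embeddings
    Wq, Wk, Wv : (d_model, d_k) projection matrices
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # (seq_len, seq_len) similarity scores
    weights = softmax(scores, axis=-1)        # each token's attention distribution
    return weights @ V                        # (seq_len, d_k) context vectors

# Toy usage: 4 tokens, model dim 8, head dim 4.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)  # (4, 4)
```

Multi-head attention repeats this computation with several independent projection matrices and concatenates the resulting context vectors.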
Pre-requisites & reading material
Basic awareness of neural networks is a must.
Time required for the talk
45 min
Link to slides/demos
To be shared.
About you
Ankush is an independent researcher whose areas of interest include NLP on scholarly articles, personalized summarization, and LLM analysis.
Availability
13/01/2024
Any comments
No response