soohoonc / llms


add section on history before llms #4

Open soohoonc opened 4 months ago

soohoonc commented 4 months ago

add sections on the RNNs, LSTMs, GRUs, VAEs, etc. used before transformers

greptile-apps[bot] commented 4 months ago

To address the issue of adding a section on the history of models used before transformers (RNN, LSTM, GRU, VAE, etc.), update the models.ipynb file in the soohoonc/llms repository. Here are the specific steps:

  1. Add a New Section Before Transformer Models: At the beginning of the notebook, before the discussion of transformer-based models, insert a new markdown cell titled 'History of Machine Learning Models'.

  2. Detail Each Model Type: In separate markdown cells under the new section, provide detailed explanations for each of the following model types:

    • RNN (Recurrent Neural Networks): Explain the basic concept, use cases, and limitations (notably the vanishing gradient problem on long sequences).
    • LSTM (Long Short-Term Memory): Discuss its gated architecture, how it mitigates the vanishing gradient problem, and its applications.
    • GRU (Gated Recurrent Unit): Outline its structure, its advantages over LSTM (fewer parameters, simpler gating), and typical use cases.
    • VAE (Variational Autoencoders): Describe the encoder-decoder framework, its significance for generating new data, and applications.
  3. Provide Examples and References: Where possible, include code snippets or pseudocode to illustrate how these models are implemented. Also, add references to seminal papers or resources for further reading.
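For step 3, a minimal sketch of the recurrent models could use PyTorch's built-in layers (PyTorch is an assumption here; the notebook may use a different framework). The snippet just instantiates each layer and runs a dummy batch through it, showing that the LSTM additionally carries a cell state:

```python
import torch
import torch.nn as nn

# Dummy input: batch of 2 sequences, 5 timesteps, 8 features each
batch, seq_len, input_size, hidden_size = 2, 5, 8, 16
x = torch.randn(batch, seq_len, input_size)

rnn = nn.RNN(input_size, hidden_size, batch_first=True)
lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
gru = nn.GRU(input_size, hidden_size, batch_first=True)

out_rnn, h_rnn = rnn(x)      # out_rnn: (batch, seq_len, hidden_size)
out_lstm, (h, c) = lstm(x)   # LSTM also returns a cell state c
out_gru, h_gru = gru(x)      # GRU has gates like the LSTM but no cell state
```

All three share the same input/output shapes, which makes them easy to compare side by side in the notebook.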
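For the VAE bullet, a compact sketch of the encoder/decoder with the reparameterization trick might look like the following (again assuming PyTorch; the layer sizes are illustrative, e.g. 784 for flattened MNIST images and a 20-dimensional latent space):

```python
import torch
import torch.nn as nn

class VAE(nn.Module):
    def __init__(self, input_dim=784, latent_dim=20):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(input_dim, 128), nn.ReLU())
        self.mu = nn.Linear(128, latent_dim)       # mean of q(z|x)
        self.logvar = nn.Linear(128, latent_dim)   # log-variance of q(z|x)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, input_dim)
        )

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: z = mu + sigma * eps keeps sampling differentiable
        eps = torch.randn_like(mu)
        z = mu + torch.exp(0.5 * logvar) * eps
        return self.decoder(z), mu, logvar
```

The training loss would combine a reconstruction term with the KL divergence between q(z|x) and the standard normal prior, which the section could derive alongside the code.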

This addition will enrich the notebook by providing a comprehensive background on the evolution of machine learning models leading up to the current focus on transformer-based models.

References

soohoonc/llms/models.ipynb


sakshitposting-irl commented 4 months ago

I would like to work on this issue