Closed atambay37 closed 3 months ago
Intro to LLMs: trained on a huge amount of data (much of the public internet), attention-based transformer architecture, requires massive compute and millions of dollars to train, so not everyone can train them.
One way to interact with LLMs is through prompts.
Common LLM sampling parameters: temperature, top-k, etc.
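For the parameters section, a minimal sketch of how temperature and top-k shape sampling could look like this (the function and its defaults are illustrative, not tied to any particular model API):

```python
import numpy as np

def sample_next_token(logits, temperature=0.8, top_k=5, rng=None):
    """Sample a token id from raw logits using temperature and top-k filtering."""
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=np.float64)
    # Temperature: <1 sharpens the distribution, >1 flattens it.
    scaled = logits / temperature
    # Top-k: keep only the k highest-scoring tokens.
    top = np.argsort(scaled)[-top_k:]
    # Softmax over the surviving tokens (shifted for numerical stability).
    probs = np.exp(scaled[top] - scaled[top].max())
    probs /= probs.sum()
    return int(rng.choice(top, p=probs))
```

With `top_k=1` this reduces to greedy decoding (always the argmax token), which is a useful endpoint to demo alongside high-temperature sampling.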
Prompting Techniques: zero-shot, few-shot, and possibly another technique such as chain-of-thought or ReAct, demonstrated with OLMo.
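The zero-shot vs. few-shot distinction can be sketched as plain prompt assembly (the template format is an assumption for illustration, not an OLMo convention):

```python
def build_prompt(task_description, examples, query):
    """Assemble a prompt: zero-shot when `examples` is empty, few-shot otherwise."""
    parts = [task_description]
    # Few-shot demonstrations: (input, output) pairs shown before the real query.
    for inp, out in examples:
        parts.append(f"Input: {inp}\nOutput: {out}")
    # The actual query, left open for the model to complete.
    parts.append(f"Input: {query}\nOutput:")
    return "\n\n".join(parts)

zero_shot = build_prompt("Classify the sentiment as positive or negative.", [], "I loved it!")
few_shot = build_prompt(
    "Classify the sentiment as positive or negative.",
    [("Great movie.", "positive"), ("Waste of time.", "negative")],
    "I loved it!",
)
```

The same helper works for both settings, which makes it easy to show side by side how added demonstrations change model behavior.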
According to our original plan, #39 comes after this module, but I think it can overlap with the LLM parameters / prompting techniques section of this module.