This app is a template for using LangChain to build an LLM Q+A assistant from any set of YouTube videos.
We use Karpathy's course on LLMs as an example.
We use LangChain to:
(1) convert YouTube URLs to text
(2) feed the text into the LangChain auto-evaluator to test different chain parameters
(3) with our chosen parameters, build a vectorstore retriever back-end with FastAPI (deployed to Railway)
(4) stream the generated results (answer and retrieved docs) to a front-end (deployed to Vercel)
(1) See the notebook in the `/index` folder: it uses `OpenAIWhisperParser` to convert URLs to text in < 10 lines of code.

(2) See the text files in the `/eval` folder.
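The transcription step can be sketched as follows. This is a hypothetical sketch, assuming LangChain's YouTube audio loader stack (`GenericLoader`, `YoutubeAudioLoader`, `OpenAIWhisperParser`), `yt_dlp` installed, and an OpenAI API key; the save directory and example URL are placeholders, not taken from the repo:

```python
# Hypothetical sketch: transcribe a set of YouTube URLs to text with
# LangChain's OpenAIWhisperParser. Requires `langchain`, `yt_dlp`, and an
# OpenAI API key; the save directory below is a placeholder.

def load_transcripts(urls, save_dir="/tmp/yt_audio"):
    """Download audio for each URL and transcribe it with Whisper."""
    from langchain.document_loaders.generic import GenericLoader
    from langchain.document_loaders.blob_loaders.youtube_audio import YoutubeAudioLoader
    from langchain.document_loaders.parsers.audio import OpenAIWhisperParser

    loader = GenericLoader(YoutubeAudioLoader(urls, save_dir), OpenAIWhisperParser())
    return loader.load()  # one Document per transcribed audio chunk

# Usage (not executed here):
#   docs = load_transcripts(["https://www.youtube.com/watch?v=<video-id>"])
```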
(3) See the notebook in the `/index` folder, which builds a vectorstore (`Pinecone`) with metadata, and the `karpathy_app.py` file in the `/api` folder, which uses `load_qa_chain` with a user-specified LLM and prompt (see `default_prompt_template`).
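The retrieval + QA step can be sketched as below. This is a minimal sketch, assuming a Pinecone index already populated by the `/index` notebook; the index name `"karpathy"`, the environment variable names, and the model choice are assumptions, not taken from `karpathy_app.py`:

```python
# Hypothetical sketch: retrieve from an existing Pinecone index and answer
# with load_qa_chain. Requires `langchain`, `pinecone-client`, and an OpenAI
# API key; the index name "karpathy" is an assumption.

def answer(question: str, k: int = 4) -> str:
    """Retrieve the top-k chunks for `question` and run a stuff-type QA chain."""
    import os
    import pinecone
    from langchain.chains.question_answering import load_qa_chain
    from langchain.chat_models import ChatOpenAI
    from langchain.embeddings import OpenAIEmbeddings
    from langchain.vectorstores import Pinecone

    pinecone.init(api_key=os.environ["PINECONE_API_KEY"],
                  environment=os.environ["PINECONE_ENVIRONMENT"])
    index = Pinecone.from_existing_index("karpathy", OpenAIEmbeddings())
    docs = index.similarity_search(question, k=k)

    # Stuff the retrieved docs into the QA prompt and generate an answer.
    chain = load_qa_chain(ChatOpenAI(temperature=0), chain_type="stuff")
    return chain.run(input_documents=docs, question=question)
```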
See `/api` for local testing instructions.

(4) See the `/nextjs` directory for the Next.js app.
With the API running locally (`uvicorn karpathy_app:app`), the app uses `fetchEventSource` to stream from `http://localhost:8000/karpathy-docs` and `http://localhost:8000/karpathy-stream`. Run the front-end with `npm run dev`.
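For quick local testing without the Next.js app, the streaming endpoint can also be consumed from Python. This sketch mirrors what `fetchEventSource` does on the front-end; the `query` parameter name is an assumption, not taken from the repo:

```python
# Hypothetical Python client for the /karpathy-stream endpoint, mirroring the
# front-end's fetchEventSource usage. Assumes the API is running locally via
# `uvicorn karpathy_app:app`; the `query` parameter name is an assumption.
import urllib.parse
import urllib.request

def stream_answer(question: str, host: str = "http://localhost:8000"):
    """Yield server-sent `data:` payloads from the streaming endpoint."""
    url = f"{host}/karpathy-stream?query={urllib.parse.quote(question)}"
    with urllib.request.urlopen(url) as resp:
        for raw_line in resp:  # HTTPResponse iterates line by line
            line = raw_line.decode("utf-8").strip()
            if line.startswith("data:"):
                yield line[len("data:"):].strip()
```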