Gemini 1.5 Pro: 1M-token input context. It can process large amounts of data at once, including 1 hour of video, 9.5 hours of audio, codebases with over 30,000 lines of code, or over 700,000 words.
Note: a later part of the same page covers a 2M-token context.
ALSO:
[*] A token is equivalent to about 4 characters for Gemini models; 100 tokens are about 60-80 English words.
TODO: Display a summary from my own calculations (e.g. cost for GPT-4 for a given number of pages) using a live calculator.
TODO: Incorporate info from the sources below into my calculations and output measures:
https://docs.anthropic.com/en/docs/about-claude/models
https://ai.google.dev/gemini-api/docs/models/gemini#token-size
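The conversions above (1 token ≈ 4 characters, 100 tokens ≈ 60-80 English words) can be turned into a small calculator sketch. The words-per-token midpoint (0.7), the words-per-page figure (500), and the $/1M-token rate are all placeholder assumptions for illustration, not real published prices; the live calc should pull current rates from the provider pages listed above.

```python
# Rough token/word/page/cost calculator sketch.
# Ratios from the Gemini docs note above:
#   1 token ~ 4 characters; 100 tokens ~ 60-80 English words.
# WORDS_PER_PAGE and the price used below are ASSUMPTIONS, not real figures.

WORDS_PER_TOKEN = 0.7   # midpoint of the 0.6-0.8 range above
WORDS_PER_PAGE = 500    # assumption: a typical single-spaced page

def tokens_to_words(tokens: int) -> float:
    """Approximate English word count for a given token count."""
    return tokens * WORDS_PER_TOKEN

def tokens_to_pages(tokens: int) -> float:
    """Approximate page count, under the words-per-page assumption."""
    return tokens_to_words(tokens) / WORDS_PER_PAGE

def input_cost_usd(tokens: int, price_per_million_usd: float) -> float:
    """Cost of sending `tokens` input tokens at a $/1M-token rate."""
    return tokens / 1_000_000 * price_per_million_usd

if __name__ == "__main__":
    ctx = 1_000_000  # Gemini 1.5 Pro input context from the note above
    print(f"{ctx:,} tokens ~ {tokens_to_words(ctx):,.0f} words "
          f"~ {tokens_to_pages(ctx):,.0f} pages")
    # Placeholder rate of $10 per 1M input tokens, for illustration only
    print(f"Cost at $10/1M input tokens: ${input_cost_usd(ctx, 10.0):.2f}")
```

At 0.7 words/token, the 1M-token context works out to ~700,000 words, which matches the "over 700,000 words" figure quoted from the Gemini page.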