Closed KasaiHarcore closed 5 months ago
Hi @KasaiHarcore I just tested the llama3 70B, removed the llama 2 instance and noted down that context lengths may be an issue. Thanks for the contribution!
No problem, thanks for your reply. Keep up the good work 😍
I don't know if llama2-70b-4096 is still supported. Groq's docs say it's not, while litellm's say otherwise.
Can you confirm whether it's supported by running this code? I can't do it myself because I don't have a key.
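Something like the following minimal sketch could check it, assuming litellm is installed and a `GROQ_API_KEY` is set in the environment (the function name `check_model` and the error-handling details are illustrative, not from the original comment; the model id `groq/llama2-70b-4096` follows litellm's provider-prefix convention for Groq):

```python
# Hedged sketch: probe whether Groq still serves llama2-70b-4096 via litellm.
# Requires: pip install litellm, and GROQ_API_KEY exported in the environment.
import os

def check_model(model: str = "groq/llama2-70b-4096") -> str:
    """Return 'supported' if a one-token completion succeeds,
    'no key' if GROQ_API_KEY is missing, else the error text."""
    if not os.environ.get("GROQ_API_KEY"):
        return "no key"  # cannot probe the API without a key
    try:
        import litellm
        litellm.completion(
            model=model,
            messages=[{"role": "user", "content": "ping"}],
            max_tokens=1,  # cheapest possible probe
        )
        return "supported"
    except Exception as exc:  # e.g. a model_not_found error from Groq
        return f"unsupported: {exc}"

if __name__ == "__main__":
    print(check_model())
```

If the model has been retired, the `completion` call should raise a not-found style error, which the sketch surfaces instead of crashing.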