Closed: ishaan-jaff closed this pull request 3 months ago
```python
import os
import litellm

os.environ["CODESTRAL_API_KEY"] = ""  # set your Codestral API key here

# FIM / text completion (must be awaited inside an async function)
response = await litellm.atext_completion(
    model="text-completion-codestral/codestral-2405",
    prompt="def is_odd(n): \n return n % 2 == 1 \ndef test_is_odd():",
    suffix="return True",  # optional
    temperature=0,         # optional
    top_p=1,               # optional
    max_tokens=10,         # optional
    min_tokens=10,         # optional
    seed=10,               # optional
    stop=["return"],       # optional
)
```
```python
import os
import litellm

os.environ["CODESTRAL_API_KEY"] = ""  # set your Codestral API key here

# Chat completion (must be awaited inside an async function)
response = await litellm.acompletion(
    model="codestral/codestral-latest",
    messages=[
        {
            "role": "user",
            "content": "Hey, how's it going?",
        }
    ],
    temperature=0.0,    # optional
    top_p=1,            # optional
    max_tokens=10,      # optional
    safe_prompt=False,  # optional
    seed=12,            # optional
)
```
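Both examples above use `await`, which only works inside a coroutine. A minimal sketch of how to drive them from a regular script, using a hypothetical stand-in coroutine (`fake_completion` is not part of litellm; it just stands in for `litellm.atext_completion` / `litellm.acompletion` so the snippet runs without an API key):

```python
import asyncio

async def fake_completion(prompt: str) -> str:
    # Stand-in for a litellm async call: yields control like a real HTTP request would.
    await asyncio.sleep(0)
    return prompt + " ok"

async def main() -> str:
    # In the PR's examples you would `await litellm.atext_completion(...)` here instead.
    return await fake_completion("ping")

result = asyncio.run(main())  # run the event loop to completion
print(result)  # prints "ping ok"
```

The same `asyncio.run(main())` pattern applies unchanged when the body awaits the real litellm coroutines.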
Title
FIM/Completions
Chat Completions
Relevant issues
Type
🆕 New Feature 🐛 Bug Fix 🧹 Refactoring 📖 Documentation 🚀 Infrastructure ✅ Test
Changes
[REQUIRED] Testing - Attach a screenshot of any new tests passing locally
If UI changes, send a screenshot/GIF of working UI fixes