Closed — aeneaswiener closed this issue 11 months ago
hi @aeneaswiener - should be possible. I can share a working notebook on using litellm with the text-bison models. I believe there are some tweaks we might need to make to support fine-tuned models; I can do that by EOD if you're planning on using litellm soon.
@aeneaswiener do you use text-bison through VertexAI? Happy to get it set up for you.
Mind sharing a code snippet of how you're calling text-bison and your fine-tuned models? My email is ishaan@berri.ai if you'd prefer sending it privately.
It's not super urgent, but here are some code snippets:
```python
import vertexai
from vertexai.language_models import TextGenerationModel

# Initialise the Vertex AI SDK (project ID redacted).
vertexai.init(project="XXX", location="us-central1")

parameters = {
    "temperature": 0.2,
    "max_output_tokens": 256,
    "top_p": 0.8,
    "top_k": 40,
}

model = TextGenerationModel.from_pretrained("text-bison@001")
response = model.predict(
    """""",  # prompt goes here
    **parameters,
)
print(f"Response from Model: {response.text}")
```
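The `**parameters` call in the snippet relies on Python's keyword-argument unpacking, so the same dict can be reused across model calls. A minimal self-contained illustration (`predict_stub` is a hypothetical stand-in for `model.predict`, just for demonstration):

```python
def predict_stub(prompt, temperature=1.0, max_output_tokens=128, top_p=1.0, top_k=0):
    # Stand-in for model.predict: simply echoes back the settings it received.
    return {
        "prompt": prompt,
        "temperature": temperature,
        "max_output_tokens": max_output_tokens,
        "top_p": top_p,
        "top_k": top_k,
    }

parameters = {
    "temperature": 0.2,
    "max_output_tokens": 256,
    "top_p": 0.8,
    "top_k": 40,
}

# **parameters expands the dict into keyword arguments, equivalent to
# predict_stub("hello", temperature=0.2, max_output_tokens=256, ...).
result = predict_stub("hello", **parameters)
print(result["temperature"])  # 0.2
```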
For the fine-tuned models it's just one additional line, inserted before the `response =` call:

```python
model = model.get_tuned_model("projects/XXX/locations/us-central1/models/XXX")
```
@aeneaswiener added support, this notebook has examples on usage: https://github.com/BerriAI/litellm/blob/main/cookbook/liteLLM_VertextAI_Example.ipynb
Docs: https://docs.litellm.ai/docs/completion/supported#google-vertexai-models
Can you please let us know once you try this?
If you run into any issues when testing, you can reach me at ishaan@berri.ai or +1 412-618-6238.
Incredible, thank you very much!!
As in the title. I know chat-bison is supported, but I'm specifically interested in text-bison, including our own fine-tuned text-bison models. Is this possible, or will it be?