rholinshead opened 6 months ago
The completion params are defined here: https://github.com/google/generative-ai-python/blob/7698c69d7e694b1da9a516db6b843553cec1bc58/google/generativeai/text.py#L132-L144
In general, this is how model parsers work overall:

`config.run()` is called, selecting a specific model parser: https://github.com/lastmile-ai/aiconfig/blob/4055228242640e3ee4bbd73817f927c493bf05b1/python/src/aiconfig/Config.py#L263-L279

`palm.generate_text()` is used as the API call, passing in `completion_data`: https://github.com/lastmile-ai/aiconfig/blob/4055228242640e3ee4bbd73817f927c493bf05b1/python/src/aiconfig/default_parsers/palm.py#L136

This task specifically involves changing the supported completion params to match the ones the code supports, not just those listed in the API docs, which are out of sync with the code: https://cloud.google.com/vertex-ai/docs/generative-ai/start/quickstarts/api-quickstart#request_body
You can test this by running the cookbook we have for PaLM and making sure it still runs as expected.
Part 2: you should also update the TypeScript library for this, in https://github.com/lastmile-ai/aiconfig/blob/4055228242640e3ee4bbd73817f927c493bf05b1/typescript/lib/parsers/palm.ts#L207C17-L207C43
I can give you more details on how to test this later, but for now I think it's a good skill to learn how to figure this out on your own, so I would try doing this:
Going to assign this to @Victor-Su-Ortiz
Currently, `PaLMTextParser` calls `refine_chat_completion_params` to deserialize model settings from the Prompt. `refine_completion_params` is defined but unused. Note that `refine_completion_params`'s implementation is identical to `refine_chat_completion_params` and is likely incorrect, since it defines the following as supported keys:

Despite palm's `generate_text` method supporting the following:
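A fix along these lines could filter the Prompt's model settings against the keyword arguments of `generate_text` itself. The key set below is an assumption based on the `text.py` signature linked at the top of the issue; verify it against the pinned commit before relying on it.

```python
# Sketch of an updated refine_completion_params whose supported keys are
# taken from the (assumed) generate_text signature rather than the docs.

GENERATE_TEXT_KEYS = {
    "model", "prompt", "temperature", "candidate_count",
    "max_output_tokens", "top_p", "top_k",
    "safety_settings", "stop_sequences",
}

def refine_completion_params(model_settings: dict) -> dict:
    """Drop any settings generate_text would reject as unexpected kwargs."""
    return {k: v for k, v in model_settings.items() if k in GENERATE_TEXT_KEYS}
```

An even more robust option is to derive the key set from `inspect.signature(palm.generate_text).parameters` at runtime, so the parser cannot drift out of sync with the library again.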