Open palandovalex opened 4 months ago
The library's architecture simply doesn't allow such changes to be made easily. It turned out to be quite possible to add getters, but setters are useless once the instance has been initialized... You'll have to resort to a small workaround:
```python
from dataclasses import dataclass
from typing import Callable, Optional, Tuple


def build_keyroller(key_pref: str) -> Callable[[object], Optional[str]]:
    """Build a method that round-robins over the tuple of keys stored under
    `<key_pref>_keys`, tracking the current position in `<key_pref>_key_num`."""
    key_tuple_name: str = key_pref + '_keys'
    key_number_name: str = key_pref + '_key_num'

    def roll_key(self) -> Optional[str]:
        keys = getattr(self, key_tuple_name)
        if not keys:
            return None
        number = getattr(self, key_number_name)
        number += 1
        if number >= len(keys):
            number = 0
        setattr(self, key_number_name, number)
        return keys[number]

    return roll_key


@dataclass
class Credentials:
    # openai config
    openai_auth_type: str = "bearer_token"
    _openai_keys: Tuple = ()
    _openai_key_num: int = 0
    get_openai_key = build_keyroller('_openai')
    openai_key: Optional[str] = None

    # azure config
    azure_auth_type: str = "api_key"
    _azure_keys: Tuple = ()
    _azure_key_num: int = 0
    get_azure_key = build_keyroller('_azure')
    azure_key: Optional[str] = None

    # openllm config
    openllm_auth_type: Optional[str] = None
    _openllm_keys: Tuple = ()
    _openllm_key_num: int = 0
    get_openllm_key = build_keyroller('_openllm')
    openllm_key: Optional[str] = None
```
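A quick self-contained demo of the round-robin idea (condensed to just the openai fields; the key values are made-up placeholders, not real credentials). Note that because the counter starts at 0 and is incremented *before* the lookup, the first call returns the second key:

```python
from dataclasses import dataclass
from typing import Callable, Optional, Tuple


def build_keyroller(key_pref: str) -> Callable[[object], Optional[str]]:
    # Same pattern as above: closure captures the attribute names to use.
    key_tuple_name = key_pref + '_keys'
    key_number_name = key_pref + '_key_num'

    def roll_key(self) -> Optional[str]:
        keys = getattr(self, key_tuple_name)
        if not keys:
            return None
        # Advance the counter, wrapping around at the end of the tuple.
        number = (getattr(self, key_number_name) + 1) % len(keys)
        setattr(self, key_number_name, number)
        return keys[number]

    return roll_key


@dataclass
class Credentials:
    _openai_keys: Tuple = ()
    _openai_key_num: int = 0
    # Plain class attribute (no annotation), so dataclass ignores it as a
    # field and it behaves as an ordinary bound method.
    get_openai_key = build_keyroller('_openai')


creds = Credentials(_openai_keys=("sk-aaa", "sk-bbb", "sk-ccc"))
print(creds.get_openai_key())  # sk-bbb (counter moves 0 -> 1 first)
print(creds.get_openai_key())  # sk-ccc
```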
Is your feature request related to a problem? Please describe. I hit a rate limit error. I think that if MemGPT's requests were made with different tokens, this could prevent OpenAI from failing to process them.
Describe the solution you'd like
I store my keys in a CSV file. In my chatbot I load the keys and manage them with a simple "keyroller": after each request it switches to the next key, which prevents rate limit errors.
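The described setup could be sketched roughly like this — a minimal, hypothetical version, since the issue doesn't show the actual code; the one-key-per-row CSV layout and all names here are assumptions:

```python
import csv
import io
from itertools import cycle
from typing import List


def load_keys(csv_text: str) -> List[str]:
    """Read one API key per row from CSV text (first column only)."""
    return [row[0] for row in csv.reader(io.StringIO(csv_text)) if row]


class KeyRoller:
    """Round-robin over a list of keys; call next_key() after each request."""

    def __init__(self, keys: List[str]) -> None:
        if not keys:
            raise ValueError("no keys loaded")
        self._cycle = cycle(keys)

    def next_key(self) -> str:
        return next(self._cycle)


# In practice the text would come from open("keys.csv").read().
keys = load_keys("sk-one\nsk-two\nsk-three\n")
roller = KeyRoller(keys)
```

Each request would then fetch `roller.next_key()`, spreading traffic across all tokens so no single one hits its rate limit.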
Describe alternatives you've considered None, in my opinion.