deeterbleater / deertick

Deertick Agent Management and Integration Toolbox (DAMIT)
MIT License

Error: 'choices' when talking to some models using openrouter #13

Closed: prototype99 closed this issue 1 month ago

prototype99 commented 2 months ago

Some of the models are unable to function through OpenRouter: deertick just outputs Error: 'choices'. Affected models: I-8b, I-70b, I-405b, gemini-flash, dolphin, codestral.

gemini-pro-exp outputs this error only when it tells suggestive/edgy jokes or otherwise goes outside its assigned boundaries.
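The bare Error: 'choices' reads like a swallowed KeyError: when the API returns an error payload instead of a completion, there is no 'choices' key to index, and a broad except handler that prints the exception ends up printing only the missing key name. A minimal sketch, assuming an OpenAI/OpenRouter-style response dict (extract_reply is a hypothetical helper for illustration, not deertick's actual code), of surfacing the provider's message instead:

```python
def extract_reply(resp: dict) -> str:
    """Pull the assistant text out of an OpenAI/OpenRouter-style response dict.

    Hypothetical helper for illustration; not deertick's actual code.
    """
    # Error payloads look like the dumps below: {'error': {'message': ..., 'code': ...}}
    if "error" in resp:
        err = resp["error"]
        raise RuntimeError(f"API error {err.get('code')}: {err.get('message')}")
    # Successful completions carry a 'choices' list; indexing it blindly on an
    # error payload is what produces the bare "Error: 'choices'" reported here.
    return resp["choices"][0]["message"]["content"]
```

With a check like this, the log would show the real 400/404 message from the provider (see the dumps below) instead of the opaque 'choices'.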

prototype99 commented 2 months ago

Pretty sure this implies they're just plain incompatible, unless I'm missing something? Responses below.

I-8b

{'error': {'message': 'Model meta/meta-llama-3-8b-instruct is not available', 'code': 400}, 'user_id': 'user_2lVH8sxqxO3P2ybYIiNt43m90pz'}
{"model": "meta/meta-llama-3-8b-instruct", "max_tokens": 256, "temperature": 0.6, "presence_penalty": 0, "frequency_penalty": 0, "top_k": 50, "top_p": 0.9, "messages": [{"role": "system", "content": "you are a friend having a conversation with me, my name is sophie"}, {"role": "user", "content": "\nhello\n"}]}

I-70b

{'error': {'message': 'Model meta/meta-llama-3-70b-instruct is not available', 'code': 400}, 'user_id': 'user_2lVH8sxqxO3P2ybYIiNt43m90pz'}
{"model": "meta/meta-llama-3-70b-instruct", "max_tokens": 256, "temperature": 0.6, "presence_penalty": 0, "frequency_penalty": 0, "top_k": 50, "top_p": 0.9, "messages": [{"role": "system", "content": "you are a friend having a conversation with me, my name is sophie"}, {"role": "user", "content": "\nhello\n"}]}

I-405b

{'error': {'message': 'Model meta/meta-llama-3.1-405b-instruct is not available', 'code': 400}, 'user_id': 'user_2lVH8sxqxO3P2ybYIiNt43m90pz'}
{"model": "meta/meta-llama-3.1-405b-instruct", "max_tokens": 256, "temperature": 0.6, "presence_penalty": 0, "frequency_penalty": 0, "top_k": 50, "top_p": 0.9, "messages": [{"role": "system", "content": "you are a friend having a conversation with me, my name is sophie"}, {"role": "user", "content": "\nhello\n"}]}

gemini-flash

{'error': {'message': 'Model google/gemini-flash-8b-1.5 is not available', 'code': 400}, 'user_id': 'user_2lVH8sxqxO3P2ybYIiNt43m90pz'}
{"model": "google/gemini-flash-8b-1.5", "max_tokens": 256, "temperature": 0.6, "presence_penalty": 0, "frequency_penalty": 0, "top_k": 50, "top_p": 0.9, "messages": [{"role": "system", "content": "you are a friend having a conversation with me, my name is sophie"}, {"role": "user", "content": "\nhello\n"}]}

dolphin

{'error': {'message': 'No endpoints found for this model.', 'code': 404}, 'user_id': 'user_2lVH8sxqxO3P2ybYIiNt43m90pz'}
{"model": "cognitivecomputations/dolphin-llama-3-70b", "max_tokens": 256, "temperature": 0.6, "presence_penalty": 0, "frequency_penalty": 0, "top_k": 50, "top_p": 0.9, "messages": [{"role": "system", "content": "you are a friend having a conversation with me, my name is sophie"}, {"role": "user", "content": "\nhello\n"}]}

codestral

{'error': {'message': 'Model codestral-latest is not available', 'code': 400}, 'user_id': 'user_2lVH8sxqxO3P2ybYIiNt43m90pz'}
{"model": "codestral-latest", "max_tokens": 256, "temperature": 0.6, "presence_penalty": 0, "frequency_penalty": 0, "top_k": 50, "top_p": 0.9, "messages": [{"role": "system", "content": "you are a friend having a conversation with me, my name is sophie"}, {"role": "user", "content": "\nhello\n"}]}
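Every one of these is an availability error from the upstream rather than a parsing problem in deertick, so one quick check is whether the configured slugs exist in OpenRouter's public model listing at all. A rough sketch, assuming the public GET https://openrouter.ai/api/v1/models listing endpoint and the requests package; the slug list is just the ones copied from the dumps above:

```python
import requests

# Model IDs taken from the error dumps in this issue.
CONFIGURED = [
    "meta/meta-llama-3-8b-instruct",
    "meta/meta-llama-3-70b-instruct",
    "meta/meta-llama-3.1-405b-instruct",
    "google/gemini-flash-8b-1.5",
    "cognitivecomputations/dolphin-llama-3-70b",
    "codestral-latest",
]

def check_models() -> None:
    # OpenRouter publishes its model catalogue; each entry carries an "id" slug.
    listing = requests.get("https://openrouter.ai/api/v1/models", timeout=30).json()
    available = {m["id"] for m in listing.get("data", [])}
    for slug in CONFIGURED:
        status = "ok" if slug in available else "NOT LISTED"
        print(f"{slug}: {status}")

if __name__ == "__main__":
    check_models()
```

If a slug isn't listed there, the 400/404 responses above would be expected no matter how deertick formats the request.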

prototype99 commented 1 month ago

On the branch I'm working on, all of these models are removed anyway, so this may get closed in the near future.