minimaxir / simpleaichat

Python package for easily interfacing with chat apps, with robust features and minimal code complexity.
MIT License

Roles on Messages #13

Closed GabCores closed 1 year ago

GabCores commented 1 year ago

Hi, I have a prompt defined with SYSTEM, USER, and ASSISTANT roles. In the OpenAI playground, any alteration of the content for each role gives undesired results. I was not able to find how to configure roles in this library. Can you help me?

Nice job!

minimaxir commented 1 year ago

The documentation isn't there yet, but you can manually set the messages that start a conversation:

```python
from simpleaichat import AIChat
from simpleaichat.models import ChatMessage

message1 = ChatMessage(role="user", content="Do a thing.")
message2 = ChatMessage(role="assistant", content="I did a thing.")

ai = AIChat(console=False)
ai.default_session.messages = [message1, message2]
```

Unclear if that answers your question. Doing anything with roles beyond that is not recommended.

GabCores commented 1 year ago

OK! That was what I was searching for. All the best, regards!

keyboardAnt commented 1 year ago

> The documentation isn't there yet, but you can manually set the messages that start a conversation:
>
> ```python
> from simpleaichat import AIChat
> from simpleaichat.models import ChatMessage
>
> message1 = ChatMessage(role="user", content="Do a thing.")
> message2 = ChatMessage(role="assistant", content="I did a thing.")
>
> ai = AIChat(console=False)
> ai.default_session.messages = [message1, message2]
> ```
>
> Unclear if that answers your question. Doing anything with roles beyond that is not recommended.

Is there a convenient way to print prompts before they are sent to the API? Thanks!

minimaxir commented 1 year ago

@keyboardAnt What use case is there for that? The only thing really sent to the API is what is provided to the prompt, wrapped as a ChatMessage as above, along with previous messages.
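To make the point above concrete, here is a hypothetical, stdlib-only sketch (not simpleaichat's actual code) of the payload shape the ChatGPT chat completions endpoint receives: the prior messages plus the new prompt, each as a role/content pair.

```python
import json

# Illustrative only: the conversation history, serialized as role/content
# dicts, is what actually travels to the API.
history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Do a thing."},
    {"role": "assistant", "content": "I did a thing."},
]
# The new prompt is wrapped as one more user message and appended.
new_prompt = {"role": "user", "content": "Do another thing."}
payload = {"model": "gpt-3.5-turbo", "messages": history + [new_prompt]}

print(json.dumps(payload, indent=2))
```

Nothing else about the prompt is hidden: what you see in `messages` is what the model sees.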

keyboardAnt commented 1 year ago

> @keyboardAnt What use case is there for that? The only thing really sent to the API is what is provided to the prompt, wrapped as a ChatMessage as above, along with previous messages.

Thanks for replying! To answer your question: as a new user, I would like to know precisely what the library sends to the API. My question is about getting complete visibility, especially into prompts, which is crucial for me. In addition to (or instead of) adding code, another solution could be mentioning your comment above in the docs for clarification.

minimaxir commented 1 year ago

For debugging, you can use `prepare_request()`, which constructs the data to be sent to the ChatGPT API, e.g.

```python
ai.default_session.prepare_request(prompt)[1]
```

This returns the full set of data used as model parameters.
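As a rough, self-contained sketch of the pattern (the function name and fields here are assumptions for illustration, not simpleaichat's exact implementation): a `prepare_request`-style helper typically builds the HTTP headers and the JSON body separately, so indexing `[1]` picks out the body that carries the model parameters.

```python
# Hypothetical stand-in for a prepare_request-style helper.
def prepare_request(prompt, history, model="gpt-3.5-turbo", temperature=0.7):
    headers = {"Content-Type": "application/json"}
    # The body holds the model parameters plus the full message list,
    # ending with the new prompt wrapped as a user message.
    data = {
        "model": model,
        "temperature": temperature,
        "messages": history + [{"role": "user", "content": prompt}],
    }
    return headers, data

# Index [1] selects the body, mirroring prepare_request(prompt)[1] above.
_, data = prepare_request("Do a thing.", history=[])
print(sorted(data.keys()))
```

Printing that body before dispatch gives exactly the visibility asked about.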

That should get covered in more formal documentation. Once a full logging/debugging flow is added, I may add that to it.

This is getting off topic, so closing.