xivind opened 2 days ago
Yes, you can provide inputs for both the `user` and `system` roles, but this is available only in the Chat Completions API (which uses models like `gpt-4` or `gpt-3.5-turbo`), not in the older Completions API (which uses models like `text-davinci-003`).
Here's how the roles work in the Chat Completions API:
- `system`: Used to set the behavior or persona of the assistant. This acts as a guiding instruction for the assistant throughout the conversation.
- `user`: Represents the input or query from the user.
- `assistant`: Contains previous responses from the assistant, used when continuing a conversation or maintaining context.

**Using the `system` and `user` Roles**

In your case, you can define both a `system` message and a `user` query within the `messages` parameter. Here's how to modify your `generate_response` function:
```python
from typing import AsyncGenerator


async def generate_response(query: str, context: str) -> AsyncGenerator[str, None]:
    """
    Generate a streaming response using the OpenAI Chat Completions API.
    """
    messages = [
        {"role": "system", "content": "You are an expert on digitalization in Norway's healthcare and welfare sector. Provide guidance based on laws, regulations, and best practices."},
        {"role": "user", "content": f"Context: {context}\n\nQuestion: {query}"}
    ]

    # `client` is your AsyncOpenAI instance and `settings` your app configuration.
    response = await client.chat.completions.create(
        model=settings.model_name,  # e.g. "gpt-4"
        messages=messages,
        temperature=settings.temperature,
        max_tokens=settings.max_tokens,
        stream=True
    )

    async for chunk in response:
        # In the 1.x SDK, delta is an object, not a dict, so use attribute access.
        content = chunk.choices[0].delta.content
        if content:
            yield content
```
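For completeness, here is a minimal sketch of consuming the stream. The example `context` and `query` values are made up, and `client`/`settings` are assumed to be configured elsewhere in your app, as in the function above:

```python
import asyncio


async def main():
    # Hypothetical example inputs; replace with your own retrieved context and question.
    context = "Relevant excerpts from Norwegian e-health regulations..."
    query = "What are the requirements for access logs on patient data?"

    # Print tokens as they arrive from the streaming generator.
    async for token in generate_response(query, context):
        print(token, end="", flush=True)
    print()


if __name__ == "__main__":
    asyncio.run(main())
```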
**Chat Completions API** (with `gpt-4` or `gpt-3.5-turbo`):

- Supports the `system`, `user`, and `assistant` roles.
- Messages are passed as a structured list via the `messages` parameter.

**Completions API** (with `text-davinci-003`, etc.):

- Only a single prompt string (the `user` input) is supported.
- For `system`-like behavior, prepend instructions manually to the prompt. Example:
```python
prompt = f"""You are an expert on digitalization in Norway's healthcare and welfare sector.
Provide guidance based on laws, regulations, and best practices.
Context: {context}
Question: {query}
"""
```
This isn't as robust as the `system` role in the Chat Completions API, as it doesn't distinguish between instructions and the user's query.
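If you do need the legacy endpoint, a rough sketch of the equivalent call is below. The function name `generate_response_legacy` and the model are my own placeholders (`text-davinci-003` has been retired, so `gpt-3.5-turbo-instruct` stands in here), not something from your code:

```python
from openai import AsyncOpenAI

client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment


async def generate_response_legacy(query: str, context: str) -> str:
    """Single-shot answer via the legacy Completions endpoint (no role separation)."""
    # All instructions and user input share one string, as in the prompt above.
    prompt = (
        "You are an expert on digitalization in Norway's healthcare and welfare sector.\n"
        "Provide guidance based on laws, regulations, and best practices.\n"
        f"Context: {context}\n"
        f"Question: {query}\n"
    )
    response = await client.completions.create(
        model="gpt-3.5-turbo-instruct",  # placeholder for a text-davinci-003-style model
        prompt=prompt,
        max_tokens=512,
    )
    return response.choices[0].text
```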
**What about the `assistant` Role?**

The `assistant` role is optional and used when continuing a multi-turn conversation. For example:
```python
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is digitalization?"},
    {"role": "assistant", "content": "Digitalization is the process of using digital technologies to transform business processes."},
    {"role": "user", "content": "How does it apply in healthcare?"}
]
```
This lets the model "remember" its previous response when formulating the next answer.
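If you later need multi-turn behavior, a common pattern (just a sketch, not taken from your code) is to append each assistant reply and the next user message to the same list before every call:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Running conversation state: the system message stays first, turns get appended.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
]


def ask(user_input: str) -> str:
    """Send one user turn and record the assistant's reply so later turns keep context."""
    messages.append({"role": "user", "content": user_input})
    response = client.chat.completions.create(
        model="gpt-4",  # or whichever model you use elsewhere
        messages=messages,
    )
    answer = response.choices[0].message.content
    messages.append({"role": "assistant", "content": answer})
    return answer


print(ask("What is digitalization?"))
print(ask("How does it apply in healthcare?"))  # the model now sees the earlier exchange
```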
In short: for your use case, you only need the `system` and `user` roles. The `assistant` role is only needed for multi-turn conversations when the model needs prior context from its own responses.
See the prompt engineering guide for examples: https://platform.openai.com/docs/guides/prompt-engineering
Consider splitting the prompt by roles: `system` and `user`.