Closed netandreus closed 8 months ago
There were several issues:

1. Wrong `dimensions` value in the `extractor.messages.embedding` section of my config.yaml. The correct value for my case is 384, according to my LocalAI local model.
2. `text-embedding-ada-002` is hardcoded somewhere in Zep. I have created issue-276.

Here is my correct config.yaml file:
```yaml
llm:
  service: "openai"
  model: "gpt-3.5-turbo-1106"
  openai_endpoint: "http://host.docker.internal:8080/v1"
nlp:
  server_url: "http://host.docker.internal:5557"
memory:
  message_window: 12
extractors:
  documents:
    embeddings:
      enabled: true
      chunk_size: 200
      dimensions: 384
      service: "openai"
  messages:
    embeddings:
      enabled: true
      chunk_size: 200
      dimensions: 384
      service: "openai"
    summarizer:
      enabled: true
      entities:
        enabled: true
      embeddings:
        enabled: true
        chunk_size: 200
        dimensions: 384
        service: "openai"
    entities:
      enabled: true
    intent:
      enabled: true
store:
  type: "postgres"
  postgres:
    dsn: "postgres://postgres:postgres@localhost:5433/?sslmode=disable"
embeddings:
  enabled: true
  dimensions: 384
  model: "openai"
server:
  host: 0.0.0.0
  port: 8000
  web_enabled: true
  max_request_size: 5242880
auth:
  # Set to true to enable authentication
  required: false
  # Do not use this secret in production. The ZEP_AUTH_SECRET environment variable should be
  # set to a cryptographically secure secret. See the Zep docs for details.
  secret: "do-not-use-this-secret-in-production"
data:
  # PurgeEvery is the period between hard deletes, in minutes.
  # If set to 0 or undefined, hard deletes will not be performed.
  purge_every: 60
log:
  level: "debug"
opentelemetry:
  enabled: false
  endpoint:
  attributes: {}
# Custom Prompts Configuration
# Allows customization of extractor prompts.
custom_prompts:
  summarizer_prompts:
    # Anthropic Guidelines:
    # - Use XML-style tags like <current_summary> as element identifiers.
    # - Include {{.PrevSummary}} and {{.MessagesJoined}} as template variables.
    # - Clearly explain model instructions, e.g., "Review content inside <current_summary></current_summary> tags".
    # - Provide a clear example within the prompt.
    #
    # Example format:
    # anthropic: |
    #   <YOUR INSTRUCTIONS HERE>
    #   <example>
    #     <PROVIDE AN EXAMPLE>
    #   </example>
    #   <current_summary>{{.PrevSummary}}</current_summary>
    #   <new_lines>{{.MessagesJoined}}</new_lines>
    #   Response without preamble.
    #
    # If left empty, the default Anthropic summary prompt from zep/pkg/extractors/prompts.go will be used.
    anthropic: |
    # OpenAI summarizer prompt configuration.
    # Guidelines:
    # - Include {{.PrevSummary}} and {{.MessagesJoined}} as template variables.
    # - Provide a clear example within the prompt.
    #
    # Example format:
    # openai: |
    #   <YOUR INSTRUCTIONS HERE>
    #   Example:
    #     <PROVIDE AN EXAMPLE>
    #   Current summary: {{.PrevSummary}}
    #   New lines of conversation: {{.MessagesJoined}}
    #   New summary:`
    #
    # If left empty, the default OpenAI summary prompt from zep/pkg/extractors/prompts.go will be used.
    openai: |
      Review the Current Content, if there is one, and the New Lines of the provided conversation. Create a concise summary
      of the conversation, adding from the New Lines to the Current summary.
      If the New Lines are meaningless, return the Current Content.
      Current summary:
      {{.PrevSummary}}
      New lines of conversation:
      {{.MessagesJoined}}
      New summary:
```
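To sanity-check the `dimensions` value before putting it in config.yaml, you can compare it against what the model actually returns from LocalAI's OpenAI-compatible `/v1/embeddings` endpoint. The snippet below is a minimal sketch: the sample response is illustrative of the OpenAI embeddings response shape (a 384-wide vector, as an all-MiniLM-style model would produce), not an actual capture from my setup.

```python
import json

# In a live setup, the response would come from something like:
#   curl http://host.docker.internal:8080/v1/embeddings \
#     -H "Content-Type: application/json" \
#     -d '{"model": "text-embedding-ada-002", "input": "test"}'
# Here we use an inline sample in the same OpenAI-compatible shape.
sample_response = json.dumps({
    "object": "list",
    "data": [
        {"object": "embedding", "index": 0, "embedding": [0.01] * 384}
    ],
    "model": "text-embedding-ada-002",
})

configured_dimensions = 384  # must match `dimensions` in config.yaml

# Width of the vector the model actually returns.
dims = len(json.loads(sample_response)["data"][0]["embedding"])

assert dims == configured_dimensions, (
    f"config.yaml dimensions ({configured_dimensions}) "
    f"!= model output ({dims})"
)
print(f"Embedding width OK: {dims}")
```

If the assertion fails, Zep will be writing vectors of the wrong width into pgvector, which is exactly the class of error I hit.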
Also, LocalAI should have both models:
Describe the bug
I am trying to use Zep with LocalAI. Thanks for resolving issue-270. Now there are no errors, but metadata is still empty.
To Reproduce
Here is my chat:
And session in Zep:
What should I do to enable metadata and use Zep as short-term memory?
Expected behavior
Metadata for messages.
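To make "metadata for messages" concrete, this is roughly the shape I would expect once the entity and intent extractors have run: extractor output written into each message's `metadata` field. The `system.entities` / `system.intent` layout and the field values below are my assumption from reading the Zep docs, shown here only for illustration.

```python
import json

# A message as it currently comes back from my Zep session:
# the metadata field stays empty.
message = {
    "role": "human",
    "content": "I live in Berlin and work at Acme.",
    "metadata": {},  # always empty in my setup
}

# What I would expect after extraction (illustrative values only;
# the exact metadata layout is an assumption, not verified here).
expected = dict(message, metadata={
    "system": {
        "entities": [{"Label": "GPE", "Name": "Berlin"}],
        "intent": "Stating place of residence and employer",
    },
})

print(json.dumps(expected["metadata"], indent=2))
```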
Logs
Environment (please complete the following information):
docker compose
Note: The Zep server version is available in the Zep server logs at startup:
Starting zep server version 0.11.0-cbf4fe4 (2023-08-30T12:49:03+0000)
Additional context
config.yaml
docker-compose.yaml