1. Customize `user_default_llm` in service_conf.yaml:

   ```yaml
   user_default_llm:
     factory: 'OpenAI'
     api_key: MY_API_KEY
   ```

2. Build the Docker images and start up the server.
3. Log in to RAGFlow.
4. Set up a knowledge base with gpt-4o as the embedding model.
5. Upload a file to the knowledge base and start parsing.

The parsing status shows SUCCESS, but an embedding error is reported:

```
[ERROR]Embedding error:Error code: 403 - {'error': {'message': 'You are not allowed to generate embeddings from this model', 'type': 'invalid_request_error', 'param': None, 'code': None}}
```
I tried again after configuring the OpenAI API key on the Model Providers page; the same error occurred.
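The 403 message is consistent with gpt-4o being a chat model rather than an embedding model, so the embeddings endpoint rejects it regardless of which API key is configured. As an illustration only (this helper and its prefix list are hypothetical, not RAGFlow code), a local guard could flag such a model selection before any API call is made:

```python
# Hypothetical guard, not part of RAGFlow: reject model names that are not
# OpenAI embedding models before the embeddings endpoint is ever called.
# The prefix list is an assumption based on OpenAI's embedding model naming.
EMBEDDING_MODEL_PREFIXES = ("text-embedding-",)

def is_embedding_model(model_name: str) -> bool:
    """Return True if the name looks like an OpenAI embedding model."""
    return model_name.startswith(EMBEDDING_MODEL_PREFIXES)

print(is_embedding_model("text-embedding-3-small"))  # True
print(is_embedding_model("gpt-4o"))                  # False: gpt-4o is a chat model
```

With a check like this, choosing gpt-4o as the embedding model would fail fast in the UI or config validation instead of surfacing as a 403 only after parsing completes.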
Expected behavior
Successful parsing.
Steps to reproduce
See Actual behavior.
Additional information
Error message:

```
[ERROR]Embedding error:Error code: 403 - {'error': {'message': 'You are not allowed to generate embeddings from this model', 'type': 'invalid_request_error', 'param': None, 'code': None}}
```
I've verified that my OpenAI API key works and has credit.
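For anyone triaging the log, note that the payload after the 403 status code is a Python dict repr (single quotes, `None`) rather than JSON, so it can be decoded safely with `ast.literal_eval` when pulling the error type out of log lines (a sketch; the payload string is copied from the log above):

```python
import ast

# Copied verbatim from the error message above; it is a Python dict repr,
# not JSON, so ast.literal_eval is the safe way to parse it.
log_payload = ("{'error': {'message': 'You are not allowed to generate embeddings "
               "from this model', 'type': 'invalid_request_error', 'param': None, "
               "'code': None}}")

payload = ast.literal_eval(log_payload)
print(payload["error"]["type"])  # invalid_request_error
```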
Is there an existing issue for the same bug?
Branch name
main
Commit ID
unknown
Other environment information
Actual behavior

See the description at the top of this report.