Closed by Hec-gitHub 2 days ago
Modify LLM_MODEL and the corresponding endpoint URL and API key in the .env file to point at the model you want. If that model is not supported, you'll need to mock up and write some implementation files yourself.
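As a rough sketch, the relevant `.env` entries might look like the following. The key names here are illustrative assumptions, not confirmed DB-GPT settings; verify them against the project's `.env.template`:

```shell
# Illustrative .env sketch -- key names are assumptions,
# check DB-GPT's .env.template for the exact variables.
LLM_MODEL=proxyllm
PROXY_SERVER_URL=https://api.example.com/v1/chat/completions
PROXY_API_KEY=your-api-key-here
```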
I understand now. By examining the implementation in '/DB-GPT/dbgpt/storage/knowledge_graph/knowledge_graph.py', it is indeed necessary to customize some implementation files to support a proxy LLM.
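To illustrate the kind of customization being discussed, here is a hedged sketch of a minimal proxy-LLM client that code like knowledge_graph.py could call in place of a local model. The class and method names (`ProxyLLMClient`, `build_request`) are hypothetical for illustration and are not DB-GPT's actual API; the payload follows the common OpenAI-style chat format:

```python
# Hypothetical sketch of a proxy-LLM client; names are illustrative,
# not DB-GPT's real interfaces.
from dataclasses import dataclass


@dataclass
class ProxyLLMClient:
    model: str       # model name the proxy endpoint expects
    api_base: str    # proxy endpoint URL (from .env)
    api_key: str     # credential for the proxy (from .env)

    def build_request(self, prompt: str) -> dict:
        """Assemble an OpenAI-style chat payload for the proxy endpoint."""
        return {
            "model": self.model,
            "messages": [{"role": "user", "content": prompt}],
        }


client = ProxyLLMClient(
    model="gpt-3.5-turbo",
    api_base="https://api.example.com/v1",
    api_key="sk-...",
)
payload = client.build_request(
    "Extract triplets from: DB-GPT supports knowledge graphs."
)
```

A real integration would additionally send `payload` to `api_base` with the `api_key` in the request headers and parse the response; that wiring is omitted here.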
This issue has been marked as stale because there has been no activity for over 30 days.
This issue has been closed because it has been marked as stale and there has been no activity for over 7 days.
Search before asking
Description
Can a proxy LLM be supported when creating a knowledge base using a graph database? Which code files need to be modified to use graph databases with a proxy LLM? For example: /DB-GPT/dbgpt/storage/knowledge_graph/knowledge_graph.py
Use case
When using a graph database to query content, you can freely switch between proxy LLMs.
Related issues
None
Feature Priority
None
Are you willing to submit PR?