SeaJungg opened this issue 1 year ago
Hello :) What do you think about adding a feature to choose the embedding model? I know the best-fit way to query with chatgpt-retrieval-plugin is the OpenAI embedding API, but due to its degraded performance on Korean, I built my vector DB with KoBERT sentence transformers (which is not an OpenAI engine; ref: https://github.com/SKTBrain/KoBERT). Naturally, I also need to turn the query sentence into embeddings with the same model, locally. There is demo code here: https://github.com/openai/chatgpt-retrieval-plugin/compare/main...SeaJungg:chatgpt-retrieval-plugin:ko-embedding. I didn't design the feature to select among embedding models, only to import a local embedding model, but I hope you understand my intention...
Have a good day!
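For anyone landing here, the idea above could be sketched as a pluggable embedding step: the plugin picks either the OpenAI API or a local model, and both indexing and querying go through the same function so query vectors live in the same space as the stored vectors. This is only a rough sketch, not the linked branch's actual code; the names (`EMBEDDING_PROVIDER`, `get_embeddings`) are assumptions, and the local backend is stubbed with a toy encoder so the snippet stays self-contained (a real setup would call e.g. `sentence_transformers.SentenceTransformer(...).encode(texts)` with a KoBERT-based model).

```python
import os
from typing import Callable, List

def _openai_embeddings(texts: List[str]) -> List[List[float]]:
    # Placeholder for the plugin's existing OpenAI embedding call.
    raise NotImplementedError("requires an OpenAI API key")

def _local_embeddings(texts: List[str]) -> List[List[float]]:
    # In a real setup this would invoke a local model, e.g.:
    #   from sentence_transformers import SentenceTransformer
    #   model = SentenceTransformer("<some KoBERT-based model>")
    #   return model.encode(texts).tolist()
    # A toy stand-in keeps the sketch runnable without downloads:
    return [[float(len(t)), float(sum(map(ord, t)) % 997)] for t in texts]

def get_embeddings(texts: List[str]) -> List[List[float]]:
    # Choose the backend once; use it for BOTH document indexing and
    # query-time embedding, so the vectors are comparable.
    provider = os.environ.get("EMBEDDING_PROVIDER", "openai")
    backends: Callable[[List[str]], List[List[float]]] = {
        "openai": _openai_embeddings,
        "local": _local_embeddings,
    }[provider]
    return backends(texts)
```

The point of routing everything through one `get_embeddings` is exactly the issue raised above: if documents were embedded with KoBERT but queries with the OpenAI API, nearest-neighbor search in the vector DB would be meaningless.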
Hi SeaJungg! I have the same question as you: I also want to use a local model to generate embeddings. Did your demo code work fine for you? Does it also run smoothly with the ChatGPT plugin? Thank you!