Closed: objecti0n closed this 1 day ago
I'm not the author, but I guess #68 means you can bring in external API usage? Do you want to create an internal API?
Hi @objecti0n, thanks for your interest in our work! As @hsz0403 correctly mentioned (thanks!), you may refer to #68 and interact with those API models using Python. You can then use the models in LeanCopilot through https://github.com/lean-dojo/LeanCopilot?tab=readme-ov-file#bring-your-own-model. We welcome contributions! Please feel free to open a PR to add support for new models. A main entry point should be the `python` folder in this repo.
I'm also willing to help with a PR; there are a lot of decoder-only models for LeanCopilot to choose from, including your great work :) @objecti0n
The code in this repo only uses Transformers for generation, which could make it hard to deploy large models. I am considering using the OpenAI API, which would let users call ChatGPT or use vLLM to deploy any remote or local model.
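To make the idea concrete, here is a minimal sketch of what such an OpenAI-compatible client could look like on the Python side. Everything specific here is an assumption for illustration, not LeanCopilot's actual interface: the endpoint URL (vLLM's default `/v1/completions` on port 8000), the model id, and the ReProver-style `[GOAL]…[PROOFSTEP]` prompt format.

```python
import json
import urllib.request

# Assumed endpoint: vLLM serves an OpenAI-compatible API at this
# path by default (`vllm serve <model>` on port 8000).
API_URL = "http://localhost:8000/v1/completions"

def build_tactic_request(goal: str, n: int = 4) -> dict:
    """Build a completion request asking for n candidate tactics."""
    return {
        "model": "internlm/internlm2-math-plus-1_8b",  # assumed model id
        "prompt": f"[GOAL]{goal}[PROOFSTEP]",  # assumed prompt format
        "n": n,               # number of candidate tactics
        "temperature": 0.7,   # some diversity among candidates
        "max_tokens": 128,
    }

def suggest_tactics(goal: str) -> list[str]:
    """Send the request to a running server and collect the candidates."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_tactic_request(goal)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return [choice["text"].strip() for choice in body["choices"]]
```

Because the request/response schema is the same for OpenAI, vLLM, and Ollama's OpenAI-compatible endpoint, swapping backends would mostly mean changing `API_URL` and the model name.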
Thank you both for your willingness to contribute! I am converting this issue to a discussion; please feel free to open a PR. Let me know when things are ready for review or if you encounter any problems.
If it is not possible now, I would like to contribute and add support for internlm-math-plus-1.8B via the vLLM/Ollama/OpenAI API.