lean-dojo / LeanCopilot

LLMs as Copilots for Theorem Proving in Lean
https://leandojo.org
MIT License
852 stars 77 forks

Is it possible to bring your own model via the vLLM/Ollama/OpenAI API? #90

Closed objecti0n closed 1 day ago

objecti0n commented 4 days ago

If it is not possible now, I would like to contribute support for internlm-math-plus-1.8B via the vLLM/Ollama/OpenAI API.

hsz0403 commented 3 days ago

I'm not the author, but I guess #68 means you can bring in an external model? Do you want to create an internal API?

Peiyang-Song commented 3 days ago

Hi @objecti0n, thanks for your interest in our work! As @hsz0403 correctly mentioned (thanks!), you may refer to #68 and interact with those API models using Python. You can then use the models in LeanCopilot through https://github.com/lean-dojo/LeanCopilot?tab=readme-ov-file#bring-your-own-model. We welcome contributions! Please feel free to open a PR to add support for new models. The main entry point should be the python folder in this repo.

hsz0403 commented 3 days ago

I'm also willing to help with a PR for this; there are a lot of decoder-only models for LeanCopilot to choose from, including your great work :) @objecti0n

objecti0n commented 2 days ago

> I'm also willing to help with a PR for this; there are a lot of decoder-only models for LeanCopilot to choose from, including your great work :) @objecti0n

The code in this repo only uses Transformers for generation, which makes it hard to deploy large models. I am considering using the OpenAI API format, which would let users run ChatGPT or use vLLM to deploy any remote or local model.
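A minimal sketch of that idea: route tactic generation through an OpenAI-compatible backend, which both vLLM and Ollama can expose. Everything here is an assumption for illustration: the function names (`suggest_tactics`, `build_messages`), the prompt wording, and the model ID are hypothetical and are not part of LeanCopilot's actual interface.

```python
"""Sketch: tactic generation through an OpenAI-compatible API.

Assumes a server exposing the OpenAI-compatible /v1 endpoint, e.g.
one started with vLLM. All names below are hypothetical examples,
not LeanCopilot's real interface.
"""
from typing import Callable, List


def build_messages(goal: str) -> List[dict]:
    """Wrap a Lean goal state in a chat-style prompt."""
    return [
        {"role": "system",
         "content": "You suggest Lean 4 tactics. Reply with one tactic per line."},
        {"role": "user", "content": f"Goal:\n{goal}\nSuggest tactics:"},
    ]


def suggest_tactics(goal: str, complete: Callable[[List[dict]], str]) -> List[str]:
    """Generate tactic candidates. `complete` abstracts the backend
    (OpenAI, vLLM, Ollama), so a local stub can be injected for testing."""
    reply = complete(build_messages(goal))
    return [line.strip() for line in reply.splitlines() if line.strip()]


# Real backend (not exercised here): the official `openai` client pointed
# at a local vLLM server, whose model name would match whatever you served.
# from openai import OpenAI
# client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")
# def openai_complete(messages):
#     resp = client.chat.completions.create(
#         model="internlm-math-plus-1.8b",  # hypothetical served-model name
#         messages=messages)
#     return resp.choices[0].message.content

if __name__ == "__main__":
    # Offline demo with a stubbed backend instead of a live server.
    stub = lambda messages: "simp\nring\nomega"
    print(suggest_tactics("⊢ a + b = b + a", stub))  # prints ['simp', 'ring', 'omega']
```

Keeping the backend behind a plain callable is one way to let the same code path serve a remote ChatGPT-style API and a locally deployed vLLM or Ollama model.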

Peiyang-Song commented 1 day ago

Thank you both for your willingness to contribute! I am converting this issue to a discussion; please feel free to open a PR. Let me know when things are ready for review or if you encounter any problems.