PJLab-ADG / LimSim

LimSim & LimSim++: Integrated traffic and autonomous driving simulators with (M)LLM support
https://pjlab-adg.github.io/LimSim/
GNU General Public License v3.0

Can this simulator run with a local Ollama model? #23

Closed: baibingren closed this issue 1 month ago

baibingren commented 3 months ago

I deployed gemma:2b locally with Ollama. Can I run this simulator with a local model?

Fdarco commented 3 months ago

Thanks for using LimSim++! If you have deployed an LLM locally, you can refer to our ExampleLLMAgentCloseLoop.py code. In that script we build a GPT-4-based driver agent with LangChain; you can look up how LangChain calls a local gemma:2b model and then build your own driver agent following our code. I found the following reference link on the web for you:
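A minimal sketch of the suggested approach, assuming Ollama is serving gemma:2b on its default local endpoint and that the driver agent in ExampleLLMAgentCloseLoop.py is built around a LangChain chat model (the `SYSTEM_PROMPT` and `observation` strings below are placeholders, not LimSim++ internals):

```python
# Sketch: swapping the ChatOpenAI-based driver agent for a local Ollama model
# via LangChain's ChatOllama wrapper. Requires `ollama pull gemma:2b` and a
# running Ollama server (default: http://localhost:11434).

from langchain_community.chat_models import ChatOllama
from langchain_core.messages import HumanMessage, SystemMessage

# Local chat model; temperature 0 for more deterministic driving decisions.
llm = ChatOllama(model="gemma:2b", temperature=0.0)

# Placeholder prompt and observation, standing in for whatever the real
# driver agent passes to the model.
SYSTEM_PROMPT = "You are a driver agent. Decide the next driving action."
observation = "Ego vehicle at 10 m/s; lead vehicle 30 m ahead is braking."

response = llm.invoke([
    SystemMessage(content=SYSTEM_PROMPT),
    HumanMessage(content=observation),
])
print(response.content)
```

The rest of the agent (prompt construction, memory, decision parsing) would follow the structure in ExampleLLMAgentCloseLoop.py, with only the model object replaced.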

github-actions[bot] commented 1 month ago

This issue has been marked as stale due to inactivity.