PJLab-ADG / LimSim

LimSim & LimSim++: Integrated traffic and autonomous driving simulators with (M)LLM support
https://pjlab-adg.github.io/LimSim/
GNU General Public License v3.0

Can this simulator run with a local Ollama model? #23

Closed: baibingren closed this issue 6 months ago

baibingren commented 8 months ago

I deployed gemma:2b locally with Ollama. Can I run this simulator with a local model?

Fdarco commented 8 months ago

Thanks for using LimSim++. If you have deployed an LLM locally, you can refer to our ExampleLLMAgentCloseLoop.py code. In that file we built a GPT-4 based driver agent using LangChain; you can look up how LangChain calls a local gemma:2b model and then build your own driver agent following our code. I found the following reference link on the web for you:
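As a rough illustration (not taken from ExampleLLMAgentCloseLoop.py or from the linked reference), a minimal sketch of calling a local Ollama model through LangChain might look like the following. It assumes an Ollama server is running locally with gemma:2b pulled and that the langchain-community package is installed; the temperature and prompt are placeholders.

```python
# Minimal sketch: querying a local Ollama model via LangChain.
# Assumes `ollama serve` is running (default endpoint http://localhost:11434)
# and `ollama pull gemma:2b` has been done beforehand.
from langchain_community.llms import Ollama

# Point LangChain at the local Ollama model instead of a cloud GPT-4 endpoint.
llm = Ollama(model="gemma:2b", temperature=0.0)

# Toy driving prompt; a real driver agent would build its prompt from the
# simulator's scenario description, as the LimSim++ example agent does.
prompt = (
    "You are driving an ego vehicle. The vehicle ahead is braking. "
    "Choose one action: ACCELERATE, KEEP, or DECELERATE, and explain briefly."
)

print(llm.invoke(prompt))
```

The idea is that the LLM backend is swappable: once the local model answers prompts through LangChain, the rest of the driver-agent loop from the example code can stay largely the same.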

github-actions[bot] commented 6 months ago

This issue has been automatically marked as stale because it has had no recent activity.