OpenRobotLab / GRUtopia

GRUtopia: Dream General Robots in a City at Scale
https://grutopia.github.io

NPC in local? #9

Closed: mialana closed this issue 1 month ago

mialana commented 1 month ago

Hello! I am curious whether there are plans to eventually make NPC interaction available in the local GRUtopia environment. Thanks!

Boyu-Mi commented 1 month ago

Thanks for your interest. The LLM inference code is at https://github.com/OpenRobotLab/GRUtopia/blob/6e20ccdff67faa4a485c93528b925829477a6ffb/grutopia/npc/llm_caller.py#L82. You can modify `api_base_url` if you are running a local LLM server, or replace the call with your local LLM's inference code. Feel free to reach out if you have other questions.
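For illustration, here is a minimal sketch of pointing the caller at a local OpenAI-compatible server (e.g. vLLM or llama.cpp). Only `api_base_url` comes from llm_caller.py; the port, endpoint path, model name, and response handling below are assumptions, not GRUtopia's actual code.

```python
# Hypothetical sketch: routing NPC dialogue requests to a local
# OpenAI-compatible server instead of a hosted API. Only `api_base_url`
# is taken from llm_caller.py; everything else here is illustrative.
import requests

api_base_url = "http://localhost:8000/v1"  # local vLLM / llama.cpp server

def chat(prompt: str, model: str = "local-model") -> str:
    # Most local servers expose the OpenAI-style /chat/completions route.
    resp = requests.post(
        f"{api_base_url}/chat/completions",
        json={
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat("Introduce yourself as an NPC in this scene."))
```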

mialana commented 1 month ago

Ok, thank you. And if I want to host the WebUI without using Docker, is there a quick way to do this? I have Isaac Sim installed locally.

uzuku commented 1 month ago

> And if I want to host the WebUI without using Docker, is there a quick way to do this?

@mialana Currently there's no quick way to do that: nginx is required to serve the WebUI frontend, and we believe that setting it up with Docker is the simpler route for most researchers. If a local nginx suits you better, refer to the Dockerfile for the settings. You can start the WebUI backend with:

python ./webui/main.py
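(Not the maintainers' setup, just a rough dev-only sketch: the built frontend could also be served with Python's standard library instead of nginx. The `webui/dist` build directory is an assumption, and this skips any proxy rules the Docker nginx config defines, so check the Dockerfile before relying on it.)

```python
# Hypothetical dev-only fallback: serve the built WebUI frontend with
# Python's http.server instead of nginx. The directory is an assumption;
# check the Dockerfile for where the frontend build actually lands, and
# note this does not reproduce nginx's proxying to the backend.
from functools import partial
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

handler = partial(SimpleHTTPRequestHandler, directory="webui/dist")
ThreadingHTTPServer(("127.0.0.1", 8080), handler).serve_forever()
```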

By the way, the WebUI process in the container is able to communicate with a locally installed Isaac Sim, but some modification is necessary to make WebRTC work properly:

https://github.com/OpenRobotLab/GRUtopia/blob/6e20ccdff67faa4a485c93528b925829477a6ffb/Dockerfile#L16-L18

You can apply it to the locally installed Isaac Sim and then start only the WebUI process in the Docker container. The two should be able to work with each other.
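The specific patch lives in the Dockerfile lines linked above. As general background only, and not the GRUtopia-specific change, Isaac Sim's documented way of enabling WebRTC livestreaming from a headless instance looks roughly like this:

```python
# General Isaac Sim WebRTC livestream setup, following NVIDIA's
# documentation. Background context only; this is NOT the specific
# modification made in the GRUtopia Dockerfile lines referenced above.
from omni.isaac.kit import SimulationApp

# Start Isaac Sim headless so it can be viewed through the browser client.
simulation_app = SimulationApp({"headless": True})

from omni.isaac.core.utils.extensions import enable_extension

# Draw the mouse cursor in the stream and enable the WebRTC
# browser-streaming extension.
simulation_app.set_setting("/app/window/drawMouse", True)
enable_extension("omni.services.streamclient.webrtc")
```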