Dsobh / explainable_ros


Issue between explainable_ros and llama_ros #4

Open kyle-redyeti opened 3 weeks ago

kyle-redyeti commented 3 weeks ago

I was super excited to find this repo! I'm having a bit of trouble running it, though; it looks like an issue with launching llama_cpp.

I am running it on a Jetson Xavier in a ROS Humble Docker container. The only oddity I saw while building was two warnings about the method used to install Python packages, but I THINK it all built correctly (this was for the explainable_ros build). I have been having issues running just llama_ros (https://github.com/mgonzs13/llama_ros) by itself as well, so potentially it is just an issue with the newest version of that project.

This was the only thing in the log, and it appears to match what was printed to the screen:

1717876327.2159364 [INFO] [launch]: All log files can be found below /root/.ros/log/2024-06-08-19-52-07-179907-5c48896c5719-4909
1717876327.2197232 [INFO] [launch]: Default logging verbosity is set to INFO
1717876328.8688183 [ERROR] [launch]: Caught exception in launch (see debug for traceback): create_llama_launch() got an unexpected keyword argument 'stop'

root@5c48896c5719:~/ros2_ws# ros2 launch explicability_bringup explicability_ros.launch.py
[INFO] [launch]: All log files can be found below /root/.ros/log/2024-06-08-19-52-07-179907-5c48896c5719-4909
[INFO] [launch]: Default logging verbosity is set to INFO
[ERROR] [launch]: Caught exception in launch (see debug for traceback): create_llama_launch() got an unexpected keyword argument 'stop'
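
If I read that error correctly, the bringup launch file is passing a stop keyword argument that the current create_llama_launch() in llama_ros no longer accepts. Below is a rough sketch of the kind of call that would trigger it; the import path and everything other than the stop keyword are assumptions for illustration, not the verified llama_ros API.

from launch import LaunchDescription
from llama_bringup.utils import create_llama_launch  # import path assumed


def generate_launch_description():
    # Hypothetical excerpt of explicability_ros.launch.py: newer llama_ros
    # versions raise "got an unexpected keyword argument 'stop'" as soon as
    # the launch description is built with this keyword present.
    return LaunchDescription([
        create_llama_launch(
            # ... model parameters elided ...
            stop="<STOP>",  # keyword no longer accepted by create_llama_launch()
        )
    ])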

mgonzs13 commented 3 weeks ago

Hey @kyle-redyeti, I have updated the create_llama_launch for the new versions of llama_ros. Btw, are you using CUDA inside the docker of the Jetson Xavier to run llama_ros?

kyle-redyeti commented 3 weeks ago

That is my intention... I don't remember if I had the correct parameters in my docker run command, but the container was supposed to have the correct packages. I will give it a try later today!


kyle-redyeti commented 2 weeks ago

It looks like I am getting a different error. It is an error I have seen before, related to the startup of the explicability_node:

[explicability_node-1]     def generate_response(self, goal: GenerateResponse.Goal, feedback_cb: Callable = None) -> Tuple[GenerateResponse.Result | GoalStatus]:
[explicability_node-1] TypeError: unsupported operand type(s) for |: 'Metaclass_GenerateResponse_Result' and 'Metaclass_GoalStatus'
[ERROR] [explicability_node-1]: process has died [pid 3536, exit code 1, cmd '/root/ros2_ws/install/explicability_ros/lib/explicability_ros/explicability_node --ros-args -r __node:=explicability_node'].
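
For what it's worth, the | in that return annotation is Python 3.10+ union syntax (PEP 604); on Python 3.8 the annotation is evaluated when the function is defined, which raises exactly this TypeError. Here is a minimal sketch of the failure and a 3.8-compatible spelling; ResultA and ResultB are stand-ins for the real GenerateResponse.Result and GoalStatus classes.

from typing import Tuple, Union


class ResultA:  # stand-in for GenerateResponse.Result
    pass


class ResultB:  # stand-in for GoalStatus
    pass


# Fails at definition time on Python 3.8, because ResultA | ResultB uses
# PEP 604 union syntax that plain classes only support from Python 3.10:
#   def generate_response() -> Tuple[ResultA | ResultB]: ...

# Portable spelling that also works on 3.8:
def generate_response() -> Tuple[Union[ResultA, ResultB]]:
    return (ResultA(),)

# Alternative: putting `from __future__ import annotations` at the top of the
# module defers annotation evaluation, so the 3.10 syntax stops failing at
# definition time (as long as nothing evaluates the annotations at runtime).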

It looks like the llama_node starts up, though...

Steps to reproduce:
$ docker run -it --rm --network=host --runtime=nvidia --gpus=all dustynv/ros:humble-desktop-l4t-r35.4.1
$ mkdir -p ~/ros2_ws/src
$ cd ~/ros2_ws/src
$ git clone --recurse-submodules https://github.com/mgonzs13/llama_ros.git
$ pip3 install -r llama_ros/requirements.txt
$ git clone --recurse-submodules https://github.com/Dsobh/explainable_ros.git
$ cd ~/ros2_ws
$ colcon build
$ source install/setup.bash
$ ros2 launch explicability_bringup explicability_ros.launch.py

I appreciate any help you can provide!

Thanks Again!

Kyle

mgonzs13 commented 2 weeks ago

@kyle-redyeti, which Python version are you using?

kyle-redyeti commented 2 weeks ago

I believe it was 3.8 in the nvidia container. I should be able to build one with 3.10 if that is the issue...
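
For reference, a quick way to confirm which interpreter the container uses and whether it supports the 3.10-style unions (a sketch; I haven't verified which exact Python version this dustynv image ships):

import sys

print(sys.version)                   # likely 3.8.x on Ubuntu 20.04-based L4T images (assumption)
print(sys.version_info >= (3, 10))   # True only where `int | str` is valid at runtime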


mgonzs13 commented 2 weeks ago

I have just removed that Python 3.10 typing syntax. You can try it again if you like. If the error still persists, you can edit the code yourself to remove any remaining 3.10-style typing.

kyle-redyeti commented 2 weeks ago

I will give that a try... If it is better to run on 3.10, I can build a new container. I am sure I will run across more packages that require 3.10 when I add semantic chunking and a semantic router.
