Closed Veria70 closed 7 months ago
Hi @Veria70, we are going to review your code. In any case, we will publish a video this week showing an example of an RL driving-car system based on what you have done, so you can understand what you need to do to make your car drive itself with Unray.
Thank you so much. I am building a track for racing and completing my implementation (UE5 & Python) for the RL system.
Python config:

import numpy as np
# BridgeSpaces comes from the unray-bridge package

obs_config = {
    "space": BridgeSpaces.Box(
        np.array([np.finfo(np.float32).min, 0, 0, 0, 0, 0, 0, 0, 0]),
        np.array([np.finfo(np.float32).max] * 9)
    ),
    "description": "Observations for the car agent"
}

act_config = {
    "space": BridgeSpaces.Box(np.array([-1, -1, 0]), np.array([1, 1, 1])),  # Throttle/Steering/Brake
    "description": "Continuous driving actions"
}
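As a sketch of how the 9-dimensional observation in the config above might be filled on the Python side: the Box bounds allow only the first element to be negative, so one illustrative layout (my assumption, not the plugin's required one) is a signed lateral offset followed by eight non-negative values such as goal distance, speed, and line-trace hit distances.

```python
import numpy as np

def build_observation(signed_track_offset, distance_to_goal, speed, trace_distances):
    """Pack a 9-dim observation matching the Box bounds above.

    Element 0 may be negative (e.g. signed offset from the track center line);
    the remaining 8 are non-negative (goal distance, speed, 6 trace hits).
    This field layout is purely illustrative.
    """
    obs = np.array(
        [signed_track_offset, distance_to_goal, speed, *trace_distances],
        dtype=np.float32,
    )
    assert obs.shape == (9,), "observation must match the 9-dim Box space"
    return obs

# Example: car slightly left of center, 120 units from the goal, 6 trace hits
obs = build_observation(-0.5, 120.0, 14.2, [300.0, 250.0, 180.0, 90.0, 60.0, 45.0])
```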
Best regards.
One more request: could you help us with saving and loading the model? Thanks.
Hi Veria, we are still working on the video tutorial. In the meantime, this is the documentation on how to save a checkpoint with RLlib: https://docs.ray.io/en/latest/rllib/rllib-saving-and-loading-algos-and-policies.html#how-do-i-create-an-algorithm-checkpoint
And this is how you restore the model checkpoint from your disk drive: https://docs.ray.io/en/latest/rllib/rllib-saving-and-loading-algos-and-policies.html#how-do-i-restore-an-algorithm-from-a-checkpoint
Hi! I wanted to check if you were able to use the trace channel in the observation system.
Hi @vahernandezmo @GDiaz16, have you published a tutorial video about RL car racing or something similar? I really need a practical tutorial, either as a document or as a video. Thank you very much. Best regards.
Hi @Veria70, we wanted to ask: what do you need a tutorial on? Is it a specific module inside the plugin, or something on the Python side of things? Please let us know so we can start working on the tutorial. Thanks.
Hi @vahernandezmo, I need documentation about the plugin API and a game-specific example (not Cartpole). You wrote a plugin for a game engine, so please give me a game-specific example with step-by-step setup, along with references to the RLlib documentation. You spent time writing software to make it easier for everyone to implement artificial intelligence in games, but without information and examples tied to an actual game, the process becomes harder, not easier. Thanks.
Hi @Veria70, we are working on a tutorial going from zero to a functional agent car model, step by step. We expect it to be ready next week.
We will let you know when it is uploaded to YouTube.
Hi @GDiaz16, thank you very much. I'm waiting for your tutorial.
Hi @Veria70, here is an update on the tutorial we are working on: the car is already training to complete the track. Let me know if you want anything else shown in the tutorial. https://github.com/Nullspace-Colombia/unray-bridge/assets/16580160/85bbb0a5-75a2-4da3-be9f-f3ac76437bc1
Hi @GDiaz16, it is wonderful. I would be grateful if model inference were also covered in your tutorial. Thank you for your hard work.
Hi @Veria70, just to let you know that the tutorial you requested has been uploaded; go check it out! Sorry for the delay, though. https://youtu.be/O3azZNGmeGo?si=24KDS4sBS6hqF3VU
Thank you so much @GDiaz16
Hi, I want to create an RL-based driving-car system. I used the Cartpole BP and set everything up, but I think something is missing in my implementation: how do I define a negative reward, or use a trace channel in the observation system? I have an RL-based car and a goal actor in my scene; I used the distance between the car and the goal actor and connected that value to the observation function. Thank you for your help. Best regards.
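On the negative-reward part of the question, a common pattern is distance-based reward shaping: reward the change in distance to the goal each step, so moving away yields a negative value, and add fixed penalties and bonuses for terminal events. The following is a minimal sketch with illustrative constants, not the plugin's built-in reward logic.

```python
def compute_reward(prev_distance, distance, collided, reached_goal):
    """Distance-based reward shaping sketch.

    Positive when the car gets closer to the goal, negative when it moves
    away; a large penalty on collision and a bonus on reaching the goal.
    All constants here are illustrative and should be tuned per track.
    """
    reward = float(prev_distance - distance)  # > 0 when approaching, < 0 when retreating
    reward -= 0.01                            # small per-step penalty to discourage stalling
    if collided:
        reward -= 10.0                        # negative reward for hitting a wall/obstacle
    if reached_goal:
        reward += 100.0                       # terminal bonus
    return reward
```

In the UE5 side you would feed `prev_distance`/`distance` from the distance you already compute between the car and the goal actor, and `collided` from a hit event or trace result.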