dusty-nv / rovernet

Deep reinforcement learning-driven robotic controller and navigation library using Torch.

Source code understanding: how to import images? #3

Open yashu88 opened 7 years ago

yashu88 commented 7 years ago

Respected Sir,

I was able to build the source code successfully. Thank you very much for the source code; it is very useful for detailed study for a beginner like me. Sir, I have one question about the source code.

Sir, as per the basic principle, the rover robot performs actions based on rewards and input images. But I don't understand where and how the input images are imported in the given source code.

I checked the files rovernet.lua and video.lua; they contain `image.save('/home/ubuntu/Pictures/rpLIDAR-' .. os.time() .. '.jpg', input_tensor)` and `image.save("/home/ubuntu/test.jpg", img_tensor)`. But do these calls save images to the given folder, or read input images from it?
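(Note: in Torch's `image` package the direction of each call is fixed: `image.save` writes a tensor out to a file, and `image.load` reads a file into a tensor, so both lines quoted above are saving images, not reading them. A minimal sketch, assuming the `image` package is installed:)

```lua
require 'torch'
local image = require 'image'

-- image.save WRITES: it encodes the tensor and stores it on disk
local img = torch.rand(3, 64, 64)          -- fake 3-channel 64x64 image
image.save('/tmp/example.jpg', img)        -- creates /tmp/example.jpg

-- image.load READS: it decodes a file on disk into a new tensor
local loaded = image.load('/tmp/example.jpg', 3, 'float')
print(loaded:size())                       -- 3 x 64 x 64
```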

Sir, kindly help me with a detailed understanding, as I am a beginner but find this topic very interesting and innovative. I want to study the given source code in detail. Thank you.

dusty-nv commented 7 years ago

Hi yashu88, if you inspect rovernet.lua, note this function declaration: `function update_network( input_tensor`

The sensor data is provided from the user in input_tensor. It could be video, LIDAR, or other sensor data that the user provides. It can be trained on practically any data. A reward is also provided by the user program.

The tensors that are available in the update_network function in rovernet.lua are funneled from the user C/C++ interface that you can find under the C/ directory of the project.
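The flow described above can be sketched roughly as follows (a simplified, hypothetical signature — the actual declaration in rovernet.lua differs):

```lua
require 'torch'

-- Hypothetical sketch of the data flow: the user's C/C++ program fills
-- input_tensor with sensor data (camera, LIDAR, ...) and supplies a
-- reward; the Lua side only consumes whatever tensor it is handed.
function update_network( input_tensor, reward )
    -- the learner does not care where the data came from:
    -- any torch.Tensor works as input
    local flattened = input_tensor:view(-1)
    -- ... forward pass, action selection, and training step go here ...
    return flattened:size(1)   -- e.g. report the input dimensionality
end

-- the C/C++ interface would call this with real sensor data;
-- here we simulate it with a random "camera frame" and a reward
local n = update_network(torch.rand(3, 32, 32), 1.0)
print(n)  -- 3072
```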

Also note that I am refactoring rovernet to use my jetson-reinforcement repo, so the learners can be tested/verified on test problems and simulation before the real-world robot.

yashu88 commented 7 years ago

Respected Sir,

Thank you very much for your quick and kind reply. Your guidance is very helpful to me.

Now I can restart my study with the proper direction. Sir, I will also check your jetson-reinforcement GitHub project.

Thank you once again sir.

Thank you and Regards, Yashashree

yashu88 commented 7 years ago

Respected sir,

I have referred to your rovernet GitHub source and, based on that, I am now trying to implement my own code to control a mobile robot's path. I want to prepare a basic program and test it on my robot, so I can clearly understand how DQN works in a real environment.

I used Torch/LuaJIT for the implementation. I have also used a ROS interface for robot interaction. My receiver program is a ROS-compatible C++ program; it receives data (webcam output) from the camera sensor and extracts images. I feed these images to my agent once preprocessing is done. My agent (Mobile_Robot.tar.gz, https://drive.google.com/file/d/0B6U4I2YxnqS1Tjhua1VGbTdIYTQ/view?usp=drive_web) successfully receives this data.

But I got this error: `qlua: ./Agent.lua:245: attempt to index field 'transitions' (a nil value)`, followed by a stack traceback.
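(For context, this Lua error means the field `transitions` was never assigned before line 245 indexed it; in DQN implementations, `transitions` is typically the replay memory, created in the agent's constructor. A minimal reproduction and fix, with hypothetical field names:)

```lua
-- Minimal reproduction of "attempt to index field 'transitions' (a nil value)":
local agent = {}

-- BUG: nothing ever set agent.transitions, so indexing it fails
local ok, err = pcall(function() return agent.transitions.size end)
print(ok, err)   -- false  ...attempt to index field 'transitions'...

-- FIX: initialize the field (in DQN agents this is usually the replay
-- memory, constructed when the agent is created) before it is used
agent.transitions = { size = 0, buffer = {} }
print(agent.transitions.size)  -- 0
```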

Sir, I am very new to this field and work on my own. Can you please clarify my doubt regarding the above-mentioned error?

Thank you and Regards, Yashashree