Closed: strategist922 closed this issue 1 year ago
This project is only about ROS support for Pepper. So whatever you can do with ROS you should, in theory, be able to do with Pepper. In practice there are community solutions for navigation and SLAM with Nav2. ROS 2 support is in the works for NAO, and it should not be too hard to bring it to Pepper right after, but there is no official announcement. This group is community-driven, if driven at all.
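To make that concrete, here is a rough sketch of what the ROS side looks like once the naoqi_driver bridge is up (the ROS 1 bridge, since the ROS 2 port is the part still in flux). The topic names are what I'd expect from a default bridge setup, so treat them as assumptions and check `rostopic list` on your own machine:

```python
#!/usr/bin/env python
# Sketch only: consume Pepper's camera through the ROS bridge and send velocity
# commands back. Topic names are assumptions -- verify them with `rostopic list`
# after launching naoqi_driver.
import rospy
from sensor_msgs.msg import Image
from geometry_msgs.msg import Twist

def on_image(msg):
    # Hand the frame to your vision / ML pipeline here.
    rospy.loginfo("Got a %dx%d camera frame", msg.width, msg.height)

def main():
    rospy.init_node("pepper_ml_demo")
    rospy.Subscriber("/naoqi_driver/camera/front/image_raw", Image, on_image)
    cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)

    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        twist = Twist()  # zero velocity placeholder; a planner / Nav stack would fill this in
        cmd_pub.publish(twist)
        rate.sleep()

if __name__ == "__main__":
    main()
```

Anything Nav2 (or a deep learning model) decides can be funnelled into that same `/cmd_vel` publisher; the robot itself doesn't care where the commands come from.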
Vision, AI, ML and LLM frameworks exist out there, and you should be able to use them on top of a NAOqi or ROS 2 client, but you will have to tinker. I think there are also projects by United Robotics Group to provide academic clients with a Python notebook and AI libraries on Pepper with NAOqi 2.5.
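As a rough illustration of the "tinker on top of NAOqi" route: the snippet below uses the NAOqi 2.5 Python SDK to pull one camera frame off the robot so it can be fed to whatever vision model you run on your PC. The IP address is a placeholder, the camera/resolution/colour-space constants are the standard ALVideoDevice ones, and the model call itself is left as a stub:

```python
# -*- coding: utf-8 -*-
# Sketch only: grab one frame from Pepper over NAOqi 2.5 and hand it to an
# off-robot ML model. Runs on a PC with the Python NAOqi SDK installed.
import numpy as np
from naoqi import ALProxy

PEPPER_IP = "192.168.1.10"   # placeholder -- use your robot's address

video = ALProxy("ALVideoDevice", PEPPER_IP, 9559)
# top camera (0), VGA resolution (2), RGB colour space (11), 5 fps
handle = video.subscribeCamera("ml_client", 0, 2, 11, 5)
try:
    image = video.getImageRemote(handle)
    width, height, raw = image[0], image[1], image[6]
    frame = np.frombuffer(bytearray(raw), dtype=np.uint8).reshape((height, width, 3))
    # frame is now an HxWx3 RGB array -- feed it to TensorFlow or any other
    # framework running on the PC, e.g. predictions = model.predict(frame[None, ...])
    print("Grabbed a %dx%d frame" % (width, height))
finally:
    video.unsubscribe(handle)
```

The heavy lifting (detection, recognition, dialogue) then happens off-robot, and only the results go back to NAOqi (speech, motion, tablet).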
If you need to run all of these features on the robot itself, it's tougher. Personally, I'd upgrade the Pepper to 2.9 and use the official localization and navigation stack (from the Qi SDK).
Hi, I have a real Pepper running NAOqi OS 2.5.5.5. I want it to move around our office freely and safely using its sensors, identify each person it sees, and then hold a conversation with that person (in Traditional Chinese and English) using a deep learning framework such as TensorFlow, CNTK, and so on.
I have tried choregraphe-suite-2.5.5.5 under Windows 10 x64, but it seems Choregraphe cannot implement this kind of complex task, so I want to implement it in Python.
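For example, from Python I can already drive basic behaviours like the short test below (the IP address is just a placeholder, and the Chinese voice needs its language pack installed), but anything involving TensorFlow/CNTK-style models seems to need more than Choregraphe boxes:

```python
# Minimal Python test I can run against my Pepper today (IP is a placeholder).
from naoqi import ALProxy

PEPPER_IP = "192.168.1.10"
tts = ALProxy("ALTextToSpeech", PEPPER_IP, 9559)
tts.setLanguage("English")   # "Chinese" works only if the language pack is installed
tts.say("Hello, I am Pepper.")
```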
I saw the simulation tools at https://github.com/ros-naoqi/ , but there is no other information about developing for the Pepper robot with the AI or ML frameworks mentioned above. Do you have plans to add this kind of functionality to this project?