OpenLCH is an open-source ultra-low-cost humanoid robot designed for experimenting with machine learning methods for robot control.
[!WARNING]
This project is a work in progress (V0.3.1).
We designed our robot in OnShape, trained the PPO model in Isaac Gym, and are now transferring it onto the physical robot. Check out our public roadmap for updates here.
Our goal is to build and deploy a fleet of 20-30 small humanoid robots in the physical world and to create an affordable open-source platform for humanoid research and competitions. The robot design is inspired by the Robotis OP3, while the initiative is inspired by Alex Koch's robot arms.
We built the first version of this humanoid robot in 24 hours at a hackathon on 2024/08/31.
[!NOTE]
Interested in updates, contributing, or building your own mini humanoid? Let us know through our interest form: https://forms.gle/AvDzMEFUYeVNtFvj6
Specifications:

| Height | Weight | DoF |
|---|---|---|
| 50cm | 15lb | 16 (5 DoF per leg, 3 DoF per arm) |
URDF/MJCF: https://kscale.store/file/5b9b5eecb7ffcab1
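To sanity-check the downloaded model, here is a minimal sketch of loading and stepping it with MuJoCo's Python bindings; the filename `stompymicro.xml` is a placeholder for wherever you save the MJCF from the link above.

```python
import mujoco

# Placeholder path: save the MJCF from the link above as stompymicro.xml.
model = mujoco.MjModel.from_xml_path("stompymicro.xml")
data = mujoco.MjData(model)

# Step the passive dynamics for a second of simulated time.
for _ in range(500):
    mujoco.mj_step(model, data)

print(f"{model.nq} position coordinates, {model.nu} actuators")
```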
| Part | Description | Link | Quantity | Total Cost (USD) | Date Decided |
|---|---|---|---|---|---|
| Serial bus servos | STS3215 30KG Serial Bus Servo | Link | x16 | 222.24 | 9/10 |
| Controller | Milk-V | Link | x1 | 10 | TBD |
| Servo driver board | Waveshare Bus Servo Adapter | Link | x1 | 7 | 10/05 |
| IMU | RP2040 MCU board | Link | x1 | 16 | 10/03 |
| Camera | Milk-V CAM-GC2083 | Link | x1 | 4 | 10/05 |
| Battery | RC LiPo | Link | x1 | 33 | Proposed |
| 12V-to-5V converter | 12V to 5V, 3A capacity (may need connectors) | Link | x1 | 3 | 9/24 |
| Microphone | TBD | | x1 | | |
| Molex servo connectors | 5264 3-pin compatible connectors (or extend the servo cables provided with the servos) | Link | x1 | 25 | 10/05 |
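As a quick sanity check once the servo driver board is wired up, the sketch below commands a single STS3215 to mid-range from Python. The Waveshare adapter enumerates as an ordinary USB serial port; the port name `/dev/ttyUSB0`, the 1 Mbps default baud rate, the goal-position register address `0x2A`, and the little-endian byte order are assumptions based on the Feetech STS-series documentation and a typical Linux setup, so verify them against your hardware.

```python
import serial


def sts_write(port: serial.Serial, servo_id: int, address: int, data: bytes) -> None:
    """Send a Feetech WRITE (0x03) instruction packet over the bus."""
    length = len(data) + 3  # instruction + register address + checksum
    payload = bytes([servo_id, length, 0x03, address]) + data
    checksum = (~sum(payload)) & 0xFF  # checksum excludes the 0xFF 0xFF header
    port.write(b"\xff\xff" + payload + bytes([checksum]))


# Assumed port and baud rate; the adapter shows up as a plain serial device.
with serial.Serial("/dev/ttyUSB0", baudrate=1_000_000, timeout=0.1) as port:
    target = 2048  # mid-range of the 0-4095 position scale
    sts_write(port, servo_id=1, address=0x2A, data=target.to_bytes(2, "little"))
```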
WIP, coming soon...
We're currently 3D printing and testing the hardware.
WIP, coming soon...
We're currently writing our firmware for sensors and controllers.
WIP, coming soon...
We're currently working on the robot control stack, which will be written in Rust for performance and safety (and enjoyment). :)
We're using the K-Scale simulation library (based on Isaac Gym) to simulate and train our robot.
Left: URDF Model, Right: Isaac Gym Training
We use NVIDIA Isaac Gym to simulate, train, and test the robot's locomotion, building on the K-Scale simulation library.
Link: https://github.com/jingxiangmo/sim/tree/master
Docs: https://docs.kscale.dev/software/simulation/isaac
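For orientation, this is roughly what the raw Isaac Gym bring-up underneath the library looks like; `robot.urdf` and the simulation parameters are placeholder assumptions, and the K-Scale library wraps all of these steps for you.

```python
from isaacgym import gymapi

gym = gymapi.acquire_gym()

# Placeholder sim parameters: 60 Hz stepping, Z-up world, standard gravity.
sim_params = gymapi.SimParams()
sim_params.dt = 1.0 / 60.0
sim_params.up_axis = gymapi.UP_AXIS_Z
sim_params.gravity = gymapi.Vec3(0.0, 0.0, -9.81)
sim = gym.create_sim(0, 0, gymapi.SIM_PHYSX, sim_params)

# Ground plane plus one environment containing the robot.
plane_params = gymapi.PlaneParams()
plane_params.normal = gymapi.Vec3(0.0, 0.0, 1.0)
gym.add_ground(sim, plane_params)

asset = gym.load_asset(sim, ".", "robot.urdf", gymapi.AssetOptions())
env = gym.create_env(sim, gymapi.Vec3(-1, -1, 0), gymapi.Vec3(1, 1, 1), 1)
actor = gym.create_actor(env, asset, gymapi.Transform(), "stompymicro", 0, 0)

for _ in range(600):  # ten simulated seconds
    gym.simulate(sim)
    gym.fetch_results(sim, True)
```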
To test a policy in MuJoCo, run:
```bash
export MODEL_DIR=sim/resources
python sim/sim2sim.py --load_model examples/standing.pt --embodiment stompymicro
```
The URDF model currently also supports PyBullet via the K-Scale OnShape library: https://docs.kscale.dev/software/onshape
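For example, a quick kinematic sanity check in PyBullet could look like the sketch below, where `robot.urdf` stands in for the file exported through the OnShape library.

```python
import pybullet as p
import pybullet_data

p.connect(p.DIRECT)  # use p.GUI instead for a visual window
p.setAdditionalSearchPath(pybullet_data.getDataPath())
p.setGravity(0, 0, -9.81)

# Ground plane from pybullet_data, then the robot slightly above it.
p.loadURDF("plane.urdf")
robot = p.loadURDF("robot.urdf", basePosition=[0, 0, 0.3])

print(p.getNumJoints(robot), "joints loaded")
p.disconnect()
```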
We use PPO to train the robot to stand and walk. The training is done in Isaac Gym with the K-Scale simulation and training library: https://github.com/jingxiangmo/sim/tree/master.
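For intuition, the clipped surrogate objective at the core of PPO looks roughly like the sketch below; the rollout collection, advantage estimation, and value-function losses live in the sim repo linked above, and the tensor names and default epsilon here are illustrative assumptions rather than the repo's exact code.

```python
import torch


def ppo_clip_loss(
    new_log_probs: torch.Tensor,  # log pi_theta(a|s) under the current policy
    old_log_probs: torch.Tensor,  # log pi_theta_old(a|s) from the rollout
    advantages: torch.Tensor,     # advantage estimates, usually normalized
    clip_eps: float = 0.2,
) -> torch.Tensor:
    """Negative clipped surrogate objective, ready for gradient descent."""
    ratio = torch.exp(new_log_probs - old_log_probs)
    unclipped = ratio * advantages
    clipped = torch.clamp(ratio, 1 - clip_eps, 1 + clip_eps) * advantages
    # Pessimistic bound: take the worse of the two surrogates, then maximize.
    return -torch.min(unclipped, clipped).mean()
```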
Integration of E-VLA will be in V2. For more details, please refer to the E-VLA documentation.
Integration of K-Lang will be in V2. For more details, please refer to the K-Lang documentation.
This project is licensed under the MIT License.
Core contributors:
Open Source Acknowledgment:
Last updated: 2024/09/27