YAHMP is a humanoid general motion tracking policy for the Unitree G1 robot.
The training pipeline builds upon mjlab.
*(Demo video: `g1-yahmp.mp4`)*
First, install uv:

```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```

Then install the project dependencies with:

```bash
uv sync
```

Finally, verify that the YAHMP environments are correctly installed and visible with:

```bash
uv run list_envs | rg YAHMP
```

Download the retargeted OMOMO and AMASS motions used for training from this link, extract the archive, and copy the `g1_omomo_amass_clean` folder into `assets/motions/`.
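After extracting, you can sanity-check the motion folder from Python. This is a minimal sketch, assuming each reference motion is stored as a single `.npz` file (as the deploy script's `--motion-file` argument suggests); it is not part of the YAHMP codebase.

```python
from pathlib import Path


def list_motions(motion_dir: str) -> list[str]:
    """Return the motion names (file stems) of all .npz clips in a folder."""
    return sorted(p.stem for p in Path(motion_dir).glob("*.npz"))


if __name__ == "__main__":
    names = list_motions("assets/motions/g1_omomo_amass_clean")
    print(f"{len(names)} motion clips found")
```

If the list comes back empty, the archive was likely extracted to the wrong location.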
The default motion configuration `src/yahmp/config/g1/motion_data_cfg.yaml` specifies which motions are loaded during training; modify it if you want to use different reference datasets.
Smoke-test the teacher scene with a zero-action agent:

```bash
uv run play Mjlab-YAHMP-Unitree-G1 --agent zero
```

Launch training with:

```bash
uv run train Mjlab-YAHMP-Unitree-G1 --env.scene.num-envs 8192
```

Replay a trained policy exported to ONNX in MuJoCo with:

```bash
uv run python -m yahmp.scripts.deploy.run_yahmp_onnx_mujoco \
  --task-id Mjlab-YAHMP-Unitree-G1 \
  --onnx-path assets/models/g1_yahmp.onnx \
  --motion-file assets/motions/g1_omomo_amass_clean/<motion-name>.npz
```

Utility scripts for data conversion, ONNX export, deployment, evaluation, and workflow helpers live under `src/yahmp/scripts`. See the dedicated guide at `src/yahmp/scripts/README.md`.
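To see what a retargeted motion clip contains, you can peek at an `.npz` file with NumPy. The array names inside the clips depend on the retargeting pipeline and are not documented here, so this sketch only lists keys and shapes:

```python
import numpy as np


def inspect_motion(path: str) -> dict[str, tuple]:
    """Map each array name in a motion .npz file to its shape."""
    with np.load(path) as data:
        return {key: data[key].shape for key in data.files}


# Example (the motion name is a placeholder, not a real file):
# for key, shape in inspect_motion(
#     "assets/motions/g1_omomo_amass_clean/<motion-name>.npz"
# ).items():
#     print(f"{key}: {shape}")
```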
In addition to `Mjlab-YAHMP-Unitree-G1`, this project also includes environments that together form an example Teacher–Student training pipeline:

- `Mjlab-YAHMP-Teacher-Unitree-G1`: privileged teacher example
- `Mjlab-YAHMP-Student-RL+Action-Matching-Unitree-G1`: student trained via RL + Action-Matching distillation
- `Mjlab-YAHMP-Student-RL+KL-Matching-Unitree-G1`: student trained via RL + KL-Matching distillation
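The exact distillation objectives are defined in the training code; as a conceptual sketch only (not the project's implementation, and assuming diagonal-Gaussian policies and the KL(teacher ‖ student) direction), the two student objectives can be written as:

```python
import numpy as np


def action_matching_loss(student_mu: np.ndarray, teacher_mu: np.ndarray) -> float:
    """Action matching: mean-squared error between student and teacher mean actions."""
    return float(np.mean((student_mu - teacher_mu) ** 2))


def kl_matching_loss(
    student_mu: np.ndarray,
    student_std: np.ndarray,
    teacher_mu: np.ndarray,
    teacher_std: np.ndarray,
) -> float:
    """KL matching: KL(teacher || student) for diagonal Gaussians, averaged over dims."""
    var_s, var_t = student_std**2, teacher_std**2
    kl = (
        np.log(student_std / teacher_std)
        + (var_t + (teacher_mu - student_mu) ** 2) / (2.0 * var_s)
        - 0.5
    )
    return float(np.mean(kl))
```

Action matching only pulls the student's mean toward the teacher's, while KL matching also transfers the teacher's action uncertainty.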
Install the dev tools with:

```bash
uv sync --group dev
```

Format and auto-fix the repo with:

```bash
make format
```