LeRobot documentation

Unitree G1



Unitree G1 locomanipulation demo

The Unitree G1 humanoid is now supported in LeRobot! You can teleoperate it, train locomanipulation policies, test in simulation, and more. Both the 29 DoF and 23 DoF variants are supported.


Part 1: Getting Started

Install the Unitree SDK

Follow the unitree_sdk2_python installation guide. Tested with unitree_sdk2py==1.0.1 and cyclonedds==0.10.2:

conda create -y -n lerobot python=3.12
conda activate lerobot
git clone https://github.com/unitreerobotics/unitree_sdk2_python.git
cd unitree_sdk2_python
pip install -e .
cd ..

Install LeRobot

conda install ffmpeg -c conda-forge
conda install -c conda-forge "pinocchio>=3.0.0,<4.0.0"
git clone https://github.com/huggingface/lerobot.git
cd lerobot
pip install -e '.[unitree_g1]'
For now, pinocchio must be installed from conda-forge (not pip) to include the CasADi bindings needed for arm IK.

Test the Installation (Simulation)

The simulation environment has its own dependencies; see the Unitree G1 Mujoco EnvHub page for details.

pip install mujoco loguru msgpack msgpack-numpy
lerobot-teleoperate \
  --robot.type=unitree_g1 \
  --robot.is_simulation=true \
  --teleop.type=unitree_g1 \
  --teleop.id=wbc_unitree \
  --robot.cameras='{"global_view": {"type": "zmq", "server_address": "localhost", "port": 5555, "camera_name": "head_camera", "width": 640, "height": 480, "fps": 30, "warmup_s": 5}}' \
  --display_data=true \
  --robot.controller=GrootLocomotionController

This will launch a MuJoCo sim instance for the G1. You can connect a gamepad to your machine before launching in order to control the robot’s locomotion in sim. We support both HolosomaLocomotionController and GrootLocomotionController via --robot.controller.

  • Press 9 to release the robot
  • Press 7 / 8 to increase / decrease waist height
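
The --robot.cameras argument is a JSON dictionary mapping camera names to their settings. If you prefer to build it programmatically rather than hand-editing the quoted string, a minimal sketch (field names taken from the command above; the exact schema is defined by LeRobot's camera configs):

```python
import json

# Camera settings as used in the teleoperate command above.
cameras = {
    "global_view": {
        "type": "zmq",
        "server_address": "localhost",
        "port": 5555,
        "camera_name": "head_camera",
        "width": 640,
        "height": 480,
        "fps": 30,
        "warmup_s": 5,
    }
}

# Serialize for the command line: --robot.cameras='<this string>'
cameras_arg = json.dumps(cameras)
print(cameras_arg)
```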

Connect to the Physical Robot

The G1’s Ethernet IP is fixed at 192.168.123.164. Your machine must have a static IP on the same subnet: 192.168.123.x where x ≠ 164.

# Replace 'enp131s0' with your ethernet interface name (check with `ip a`)
sudo ip addr flush dev enp131s0
sudo ip addr add 192.168.123.200/24 dev enp131s0
sudo ip link set enp131s0 up
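
As a sanity check, the stdlib ipaddress module can confirm that the address you picked is on the robot's subnet and does not collide with the robot's fixed IP (a sketch; 192.168.123.200 matches the command above):

```python
import ipaddress

ROBOT_IP = ipaddress.ip_address("192.168.123.164")  # fixed G1 address
SUBNET = ipaddress.ip_network("192.168.123.0/24")

def valid_host_ip(ip_str: str) -> bool:
    """True if ip_str is on the robot's subnet and distinct from the robot."""
    ip = ipaddress.ip_address(ip_str)
    return ip in SUBNET and ip != ROBOT_IP

print(valid_host_ip("192.168.123.200"))  # → True (the address assigned above)
print(valid_host_ip("192.168.123.164"))  # → False (collides with the robot)
```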

SSH into the Robot

ssh unitree@192.168.123.164
# Password: 123

Share Internet via Ethernet

The G1 needs internet access to clone repos and install packages. Share your laptop’s connection over Ethernet:

On your laptop:

sudo sysctl -w net.ipv4.ip_forward=1

# Replace wlp132s0f0 with your WiFi interface name
sudo iptables -t nat -A POSTROUTING -o wlp132s0f0 -s 192.168.123.0/24 -j MASQUERADE
sudo iptables -A FORWARD -i wlp132s0f0 -o enp131s0 -m state --state RELATED,ESTABLISHED -j ACCEPT
sudo iptables -A FORWARD -i enp131s0 -o wlp132s0f0 -j ACCEPT

On the G1:

sudo ip route del default 2>/dev/null || true
sudo ip route add default via 192.168.123.200 dev eth0
echo "nameserver 8.8.8.8" | sudo tee /etc/resolv.conf

# Verify
ping -c 3 8.8.8.8

Install the Unitree SDK on the G1

Follow the unitree_sdk2_python installation guide:

conda create -y -n lerobot python=3.12
conda activate lerobot
git clone https://github.com/unitreerobotics/unitree_sdk2_python.git
cd unitree_sdk2_python
python -m pip install -e .
cd ..

Install LeRobot on the G1

git clone https://github.com/huggingface/lerobot.git
cd lerobot
conda install -c conda-forge "pinocchio>=3.0.0,<4.0.0"
python -m pip install -e '.[unitree_g1]'
For now, pinocchio must be installed from conda-forge (not pip) to include the CasADi bindings needed for arm IK.

(Optional) Enable WiFi on the Robot

For wireless SSH access, you can enable WiFi on the G1 (it’s blocked by default):

sudo rfkill unblock all
sudo ip link set wlan0 up
sudo nmcli radio wifi on
sudo nmcli device set wlan0 managed yes
sudo systemctl restart NetworkManager

Connect to a WiFi network:

nmcli device wifi list

sudo nmcli connection add type wifi ifname wlan0 con-name "YourNetwork" ssid "YourNetwork"
sudo nmcli connection modify "YourNetwork" wifi-sec.key-mgmt wpa-psk
sudo nmcli connection modify "YourNetwork" wifi-sec.psk "YourPassword"
sudo nmcli connection modify "YourNetwork" connection.autoconnect yes
sudo nmcli connection up "YourNetwork"

ip a show wlan0

You can then SSH over WiFi instead of Ethernet:

ssh unitree@<ROBOT_WIFI_IP>
# Password: 123

Part 2: Teleoperation & Locomotion

Run the Robot Server

On the robot (from ~/lerobot):

cd ~/lerobot
python src/lerobot/robots/unitree_g1/run_g1_server.py --camera

Run the Locomotion Policy

You can run the teleoperation client from your laptop over Ethernet, over WiFi (experimental), or directly on the robot itself. Mind potential latency introduced by your network.
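
At 30 fps the control loop has roughly 33 ms per step, and round-trip network latency eats directly into that budget. A back-of-the-envelope sketch (the RTT numbers are illustrative, not measured):

```python
def remaining_budget_ms(fps: int, rtt_ms: float) -> float:
    """Time left per control step after network round-trip latency."""
    return 1000.0 / fps - rtt_ms

# Wired Ethernet RTT is typically well under a millisecond;
# WiFi can easily spend tens of milliseconds, which is why it is experimental.
print(remaining_budget_ms(30, 1.0))   # wired: ~32 ms left per step
print(remaining_budget_ms(30, 25.0))  # wifi: ~8 ms left per step
```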

From your laptop:

lerobot-teleoperate \
  --robot.type=unitree_g1 \
  --robot.is_simulation=false \
  --robot.robot_ip=<ROBOT_IP> \
  --teleop.type=unitree_g1 \
  --teleop.id=wbc_unitree \
  --robot.cameras='{"global_view": {"type": "zmq", "server_address": "<ROBOT_IP>", "port": 5555, "camera_name": "head_camera", "width": 640, "height": 480, "fps": 30}}' \
  --display_data=true \
  --robot.controller=HolosomaLocomotionController

We support both GrootLocomotionController and HolosomaLocomotionController via --robot.controller.


Part 3: Loco-Manipulation with the Homunculus Exoskeleton

We provide a loco-manipulation solution via the Homunculus Exoskeleton — an open-source 7 DoF exoskeleton for whole-body control. Check it out here.

Calibrate

lerobot-calibrate \
  --teleop.type=unitree_g1 \
  --teleop.left_arm_config.port=/dev/ttyACM1 \
  --teleop.right_arm_config.port=/dev/ttyACM0 \
  --teleop.id=exo

During calibration, move each joint through its entire range of motion. Once the range is fitted, move the joint to a neutral position and press n to advance to the next joint.
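
Conceptually, calibration records the raw encoder extremes seen while you sweep each joint, then maps raw readings into a normalized range. A hedged illustration of that idea (not LeRobot's actual implementation):

```python
class JointRange:
    """Track min/max raw encoder values seen during a calibration sweep."""

    def __init__(self):
        self.lo = float("inf")
        self.hi = float("-inf")

    def observe(self, raw: int) -> None:
        # Widen the fitted range as the joint sweeps through its motion.
        self.lo = min(self.lo, raw)
        self.hi = max(self.hi, raw)

    def normalize(self, raw: int) -> float:
        """Map a raw reading into [0, 1] within the fitted range."""
        return (raw - self.lo) / (self.hi - self.lo)

r = JointRange()
for raw in [512, 300, 900, 650]:  # simulated sweep through the joint's range
    r.observe(raw)
print(r.normalize(600))  # → 0.5
```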

Record a Dataset

lerobot-record \
  --robot.type=unitree_g1 \
  --robot.is_simulation=true \
  --robot.cameras='{"global_view": {"type": "zmq", "server_address": "localhost", "port": 5555, "camera_name": "head_camera", "width": 640, "height": 480, "fps": 30}}' \
  --teleop.type=unitree_g1 \
  --teleop.left_arm_config.port=/dev/ttyACM1 \
  --teleop.right_arm_config.port=/dev/ttyACM0 \
  --teleop.id=exo \
  --dataset.repo_id=your-username/dataset-name \
  --dataset.single_task="Test" \
  --dataset.num_episodes=2 \
  --dataset.episode_time_s=5 \
  --dataset.reset_time_s=5 \
  --dataset.push_to_hub=true \
  --dataset.streaming_encoding=true \
  --dataset.encoder_threads=2

Note: Omit --teleop.left_arm_config.port and --teleop.right_arm_config.port if you’re only using the joystick.
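
With the settings above, total wall-clock recording time is roughly num_episodes × (episode_time_s + reset_time_s), which helps when planning longer sessions. A quick sketch:

```python
def session_length_s(num_episodes: int, episode_time_s: float, reset_time_s: float) -> float:
    """Approximate wall-clock duration of a recording session."""
    return num_episodes * (episode_time_s + reset_time_s)

# Values from the lerobot-record command above: 2 episodes, 5 s each, 5 s reset.
print(session_length_s(2, 5, 5))  # → 20
```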

Example dataset: nepyope/unitree_box_move_blue_full


Part 4: Training & Inference

Train

python src/lerobot/scripts/lerobot_train.py \
  --dataset.repo_id=your-username/dataset-name  \
  --policy.type=pi05 \
  --output_dir=./outputs/pi05_training \
  --job_name=pi05_training \
  --policy.repo_id=your-username/your-repo-id \
  --policy.pretrained_path=lerobot/pi05_base \
  --policy.compile_model=true \
  --policy.gradient_checkpointing=true \
  --wandb.enable=true \
  --policy.dtype=bfloat16 \
  --policy.freeze_vision_encoder=false \
  --policy.train_expert_only=false \
  --steps=3000 \
  --policy.device=cuda \
  --batch_size=32
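
To relate --steps and --batch_size to passes over your data: each step consumes batch_size frames, so the approximate number of epochs is steps × batch_size / total_frames. A sketch (the 10,000-frame dataset size is a made-up example):

```python
def approx_epochs(steps: int, batch_size: int, total_frames: int) -> float:
    """Rough number of passes over the dataset during training."""
    return steps * batch_size / total_frames

# With the command above (3000 steps, batch size 32) and a hypothetical
# dataset of 10,000 frames:
print(approx_epochs(3000, 32, 10_000))  # → 9.6
```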

Inference with RTC

Once trained, we recommend deploying policies using inference-time RTC:

python examples/rtc/eval_with_real_robot.py \
  --policy.path=your-username/your-repo-id \
  --policy.device=cuda \
  --robot.type=unitree_g1 \
  --robot.is_simulation=false \
  --robot.controller=HolosomaLocomotionController \
  --robot.cameras='{"global_view": {"type": "zmq", "server_address": "<ROBOT_IP>", "port": 5555, "camera_name": "head_camera", "width": 640, "height": 480, "fps": 30}}' \
  --task="task_description" \
  --duration=1000 \
  --fps=30 \
  --rtc.enabled=true
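
The idea behind inference-time RTC is to keep executing actions from the current chunk while the next chunk is being computed, so the robot never stalls waiting on the policy. A minimal single-threaded sketch of that queueing pattern (illustrative only; chunk sizes are made up and this is not LeRobot's implementation):

```python
from collections import deque

CHUNK = 4        # actions produced per policy call (made-up size)
REFILL_AT = 2    # request the next chunk before the queue runs dry

def fake_policy(t: int) -> list[int]:
    """Stand-in for a policy call returning a chunk of actions."""
    return list(range(t, t + CHUNK))

queue: deque[int] = deque(fake_policy(0))
executed = []
for step in range(10):
    if len(queue) <= REFILL_AT:
        # In RTC this call overlaps with execution instead of blocking it.
        queue.extend(fake_policy(step + len(queue)))
    executed.append(queue.popleft())

print(executed)  # → [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```

Because refills happen before the queue empties, the executed actions stay contiguous with no gaps between chunks.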

Additional Resources


Last updated: March 2026
