This repository contains the runtime used to deploy the motion-tracking policy in:
- sim2sim: MuJoCo simulator + high-level controller
- sim2real: Unitree G1 + high-level controller
This environment is only for running the policy side.
Live teleoperation runs in a separate environment. See teleop/README.md.
This README is organized as follows:
- `uv` setup for the policy/runtime environment
- how to run sim2sim
- how to run sim2real
- what the UDP motion selector is and how to use it
- what the VR motion source is and how to use it
## Setup (uv)

Use `uv` for this repository. Do not create a separate Conda environment for the policy runtime.

```
cd /path/to/sim2real
uv sync
```

This creates the local `.venv` from `pyproject.toml` and `uv.lock`.
Run all scripts through `uv run`:

```
uv run src/sim2sim.py --xml_path assets/g1/g1.xml
uv run src/deploy.py --net lo --sim2sim
uv run src/motion_select.py
```

## sim2sim

sim2sim uses two processes:
- the simulator / low-level state publisher
- the high-level controller
Start the simulator first:
```
cd /path/to/sim2real
uv run src/sim2sim.py --xml_path assets/g1/g1.xml
```

Then start the controller in another terminal:

```
cd /path/to/sim2real
uv run src/deploy.py --net lo --sim2sim
```

Flow:
- keep the simulator window focused so the simulated remote input works
- press `s` in the simulator to leave zero-torque / move to the default pose
- once the robot is in the default pose, press `a` in the simulator to start the tracking policy
- press `x` to exit
If your motion source is `udp`, also run the motion selector in a third terminal.
If your motion source is `vr`, also run the teleop bridge from teleop/README.md.
## sim2real

sim2real only starts the high-level controller in this repository. The robot hardware side is the real Unitree platform.
Before running:
- power on G1
- connect your PC to the robot over Ethernet
- configure the correct network interface on your PC
- make sure you know the interface name you want to pass as `--net`
Start the controller:
```
cd /path/to/sim2real
uv run src/deploy.py --net <robot_iface> --real
```

Flow:
- the controller starts in zero-torque mode and waits for the remote `start` button
- press `start` on the Unitree remote to move the robot to the default pose
- place or confirm that the robot is safely on the ground
- press `A` on the Unitree remote to enter the tracking policy
- press `select` on the Unitree remote to exit
Always test a motion in sim2sim before running it on the real robot.
## Motion sources

The tracking policy can consume two kinds of motion sources:
- `udp`
- `vr`

This is configured in `config/tracking.yaml` with:

```yaml
motion_source: "vr"
```

The current default in this repository is `vr`.
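The relevant portion of the config can be sketched as follows. Only `motion_source` and the `motions:` list are taken from this README; the motion names below are placeholders, not entries from the actual file:

```yaml
# Sketch of the relevant fields in config/tracking.yaml
motion_source: "vr"   # "udp" or "vr"
motions:              # motions selectable in udp mode (placeholder names)
  - default
  - example_motion_a
  - example_motion_b
```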
### UDP motion selector

The UDP motion selector is the offline motion-switching interface.
In this mode, the controller does not consume live teleop data. Instead, it plays motions listed in config/tracking.yaml, and you choose which motion to append through a small UDP command tool.
Internally:
- `deploy.py` creates a `UDPMotionSource`
- `UDPMotionSource` starts a tiny UDP server
- `motion_select.py` sends motion names to that UDP server
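The command path above can be sketched with a few lines of Python. This is a minimal illustration, not the repository's actual code: the port number, the plain-UTF-8 message format, and the `send_motion` helper are all assumptions; `motion_select.py`'s real wire format may differ.

```python
import socket

# Assumed address of the UDP server started by UDPMotionSource.
SELECTOR_ADDR = ("127.0.0.1", 5005)

def send_motion(name: str, addr=SELECTOR_ADDR) -> None:
    """Send one motion name as a UDP datagram, as motion_select.py does."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(name.encode("utf-8"), addr)
```

UDP fits here because each selection is a tiny, self-contained message and a lost datagram can simply be resent (pressing Enter on an empty line resends the previous choice).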
First change the tracking config:
motion_source: "udp"Then run the normal controller flow:
sim2sim:

```
uv run src/sim2sim.py --xml_path assets/g1/g1.xml
uv run src/deploy.py --net lo --sim2sim
```

sim2real:

```
uv run src/deploy.py --net <robot_iface> --real
```
Then run the selector in another terminal:
```
cd /path/to/sim2real
uv run src/motion_select.py
```

Usage:
- type a motion index or motion name and press Enter
- type `list` to print all available motions
- press Enter on an empty line to resend the previous choice
- type `r` to reload `config/tracking.yaml`
- type `q` to quit
Behavior:
- `default` returns the policy toward the idle/default pose
- non-default motions are taken from the `motions:` list in `config/tracking.yaml`
- switching is append-based rather than an immediate hard cut
- switching follows the policy-side gating logic:
  - from `default`, you can switch to any motion
  - once a non-default motion is active, you cannot jump directly to another non-default motion
  - a non-default motion must finish first
  - after it finishes, you can switch back to `default`
  - only after returning to `default` can you switch to a different motion
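The gating rules above amount to a small state machine. The sketch below is illustrative only; `MotionGate`, its fields, and the motion names are invented for this example and are not the repository's API:

```python
# Minimal sketch of the policy-side gating logic (names are illustrative).
class MotionGate:
    def __init__(self):
        self.active = "default"   # currently active motion
        self.finished = True      # has the active motion finished playing?

    def can_switch(self, target: str) -> bool:
        if self.active == "default":
            return True               # from default, any motion is allowed
        if target == "default":
            return self.finished      # back to default only once finished
        return False                  # no direct non-default -> non-default jump

    def request(self, target: str) -> bool:
        """Apply a switch request; returns whether it was accepted."""
        if self.can_switch(target):
            self.active = target
            self.finished = (target == "default")
            return True
        return False
```

For example, a request for a second motion while the first is still playing is rejected; the selector must wait for the motion to finish, return to `default`, and only then switch.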
### VR motion source

The VR motion source is the live teleoperation interface.
In this mode, sim2sim/sim2real does not receive motion names over UDP. Instead, it requests pose chunks from the teleop bridge over ZMQ.
Internally:
- `deploy.py` creates a `VRMotionSource`
- `VRMotionSource` connects to the teleop bridge on the ZMQ addresses in `config/tracking.yaml`
- `VRMotionSource` maintains the reference-motion buffer
- when the future horizon drops below the low-water mark, it requests more frames
- the teleop bridge retargets the latest XR/PICO stream and returns a chunk of frames
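The refill behavior above is a standard low-water-mark pattern, sketched below. All names, the watermark, and the chunk size are illustrative assumptions, and the ZMQ round trip to the bridge is replaced by a stub:

```python
from collections import deque

LOW_WATER = 10   # request more frames when fewer than this remain (assumed)
CHUNK = 25       # frames returned per bridge request (assumed)

def fetch_chunk(n):
    """Stand-in for the ZMQ request/reply round trip to the teleop bridge."""
    return [f"frame_{i}" for i in range(n)]

class ReferenceBuffer:
    """Toy model of the reference-motion buffer kept by the motion source."""
    def __init__(self):
        self.frames = deque()

    def next_frame(self):
        # Refill before the future horizon runs dry, so the policy
        # always has reference frames ahead of the current timestep.
        if len(self.frames) < LOW_WATER:
            self.frames.extend(fetch_chunk(CHUNK))
        return self.frames.popleft()
```

Requesting frames in chunks, rather than one per control step, keeps the high-rate control loop decoupled from bridge latency.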
The teleop bridge itself is documented in teleop/README.md.
Leave the config as:
motion_source: "vr"Start the teleop bridge in its own environment by following teleop/README.md.
Then start the controller here as usual:
sim2sim:

```
uv run src/sim2sim.py --xml_path assets/g1/g1.xml
uv run src/deploy.py --net lo --sim2sim
```

sim2real:

```
uv run src/deploy.py --net <robot_iface> --real
```
There are two layers of control in VR mode.
Robot-side controller state:
- simulated remote in sim2sim: `s` to move to the default pose, `a` to start tracking
- Unitree remote in sim2real: `start` to move to the default pose, `A` to start tracking
Live teleop control from the XR/PICO side:
- right-hand `A`: start/resume live teleop streaming
- left-hand `X`: pause live teleop streaming
The XR/PICO installation and teleop-side runtime are intentionally kept out of this uv environment. Use the separate setup in teleop/README.md.