An eye-tracking text input system based on the Dasher algorithm, built in Unity for Meta Quest VR headsets.
- Unity Editor: `2022.3.62f3`
- Git (for cloning and version control)
- Internet connection on first open so Unity can restore packages from `Packages/manifest.json`
The exact Unity version is recorded in `ProjectSettings/ProjectVersion.txt`.
Using the same 2022.3 LTS line is usually fine, but the safest option is the exact editor version above.
Dasher is a text input method where you don't type — you steer through characters using your gaze (or mouse). Characters are arranged vertically, sized by their probability of being next. You "zoom" right to select, left to undo. A language model predicts likely next characters and makes them bigger = easier to select.
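The sizing idea can be sketched in a few lines of Python. This is an illustration only: the probabilities below are invented, and the project's actual C# implementation lives in `Assets/Scripts/`.

```python
# Toy sketch of the core Dasher idea: each candidate character gets a
# vertical slice of the screen proportional to its predicted probability,
# so likely characters are bigger and easier to steer into.

def layout(probs, top=0.0, bottom=1.0):
    """Assign each character a (start, end) slice of the vertical axis."""
    boxes = {}
    y = top
    total = sum(probs.values())
    for ch, p in sorted(probs.items()):
        h = (bottom - top) * p / total
        boxes[ch] = (y, y + h)
        y += h
    return boxes

# After typing "th", a language model might rank 'e' far above the rest
# (made-up numbers for illustration).
probs = {"e": 0.48, "a": 0.20, "i": 0.12, "o": 0.10, "r": 0.10}
for ch, (y0, y1) in layout(probs).items():
    print(f"{ch}: {y0:.2f}-{y1:.2f} (height {y1 - y0:.2f})")
```

Because 'e' holds 48% of the axis here, steering into it takes far less precision than selecting a rare character.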
- Open Unity Hub → Add project from disk → select this `VrDasher2/` folder
- Use Unity `2022.3.62f3` if possible
- Let Unity restore packages and import the project fully on first open
- Create a new Scene (File → New Scene)
- Create an empty GameObject → name it `DasherSystem`
- Add Component → search for `DasherSceneSetup` → attach it
- Press Play
- Move your mouse:
- Right → zoom into characters (select)
- Left → zoom out (undo)
- Up/Down → choose which character
- Backspace → clear all text
- Space → pause/resume
Assets/Scripts/
├── Core/
│ ├── DasherEngine.cs # Core zooming algorithm (OneStepTowards)
│ ├── DasherNode.cs # Tree node with bounds & colors
│ ├── DasherRenderer.cs # UI rendering with object pooling
│ └── LanguageModel.cs # PPM order-5 prediction model
├── Input/
│ ├── IDasherInput.cs # Input interface
│ ├── MouseInput.cs # Mouse (Editor testing)
│ ├── HeadGazeInput.cs # Head gaze (Quest 2/3)
│ ├── EyeTrackingInput.cs # Eye tracking (Quest Pro)
│ └── GazeCursor.cs # Visual gaze cursor
├── DasherController.cs # Main controller (auto-detects input)
└── DasherSceneSetup.cs # One-click scene builder
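The `LanguageModel.cs` entry above is described as a PPM order-5 model. A simplified Python sketch of that idea follows; note that real PPM also blends orders with escape probabilities, which this sketch omits in favor of plain longest-context backoff.

```python
# Simplified PPM-style prediction: count which characters follow each
# context of length 0..5 in the training text, then predict from the
# longest context that has been seen before.
from collections import Counter, defaultdict

MAX_ORDER = 5

def train(text):
    counts = defaultdict(Counter)
    for i, ch in enumerate(text):
        for order in range(MAX_ORDER + 1):
            if i - order < 0:
                break
            counts[text[i - order:i]][ch] += 1
    return counts

def predict(counts, context):
    # Back off from the longest context until one has been observed.
    for order in range(min(MAX_ORDER, len(context)), -1, -1):
        ctx = context[len(context) - order:]
        if counts[ctx]:
            total = sum(counts[ctx].values())
            return {ch: n / total for ch, n in counts[ctx].items()}
    return {}

counts = train("the theory of the thing")
print(predict(counts, "th"))  # 'e' dominates after "th"
```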
- Language Model (PPM order-5) predicts next character probabilities from training text
- Character boxes are sized proportionally to their probability
- Pointer position controls zoom: right = forward (write), left = reverse (undo)
- The zoom algorithm (`OneStepTowards`) maps the target range `[Y-X, Y+X]` to fill `[0, MAX_Y=4096]`
- After a character is selected (its node covers the crosshair), predictions update based on the new context
- Common words like "the" require minimal effort — 'e' after "th" takes ~48% of the screen
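The zoom step above can be sketched as follows. The interpolation factor and update rule here are illustrative only; the project's `OneStepTowards` in `DasherEngine.cs` may differ in detail.

```python
# Sketch of the OneStepTowards idea: the pointer at (X, Y) defines a
# target interval [Y - X, Y + X]; each animation step moves the view a
# fraction of the way toward stretching that interval to cover the whole
# screen [0, MAX_Y]. Constants are illustrative.
MAX_Y = 4096

def one_step_towards(lo, hi, step=0.1):
    """Move interval [lo, hi] one step toward filling [0, MAX_Y].

    Steering right shrinks X, so the target interval is narrow and the
    view zooms in quickly; steering left widens it, zooming back out.
    """
    # Full zoom would map lo -> 0 and hi -> MAX_Y; interpolate partway.
    new_lo = lo + step * (0 - lo)
    new_hi = hi + step * (MAX_Y - hi)
    return new_lo, new_hi

# Pointer at X=512, Y=2048: target interval centered mid-screen.
lo, hi = 2048 - 512, 2048 + 512
for _ in range(3):
    lo, hi = one_step_towards(lo, hi)
print(round(lo), round(hi))  # → 1120 2976
```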
| Mode | Device | How It Works |
|---|---|---|
| Mouse | Unity Editor | Mouse position on screen → canvas position |
| Head Gaze | Quest 2/3 | Camera forward direction → raycast onto canvas |
| Eye Tracking | Quest Pro | OVREyeGaze transform → raycast onto canvas |
The system auto-detects the best available input — no configuration needed.
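The fallback order implied by the table can be sketched like this. It is a hypothetical simplification: the real `DasherController` detects availability of components at runtime, and the checks shown are stand-ins.

```python
# Hypothetical sketch of input auto-detection: prefer the richest input
# that is actually available, falling back to mouse in the Editor.
def pick_input(eye_tracking_present, hmd_present):
    if eye_tracking_present:   # e.g. Quest Pro with OVREyeGaze set up
        return "EyeTrackingInput"
    if hmd_present:            # any headset: head-gaze raycast
        return "HeadGazeInput"
    return "MouseInput"        # Unity Editor fallback

print(pick_input(False, True))  # → HeadGazeInput
```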
This repository should include:
- `Assets/`
- `Packages/manifest.json`
- `Packages/packages-lock.json`
- `ProjectSettings/`
This repository should not include:
- `Library/`
- `Temp/`
- `Obj/`
- `Logs/`
- `UserSettings/`
- local IDE files
Unity automatically restores package dependencies from `Packages/manifest.json` and `Packages/packages-lock.json` when someone opens the project.
This means contributors usually do not need to download Unity packages manually before opening it.
One exception here is Meta XR Core SDK:
- It is optional for basic editor testing and head-gaze fallback
- It is needed if someone wants full Meta-specific VR or Quest Pro eye-tracking setup in the headset
- The current code compiles without it because the eye-tracking path uses reflection and fallback behavior
- Best option: open with `2022.3.62f3`
- Usually okay: another `2022.3.x` LTS version
- Riskier: `2021`, `2023`, or `6000.x`, because Unity may upgrade project files or package behavior
If someone opens the project in a different major or minor Unity version, Unity may:
- reserialize project settings
- upgrade package versions
- change generated files
- make the project harder to reopen in the old version
For that reason, it is a good idea to mention the required Unity version clearly in the GitHub repo.
- Install Meta XR Core SDK from the Unity Asset Store
- Add the `OVRCameraRig` prefab to the scene
- `HeadGazeInput` auto-detects VR mode and works out of the box
- Install Meta XR Core SDK via the Package Manager
- Add the `OVRCameraRig` prefab to the scene
- On `OVRManager`, enable Eye Tracking under Quest Features → General
- Create two empty GameObjects → add an `OVREyeGaze` component to each
  - Set one to `Eye.Left`, the other to `Eye.Right`
- In `DasherSceneSetup` or manually: assign the eye gaze objects to `EyeTrackingInput`
- On the headset: Settings → Movement Settings → Eye Tracking → Enable + Calibrate
- File → Build Settings → Android → Switch Platform
- Ensure `XR Plug-in Management → Oculus` is checked
- Player Settings → Min API Level = 29, Target = 32
- Build and Run (connect Quest via USB or Oculus Link)