```
 ██████╗  █████╗ ███████╗███████╗ ██████╗████████╗██╗
██╔════╝ ██╔══██╗╚══███╔╝██╔════╝██╔════╝╚══██╔══╝██║
██║  ███╗███████║  ███╔╝ █████╗  ██║        ██║   ██║
██║   ██║██╔══██║ ███╔╝  ██╔══╝  ██║        ██║   ██║
╚██████╔╝██║  ██║███████╗███████╗╚██████╗   ██║   ███████╗
 ╚═════╝ ╚═╝  ╚═╝╚══════╝╚══════╝ ╚═════╝   ╚═╝   ╚══════╝
```
Head-tracking display switcher for macOS
gazectl can track your head with either your webcam or supported AirPods motion sensors to detect which monitor you're looking at and automatically switch focus to it. It uses Apple's Vision and Core Motion frameworks plus native macOS APIs to switch monitor focus, with no third-party window manager required.
macOS only (requires macOS 14 or later).
gazectl needs these macOS permissions depending on the source you use:
- Accessibility — for moving the cursor and clicking to switch monitor focus
- Camera — required for `--source camera`
- Motion & Fitness — required for `--source airpods`
Grant the needed permissions in System Settings → Privacy & Security. macOS prompts the first time a source is used.
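If you build gazectl from source, keep in mind that macOS only shows the Camera and Motion prompts when the binary carries the matching usage-description keys. The key names below are Apple's standard `Info.plist` keys; the description strings are illustrative, and whether the released binary embeds exactly these is an assumption:

```xml
<key>NSCameraUsageDescription</key>
<string>gazectl uses the camera to track your head pose.</string>
<key>NSMotionUsageDescription</key>
<string>gazectl uses AirPods motion data to track your head pose.</string>
```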
```
npx gazectl@latest
```

Or install globally:

```
npm i -g gazectl
```

```
# First run — calibrates automatically
gazectl

# Use AirPods motion instead of the camera
gazectl --source airpods

# Force recalibration
gazectl --calibrate

# With verbose logging
gazectl --verbose
```

On first run, gazectl asks you to look at each monitor and press Enter. It samples your head pose for 2 seconds per monitor, then saves the calibration to `~/.local/share/gazectl/calibration.json`.
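The calibration file is plain JSON. Its exact schema isn't documented here, but the calibration step produces a per-monitor pose map along these lines (field names are illustrative, not gazectl's actual schema):

```json
{
  "source": "camera",
  "monitors": [
    { "displayID": 1, "yaw": -0.42, "pitch": 0.03 },
    { "displayID": 2, "yaw": 0.38, "pitch": 0.01 }
  ]
}
```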
AirPods mode uses a session-relative baseline. After the initial per-monitor calibration, later launches ask for one short anchor baseline on the saved anchor monitor before live tracking begins. See docs/airpods-motion.md for the model.
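The session-relative model can be sketched roughly as follows: each launch records a short anchor baseline, and live AirPods poses are then interpreted relative to it, so the saved per-monitor offsets stay meaningful across launches. The type and function names here are illustrative, not gazectl's actual API:

```swift
import Foundation

/// A head pose as yaw/pitch angles in radians (illustrative type).
struct HeadPose {
    var yaw: Double
    var pitch: Double
}

/// Averages a few samples captured while the user looks at the anchor monitor.
func anchorBaseline(from samples: [HeadPose]) -> HeadPose {
    let n = Double(samples.count)
    let yaw = samples.map(\.yaw).reduce(0, +) / n
    let pitch = samples.map(\.pitch).reduce(0, +) / n
    return HeadPose(yaw: yaw, pitch: pitch)
}

/// Expresses a live pose relative to the session baseline.
func relativePose(_ live: HeadPose, baseline: HeadPose) -> HeadPose {
    HeadPose(yaw: live.yaw - baseline.yaw, pitch: live.pitch - baseline.pitch)
}
```

Averaging over a short window rather than taking a single sample makes the baseline robust to small head movements during the anchor step.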
| Flag | Default | Description |
|---|---|---|
| `--calibrate` | off | Force recalibration |
| `--calibration-file` | `~/.local/share/gazectl/calibration.json` | Custom calibration path |
| `--source` | `camera` | Tracking source: `camera` or `airpods` |
| `--camera` | `0` | Camera index; only valid with `--source camera` |
| `--verbose` | off | Print live pose continuously |
| `--debug` | off | Print transition decision points |
- Calibrate — record a per-monitor pose map for the selected source
- Track — the Vision framework (camera) or Core Motion (AirPods) reports your live head pose in real time
- Switch — gazectl matches the live pose against the calibrated monitor poses and clicks into the best match
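The matching step above amounts to a nearest-neighbor lookup over the calibrated poses. A minimal sketch, assuming a simple squared angular distance in yaw/pitch (the types and metric are illustrative, not gazectl's actual implementation):

```swift
import Foundation

/// Calibrated pose for one monitor (illustrative; the real records may carry more fields).
struct MonitorPose {
    let displayID: UInt32
    let yaw: Double
    let pitch: Double
}

/// Picks the calibrated monitor whose pose lies closest to the live pose,
/// using squared angular distance in yaw/pitch.
func bestMatch(yaw: Double, pitch: Double, among monitors: [MonitorPose]) -> MonitorPose? {
    monitors.min { a, b in
        let da = pow(a.yaw - yaw, 2) + pow(a.pitch - pitch, 2)
        let db = pow(b.yaw - yaw, 2) + pow(b.pitch - pitch, 2)
        return da < db
    }
}
```

A real implementation would likely also debounce: only switch focus once the best match has been stable for a few consecutive samples, to avoid flapping when your gaze crosses a monitor boundary.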
```
swift build -c release
cp .build/release/gazectl /usr/local/bin/gazectl
```