AndreasInk/gazectl


 ██████╗  █████╗ ███████╗███████╗ ██████╗████████╗██╗
██╔════╝ ██╔══██╗╚══███╔╝██╔════╝██╔════╝╚══██╔══╝██║
██║  ███╗███████║  ███╔╝ █████╗  ██║        ██║   ██║
██║   ██║██╔══██║ ███╔╝  ██╔══╝  ██║        ██║   ██║
╚██████╔╝██║  ██║███████╗███████╗╚██████╗   ██║   ███████╗
 ╚═════╝ ╚═╝  ╚═╝╚══════╝╚══════╝ ╚═════╝   ╚═╝   ╚══════╝

Head tracking display switcher for macOS



gazectl tracks your head using either your webcam or the motion sensors in supported AirPods, detects which monitor you're looking at, and automatically switches focus to it. It relies on Apple's Vision and Core Motion frameworks plus native macOS APIs to switch monitor focus, with no third-party window manager required.

macOS only. Requires macOS 14+.

Permissions

gazectl needs these macOS permissions depending on the source you use:

  • Accessibility — for moving the cursor and clicking to switch monitor focus
  • Camera — required for --source camera
  • Motion & Fitness — required for --source airpods

Grant the needed permissions in System Settings → Privacy & Security. macOS prompts the first time a source is used.

Install

npx gazectl@latest

Or install globally:

npm i -g gazectl

Usage

# First run — calibrates automatically
gazectl

# Use AirPods motion instead of the camera
gazectl --source airpods

# Force recalibration
gazectl --calibrate

# With verbose logging
gazectl --verbose

On first run, gazectl asks you to look at each monitor and press Enter. It samples your head pose for 2 seconds per monitor, then saves calibration to ~/.local/share/gazectl/calibration.json.
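
To make the saved-calibration step concrete, here is a minimal sketch of what a per-monitor pose map could look like as a Swift Codable model. The type names, fields, and JSON schema are assumptions for illustration — the actual calibration.json format may differ.

```swift
import Foundation

// Hypothetical shape of the calibration file; the real schema may differ.
struct MonitorPose: Codable {
    var displayID: UInt32 // display identifier captured during calibration
    var yaw: Double       // averaged head yaw while looking at this monitor
    var pitch: Double     // averaged head pitch
}

struct Calibration: Codable {
    var source: String          // "camera" or "airpods"
    var monitors: [MonitorPose] // one averaged pose per monitor
}

// Round-trip through JSON, as a save/load would:
let cal = Calibration(
    source: "camera",
    monitors: [MonitorPose(displayID: 1, yaw: 0.2, pitch: 0.0)]
)
let data = try! JSONEncoder().encode(cal)
let loaded = try! JSONDecoder().decode(Calibration.self, from: data)
```

A model along these lines keeps the file human-inspectable, which is useful when pointing --calibration-file at a custom path.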

AirPods mode uses a session-relative baseline. After the initial per-monitor calibration, later launches ask for one short anchor baseline on the saved anchor monitor before live tracking begins. See docs/airpods-motion.md for the model.
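
The session-relative baseline idea can be sketched in a few lines: AirPods motion reports an absolute orientation that is not comparable across sessions, so live poses are re-expressed relative to the anchor pose captured at launch. The names below are illustrative, not gazectl's actual API.

```swift
import Foundation

// Hypothetical sketch: store and compare poses relative to a per-session
// anchor, so saved per-monitor offsets stay meaningful across launches.
struct HeadPose {
    var yaw: Double   // radians, left/right
    var pitch: Double // radians, up/down
}

/// Re-expresses a live pose relative to the session's anchor baseline.
func relative(_ live: HeadPose, to anchor: HeadPose) -> HeadPose {
    HeadPose(yaw: live.yaw - anchor.yaw, pitch: live.pitch - anchor.pitch)
}

// Example: the same physical head turn yields the same relative pose
// regardless of where the absolute orientation happened to start.
let anchor = HeadPose(yaw: 0.3, pitch: -0.1)
let live = HeadPose(yaw: 0.8, pitch: -0.1)
let rel = relative(live, to: anchor) // yaw ≈ 0.5, pitch ≈ 0.0
```

This is why later launches only need one short anchor capture rather than a full per-monitor recalibration.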

Options

Flag                 Default                                    Description
--calibrate          off                                        Force recalibration
--calibration-file   ~/.local/share/gazectl/calibration.json   Custom calibration path
--source             camera                                     Tracking source: camera or airpods
--camera             0                                          Camera index; only valid with --source camera
--verbose            off                                        Print live pose continuously
--debug              off                                        Print transition decision points

How it works

  1. Calibrate — record a per-monitor pose map for the selected source
  2. Track — Apple Vision or AirPods motion reports your live head pose in real time
  3. Switch — gazectl matches the live pose against the calibrated monitor poses and clicks into the best match
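
The matching step above can be sketched as a nearest-pose lookup: pick the calibrated monitor whose recorded pose is closest to the live pose in yaw/pitch space. This is an illustrative sketch with assumed names, not gazectl's actual matching code, which may use a different distance or thresholding.

```swift
import Foundation

// One calibrated entry: a monitor and the head pose recorded for it.
struct CalibratedMonitor {
    var name: String
    var yaw: Double
    var pitch: Double
}

/// Returns the monitor whose calibrated pose is nearest (squared
/// Euclidean distance in yaw/pitch) to the live head pose.
func bestMatch(yaw: Double, pitch: Double,
               among monitors: [CalibratedMonitor]) -> CalibratedMonitor? {
    monitors.min { a, b in
        let da = (a.yaw - yaw) * (a.yaw - yaw) + (a.pitch - pitch) * (a.pitch - pitch)
        let db = (b.yaw - yaw) * (b.yaw - yaw) + (b.pitch - pitch) * (b.pitch - pitch)
        return da < db
    }
}

// Example: a slight rightward turn matches the right monitor.
let monitors = [
    CalibratedMonitor(name: "left", yaw: -0.4, pitch: 0.0),
    CalibratedMonitor(name: "right", yaw: 0.4, pitch: 0.0),
]
let match = bestMatch(yaw: 0.35, pitch: 0.02, among: monitors)
```

A real implementation would likely also require the match to stay stable for some dwell time before clicking, to avoid flapping between monitors.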

Build from source

swift build -c release
cp .build/release/gazectl /usr/local/bin/gazectl

