MagicGaze

Enabling Seamless Control of IoT Devices Through Eye Tracking


This repository contains the code for the ETRA 2026 Workshop paper:

Kenan Bektaş, Tobias Ettling, Simon Mayer, and Jannis Strecker-Bischoff. 2026. Magic Gaze: Enabling Seamless Control of IoT Devices Through Eye Tracking. In 2026 Symposium on Eye Tracking Research and Applications (ETRA ’26), June 01–04, 2026, Marrakesh, Morocco. ACM, New York, NY, USA, 8 pages. https://doi.org/10.1145/3797246.3804834

About The Project

This repository contains a collection of plugins for Pupil Capture. The plugins are written in Python and extend the functionality of the Pupil Capture software. Together, they enable gaze-based control of IoT devices on the network, and they are integrated with the Pupil Core Network API.

Plugins

Currently the following plugins are available:

  • Head Gesture Detection: Detects head gestures based on triggers. A head gesture is a head movement performed while fixating on an object. The gesture is completed when the user moves their head away from, and then back to, the origin (norm_pos) defined in the trigger event. Currently, the trigger is a fixation event on an object, so this plugin depends on the Fixation Object Detection plugin; the trigger can be switched in code.

  • Fixation Object Detection: Predicts the object that the user is fixating on, using a pre-trained object detection model (Ultralytics YOLO models).

  • Blink Gesture Detection: Detects intentional blinks and associates them with objects in the user's field of view. The plugin distinguishes active (intentional) blinks from passive ones by analyzing the blink duration and gaze context. If a blink event is detected while the user is fixating on an object, it is recorded and can be used as a control signal for interacting with smart devices.

  • Saccade Gesture Detection: Detects saccadic movements and associates them with objects in the user's view. A saccade is registered when the user's gaze quickly moves away from a fixated object and then returns within a short timeframe. The plugin determines the direction of the saccadic movement (e.g., left, right, up, or down) and assigns it to the interacted object.
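To make the saccade logic above concrete, here is a minimal, self-contained sketch of how a direction could be classified from two gaze points in Pupil's normalized image coordinates (norm_pos, values in 0..1 with the origin at the bottom-left). This is an illustrative assumption, not the plugin's actual implementation; function names and the return radius are hypothetical.

```python
import math

def saccade_direction(origin, peak):
    """Classify a saccade as 'left', 'right', 'up', or 'down'.

    origin: gaze point (x, y) on the fixated object before the saccade.
    peak:   gaze point at the largest displacement from the origin.
    Both are in Pupil's normalized coordinates (0..1, bottom-left origin).
    Illustrative sketch only -- not this repository's plugin code.
    """
    dx = peak[0] - origin[0]
    dy = peak[1] - origin[1]
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "up" if dy > 0 else "down"

def is_return_saccade(origin, current, radius=0.05):
    """True once gaze has come back within `radius` of the origin,
    i.e. the away-and-back movement is complete."""
    return math.dist(origin, current) <= radius
```

For example, `saccade_direction((0.5, 0.5), (0.7, 0.52))` yields `"right"`, since the horizontal displacement dominates.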

Getting Started

To get a local copy up and running, follow these steps.

Prerequisites

To integrate the plugins with Pupil Capture, you need to install the Pupil software from source and create a dedicated virtual environment for it.

To run the plugins you need to install the required dependencies in the virtual environment.

Adding the Plugins to Pupil Capture

Pupil Labs has a detailed guide on how to add plugins to Pupil Capture. The plugins can be added by copying the plugin file to the capture_settings/plugins directory located in the Pupil Capture installation directory.

  1. Copy the plugin Python file to the capture_settings/plugins (Arch Linux Installation) directory. Make sure to specify the correct path to the Pupil Capture installation directory.
  cp /path/to/plugin.py /path/to/pupil-capture/capture_settings/plugins
  2. Install the required dependencies in the virtual environment.
  source /path/to/pupil-env/bin/activate
  pip install -r requirements.txt
  3. Start Pupil Capture and enable the plugin in the plugin manager.
  4. The plugin should now be available in the Pupil Capture GUI.
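For orientation, a plugin file dropped into capture_settings/plugins generally subclasses Pupil Capture's Plugin base class and processes data in recent_events. The sketch below is a hedged, illustrative skeleton, not this repository's code: the class name, confidence threshold, and print statement are assumptions, and the stub fallback exists only so the sketch can be read outside a running Pupil Capture.

```python
try:
    # Provided by Pupil Capture at runtime when it loads the plugin file.
    from plugin import Plugin
except ImportError:
    # Stand-in so this sketch is self-contained outside Pupil Capture.
    class Plugin:
        def __init__(self, g_pool):
            self.g_pool = g_pool

class ExampleGazePlugin(Plugin):  # hypothetical name
    """Illustrative plugin: report confident fixation events."""

    def __init__(self, g_pool, min_confidence=0.6):
        super().__init__(g_pool)
        self.min_confidence = min_confidence

    def recent_events(self, events):
        # Pupil Capture calls this once per world frame with recent data.
        for fixation in events.get("fixations", []):
            if fixation.get("confidence", 0.0) >= self.min_confidence:
                print("fixation at", fixation.get("norm_pos"))
```

Once the file is in capture_settings/plugins and enabled in the plugin manager, Pupil Capture instantiates the class and starts feeding it events.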

Usage of the Network API

Explore the example notebooks in the examples directory to learn how to interact with the Network API.
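If the notebooks are not at hand, the basic pattern for the Pupil Core Network API is: ask Pupil Remote (a ZMQ REQ socket, default port 50020) for the SUB port, subscribe to topics such as fixations or gaze., and decode the msgpack payloads. The sketch below assumes a locally running Pupil Capture plus the pyzmq and msgpack packages; the confidence filter is an illustrative assumption, not part of this repository's API.

```python
def is_reliable(datum, min_confidence=0.6):
    """Illustrative filter: keep only confident gaze/fixation data."""
    return datum.get("confidence", 0.0) >= min_confidence

def listen(addr="127.0.0.1", req_port=50020, topic="fixations"):
    # Imported lazily so the helper above stays dependency-free.
    import msgpack
    import zmq

    ctx = zmq.Context.instance()
    pupil_remote = ctx.socket(zmq.REQ)
    pupil_remote.connect(f"tcp://{addr}:{req_port}")

    pupil_remote.send_string("SUB_PORT")   # ask Pupil Remote for the SUB port
    sub_port = pupil_remote.recv_string()

    sub = ctx.socket(zmq.SUB)
    sub.connect(f"tcp://{addr}:{sub_port}")
    sub.subscribe(topic)

    while True:
        topic_frame, payload = sub.recv_multipart()
        datum = msgpack.loads(payload)
        if is_reliable(datum):
            print(topic_frame.decode(), datum.get("norm_pos"))
```

Calling listen() blocks and prints incoming data until interrupted; it only produces output while Pupil Capture is running and publishing.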

Contributing

(Feel free to add contribution guidelines here)

✉️ Contact

If you have questions about this research, feel free to contact Kenan Bektaş: kenan.bektas@unisg.ch.

This research has been done by the group of Interaction- and Communication-based Systems (interactions.ics.unisg.ch) at the University of St.Gallen (unisg.ch).

📑 License

The code in this repository is licensed under the Apache License 2.0 (see LICENSE), unless stated otherwise in individual files and folders.
