I built this self-hosted server template to help you focus on what truly matters in your app—like algorithms and core functionality—while handling user management and setup for you.
Initially, this was a personal solution. I often found that user management and database configuration were extra hassles that slowed down my development process. I wanted to solve this once and for all, so I could stay focused on my ideas.
If this sounds interesting, feel free to check out the website!
Hey all, I wanted to check: is it only me, or is the eye-tracking market a bit broken?
There are basically two types of tech here: hardware eye tracking (split into wearable and stationary trackers) and software tracking (which I would say is less popular).
Both are incredibly expensive and mostly focused on research: either you buy expensive hardware, or you buy an expensive license for the software. I guess that is fair in a small research market, but devices like the Vision Pro kind of show that eye tracking and gaze-aware electronics can be super comfortable and natural for us to use.
Also, Windows for example natively supports eye tracking as an accessibility feature (and these are disabilities any of us can acquire, simply by skiing downhill and breaking both arms, or doing some other extreme sport). But to use it you need to buy a compatible hardware tracker.
I think it is a bit broken that disabled people (and technically all of us are, at least temporarily at some point) are kind of locked out of eye-tracking technology for casual use. You could say: we have keyboards and touch screens, so there is no need for eye tracking. But then every flu season spreads partly because people touch the same touch screens, like the restaurant kiosks you find in McDonald's and elsewhere. Those could be driven by gaze tracking and Vision Pro-like gestures instead.
Anyway, these are the thoughts that pushed me down the path of democratizing webcam-driven eye-tracking tech a bit. Usually the biggest obstacle to building new apps is missing or locked-up technology that engineers cannot access and have to develop themselves, which takes know-how and time. So I decided to publish an open-source algorithm for people to tinker with: https://github.com/NativeSensors/EyeGestures and https://polar.sh/NativeSensors
It basically turns your laptop's camera, or any webcam, into an eye tracker (maybe not as precise as hardware ones, but still)! And I think this could bring a bit of freshness into user interfaces.
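To give a feel for how webcam gaze estimation generally works, here is a minimal Python sketch that uses MediaPipe's face mesh to pull iris landmarks from the camera feed. To be clear, this is a generic illustration of the technique, not the EyeGestures API; the landmark indices and the missing calibration step are assumptions noted in the comments.

    # Generic webcam iris-tracking sketch (illustrative; NOT the EyeGestures API).
    # Requires: pip install opencv-python mediapipe numpy
    import cv2
    import mediapipe as mp
    import numpy as np

    # refine_landmarks=True adds the iris landmarks (indices 468-477).
    face_mesh = mp.solutions.face_mesh.FaceMesh(refine_landmarks=True)

    def iris_center(frame_bgr):
        """Return one iris center in normalized [0,1] image coords, or None."""
        result = face_mesh.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
        if not result.multi_face_landmarks:
            return None
        lm = result.multi_face_landmarks[0].landmark
        # One iris ring; whether 468-472 is the left or right eye varies by docs.
        ring = np.array([(lm[i].x, lm[i].y) for i in range(468, 473)])
        return ring.mean(axis=0)

    cap = cv2.VideoCapture(0)
    for _ in range(300):  # a few seconds of frames; a real app loops until quit
        ok, frame = cap.read()
        if not ok:
            break
        c = iris_center(frame)
        if c is not None:
            # A real tracker would map this through a per-user calibration
            # (iris position -> screen point) instead of printing it.
            print(f"iris at x={c[0]:.3f}, y={c[1]:.3f}")
    cap.release()

A real tracker would then fit a per-user calibration (e.g. a polynomial mapping from iris coordinates to known screen points), which is where most of the usable accuracy comes from.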
What are your thoughts? Is this even a viable idea, or is it something that will never get traction and I am completely wrong?
There are invasive and non-invasive BCI capabilities, and their ranges vary. E.g., a neural lace is a surgically invasive BCI capability; a dry-electrode EEG cap is a noninvasive BCI sensor.
If a noninvasive BCI capability can write to a brain, is it a directed-energy weapon? So I suspect the applications may be limited. Certain states don't even allow video biometrics in banks, casinos, or prisons, FWIU.
IDK much about eye tracking.
I know that there exist patients with ocular movement disorders like amblyopia, strabismus/exotropia, and nystagmus. The equitability of eye-tracking-only interfaces is limited if they do not work with eye-movement disorders, which are more common than not having at least one hand.
If the system must detect which eye-movement disorder a patient has in order to work, is it 1) doing automated diagnosis, which is unlikely to have 100% accuracy; 2) handling sensitive PHI (Personal Health Information) without consent; and 3) capturing biometric information without consent?
I understand that (when people have consented), eye tracking experiments yield helpful UX design research data like UI heatmaps that can help developers make apps more accessible.
Practically, how do you turn off the eye-tracking component and use the app without the fancy gestural HID? And if one is suddenly distracted by something else in the FOV, is there unintentional input?
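FWIW, the classic mitigation for that "Midas touch" problem is dwell-time activation: a glance alone does nothing, and a target only fires after the gaze rests on it for some threshold, so a sudden distraction just resets the clock. A minimal sketch of that idea (the threshold and the fire-once behavior here are my assumptions, not taken from the project):

    # Dwell-time activation sketch: a target fires only after the gaze
    # rests on it continuously for DWELL_SECONDS. Glancing away resets it.
    import time

    DWELL_SECONDS = 0.8  # assumed threshold; real systems tune this per user

    class DwellSelector:
        def __init__(self, dwell=DWELL_SECONDS):
            self.dwell = dwell
            self.target = None   # target currently under the gaze
            self.since = 0.0     # when the gaze first landed on it

        def update(self, target, now=None):
            """Feed the target under the current gaze point (or None).
            Returns the target once, the moment its dwell time is reached."""
            now = time.monotonic() if now is None else now
            if target != self.target:        # gaze moved: restart the clock
                self.target = target
                self.since = now
                return None
            if target is not None and now - self.since >= self.dwell:
                self.since = float("inf")    # fire once, then disarm
                return target
            return None

Because any change of target restarts the timer, a stray glance at something else in the FOV produces no input at all, which is exactly the distraction case raised above.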
Community response to a benign, opt-in art project that merges eye-tracking data from viewers could be ethically studied to estimate the viability of a competing BCI HID approach.
I am new here but have recently become passionate about looking into the issues of accessibility and eye-driven interfaces.
Most OSes support eye tracking, but require additional, expensive hardware. That is fine to some extent: the hardware is precise and has all the necessary sensors. On the other hand, most consumer electronics devices already have built-in cameras!
I know that there are partially similar projects, but most of them focus on scientific and market-research eye tracking. I think we should strive to bring more eye-driven interfaces to our laptops/phones/info kiosks, especially after what the Vision Pro and Meta Quest 3 have shown. Diversifying our input interfaces makes digital spaces more accessible and shrinks the group of people who cannot use their own devices to the full extent; it also gives very natural interfaces to everyone else.
Hence, after 4 months of hacking and putting bits and pieces together, I have designed a working solution. It is not perfect yet, but you can see demos here:
It may still take some patience to adjust to the interface, but I am super happy to get feedback. The main thing to know is that it is an eye-movement-driven cursor, not strictly a gaze tracker.
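To illustrate that distinction as I understand it (my reading, not the project's actual implementation): an absolute gaze tracker maps the estimated gaze point straight to screen coordinates, while an eye-movement-driven cursor nudges the pointer by the change in eye position, more like a joystick. A toy sketch, with made-up screen size and gain:

    # Toy contrast between absolute gaze mapping and relative,
    # eye-movement-driven cursor control. All numbers are illustrative.
    SCREEN_W, SCREEN_H = 1920, 1080
    GAIN = 3000  # relative-mode sensitivity (assumed)

    def absolute_cursor(gaze_x, gaze_y):
        """Gaze-tracker style: normalized gaze [0,1] maps straight to a pixel."""
        return gaze_x * SCREEN_W, gaze_y * SCREEN_H

    class RelativeCursor:
        """Eye-movement style: the *change* in eye position moves the cursor."""
        def __init__(self):
            self.x, self.y = SCREEN_W / 2, SCREEN_H / 2
            self.prev = None

        def update(self, eye_x, eye_y):
            if self.prev is not None:
                dx, dy = eye_x - self.prev[0], eye_y - self.prev[1]
                self.x = min(max(self.x + dx * GAIN, 0), SCREEN_W)
                self.y = min(max(self.y + dy * GAIN, 0), SCREEN_H)
            self.prev = (eye_x, eye_y)
            return self.x, self.y

The relative style trades absolute accuracy for robustness to calibration drift, which is presumably why it suits webcam-grade input better than a strict gaze-to-pixel mapping.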
The tracker can be deployed to any website/webapp via our API, or you can use a desktop app to deploy it OS-wide (that requires a bit more integration on my side; as a lone engineer, it is simply a matter of limited time and resources).
Additionally, the main link goes to our polar.sh site, but if you want to contact me directly:
piotr.walas@eyegestures.com
or
contact@eyegestures.com