Can webcam eye tracking facilitate readability research?
At the Twenty-Fifth International Joint Conference on Artificial Intelligence (IJCAI 2016), Alexandra Papoutsaki, Patsorn Sangkloy, James Laskey, Nediyana Daskalova, Jeff Huang, and James Hays introduced “WebGazer: Scalable Webcam Eye Tracking Using User Interactions.” The paper details the research and development of WebGazer, a new approach to browser-based eye tracking that uses the common webcams already present in laptops and mobile devices.
To develop WebGazer, the researchers conducted a large-scale remote online study and compared the results with those of participants in a laboratory setting, concluding that this technology can be used to predict an individual’s gaze pattern.
WebGazer is a scalable technology that can lead to web application innovations and advance many research agendas. The authors discuss the numerous applications that webcam eye tracking can enable, including large-scale naturalistic usability studies. They “believe that this work takes a step towards ubiquitous online eye tracking, where scaling to millions of people in a privacy-preserving manner could lead to innovations in web applications and understanding web visitors.”
Implications for Readability
As the authors note, “Eye tracking is a common method for understanding human attention in psychology experiments, human-computer interaction studies, medical research, etc.” Most research uses eye-tracking technology in a lab setting. Webcam eye tracking allows researchers to evaluate readers remotely, enabling studies with larger and more diverse populations. By supporting online research, this technology can facilitate readability studies during periods of remote learning and work, such as those brought on by COVID-19.
Readability Matters looks forward to a future where WebGazer and the other tools this team is developing can advance readability research. The technology measures smooth eye movements and saccades and could reveal meaningful information about an individual’s reading performance (e.g., skipped or repeated words, skipped lines). Combined with other measures, including audio, these technologies can be used to evaluate the impact of typographic settings (character shape, size, and spacing). Such insights are critical to the long-term goal of identifying the personalized reading format that works best for each individual reader.
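To make that idea concrete, here is a minimal sketch of how a stream of gaze samples might be turned into reading events such as skipped lines or re-reading. It is not part of WebGazer; the thresholds and the readingEvents helper are illustrative assumptions, and a real analysis would calibrate them to the text’s font size and line height.

```javascript
// Sketch: derive simple reading events from a sequence of gaze samples
// {x, y, t} (viewport pixels, milliseconds). Threshold values are
// illustrative assumptions, not calibrated constants.
const LINE_HEIGHT_PX = 24;   // assumed line spacing of the text being read
const REGRESSION_PX = 40;    // leftward jump suggesting a re-read

function readingEvents(samples) {
  const events = [];
  for (let i = 1; i < samples.length; i++) {
    const dx = samples[i].x - samples[i - 1].x;
    const dy = samples[i].y - samples[i - 1].y;
    if (dy > 1.5 * LINE_HEIGHT_PX) {
      // Jumped more than one line down: a line may have been skipped.
      events.push({ t: samples[i].t, type: 'possible skipped line' });
    } else if (Math.abs(dy) < LINE_HEIGHT_PX / 2 && dx < -REGRESSION_PX) {
      // Leftward jump within the same line: likely re-reading a word.
      events.push({ t: samples[i].t, type: 'regression (re-read)' });
    }
  }
  return events;
}
```

Note that an ordinary return sweep to the next line (one line down, far left) passes both tests untouched, so only line-skips and within-line regressions are flagged.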
Watch the demo here…
WebGazer was developed based on research done initially at Brown University, with more recent work at Pomona College. The project is open source and hosted on GitHub. Visit the WebGazer website for more information.
WebGazer: Scalable Webcam Eye Tracking Using User Interactions
Alexandra Papoutsaki, Patsorn Sangkloy, James Laskey, Nediyana Daskalova, Jeff Huang, and James Hays
ABSTRACT
We introduce WebGazer, an online eye tracker that uses common webcams already present in laptops and mobile devices to infer the eye-gaze locations of web visitors on a page in real-time. The eye tracking model self-calibrates by watching web visitors interact with the web page and trains a mapping between features of the eye and positions on the screen. This approach aims to provide a natural experience to everyday users that is not restricted to laboratories and highly controlled user studies. WebGazer has two key components: a pupil detector that can be combined with any eye detection library and a gaze estimator using regression analysis informed by user interactions. We perform a large remote online study and a small in-person study to evaluate WebGazer. The findings show that WebGazer can learn from user interactions and that its accuracy is sufficient for approximating the user’s gaze. As part of this paper, we release the first eye tracking library that can be easily integrated in any website for real-time gaze interactions, usability studies, or web research.
Access the full paper here
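The self-calibration idea described in the abstract can be sketched in a few lines: each user interaction (here, a click) is treated as a labeled example pairing the current eye features with a known screen position, and a linear model is nudged toward it. This is a simplified stand-in for the paper’s ridge regression, and extractEyeFeatures() is a hypothetical placeholder for the pupil and eye-patch features WebGazer actually computes.

```javascript
// Simplified sketch of interaction-driven self-calibration: an online
// linear model is updated toward each clicked screen position. The real
// WebGazer fits ridge regression over richer eye features.
const DIM = 8;                    // assumed feature dimension
const LEARNING_RATE = 1e-3;
let wx = new Array(DIM).fill(0);  // weights predicting screen x
let wy = new Array(DIM).fill(0);  // weights predicting screen y

function predict(features) {
  let x = 0, y = 0;
  for (let i = 0; i < DIM; i++) {
    x += wx[i] * features[i];
    y += wy[i] * features[i];
  }
  return { x, y };
}

document.addEventListener('click', (ev) => {
  const f = extractEyeFeatures(); // hypothetical: current eye features
  const p = predict(f);
  // Gradient step toward the clicked location, which serves as the label.
  for (let i = 0; i < DIM; i++) {
    wx[i] += LEARNING_RATE * (ev.clientX - p.x) * f[i];
    wy[i] += LEARNING_RATE * (ev.clientY - p.y) * f[i];
  }
});
```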
ABOUT WebGazer.js: WebGazer.js is an eye tracking library that uses common webcams to infer the eye-gaze locations of web visitors on a page in real time. The eye tracking model it contains self-calibrates by watching web visitors interact with the web page and trains a mapping between the features of the eye and positions on the screen. WebGazer.js is written entirely in JavaScript and, with only a few lines of code, can be integrated into any website that wishes to better understand its visitors and transform its user experience. WebGazer.js runs entirely in the client browser, so no video data needs to be sent to a server. WebGazer.js runs only if the user consents to giving access to their webcam.
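Those few lines of code might look like the following sketch. setGazeListener, begin, and showPredictionPoints are part of WebGazer’s documented API; the console logging is illustrative, and the library script is assumed to have been loaded on the page already.

```javascript
// Minimal sketch: start WebGazer and log its gaze predictions.
// Assumes webgazer.js has already been loaded via a <script> tag.
// begin() prompts the visitor for webcam consent before anything runs.
webgazer.setGazeListener((data, elapsedTime) => {
  if (data == null) return;  // no prediction available for this frame
  // data.x / data.y are the predicted gaze coordinates in viewport pixels.
  console.log(`gaze ~ (${Math.round(data.x)}, ${Math.round(data.y)}) ` +
              `at ${Math.round(elapsedTime)}ms`);
}).begin();

webgazer.showPredictionPoints(true);  // visualize the moving gaze estimate
```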
The calibration example file was developed in the context of a course project aimed at improving WebGazer’s feedback, proposed by Dr. Gerald Weber and his team, Dr. Clemens Zeidler and Kai-Cheung Leung.
This research is supported by NSF grants IIS-1464061, IIS-1552663, a Seed Award from the Center for Vision Research at Brown University, and the Brown University Salomon Award.