ETWEB – Eye Tracking for The Web @ ETRA 2019

ETWEB – Eye Tracking for The Web is a conference track at ETRA 2019, the ACM Symposium on Eye Tracking Research & Applications, June 25-28, 2019 in Denver, Colorado, USA: http://etra.acm.org/2019/etweb.html

The Web offers rich information and services. Users mostly access these on Web sites by interacting with graphical interfaces defined by Web page documents. The design of Web pages and the interaction with them thus have a considerable impact on our daily lives. Both the usability and the accessibility of Web pages are therefore relevant areas of research to make the Web more useful.

Eye gaze is a strong indicator of attention; it provides insights into how a user perceives an interface and helps analysts assess the user experience. Researchers and companies are interested in assessing the attention on certain portions of a Web page, e.g., which sections are read, glanced at, or skipped by users, as well as the usability of the Web page in general. This analysis requires an accurate association between the coordinates of the recorded gaze data and a representation of the Web page as a stimulus. The content of a Web page can be a dynamic stimulus, which is difficult to synchronize between multiple users because of its interactive nature. Hence, the focus of ETWEB is to encourage research on accurate stimulus representations of dynamic Web content, on the mapping of gaze data, and on visualization methods to analyze the usability of Web pages and to understand the Web browsing behavior of end users.
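As a purely illustrative aside (not part of the call itself), a minimal sketch of the mapping problem described above could look as follows in browser-side TypeScript: gaze samples arriving in viewport coordinates are translated into document coordinates and associated with the DOM element rendered at that position. The GazeSample and MappedFixation types, the field names, and the use of scroll offsets and document.elementFromPoint are assumptions made for this example.

    // Illustrative sketch only: associate a gaze sample (viewport coordinates)
    // with the Web page stimulus. Types and field names are assumptions.
    interface GazeSample {
      x: number;         // horizontal position in viewport pixels
      y: number;         // vertical position in viewport pixels
      timestamp: number; // milliseconds since recording start
    }

    interface MappedFixation {
      docX: number;        // position in document coordinates
      docY: number;
      elementPath: string; // rough identifier of the element under gaze
      timestamp: number;
    }

    // Map a viewport gaze sample to document coordinates and to the DOM element
    // currently rendered at that position. Scrolling and dynamic content mean the
    // same viewport position can show different content over time, so the mapping
    // must be done (or be reconstructable) at recording time.
    function mapGazeToStimulus(sample: GazeSample): MappedFixation | null {
      const element = document.elementFromPoint(sample.x, sample.y);
      if (!element) {
        return null; // gaze landed outside the rendered page
      }
      return {
        docX: sample.x + window.scrollX,
        docY: sample.y + window.scrollY,
        elementPath: cssPath(element),
        timestamp: sample.timestamp,
      };
    }

    // Build a coarse CSS-like path so fixations can be compared across users and
    // layouts even when absolute pixel positions differ (e.g., responsive pages).
    function cssPath(element: Element): string {
      const parts: string[] = [];
      let current: Element | null = element;
      while (current && current !== document.documentElement) {
        const id = current.id ? "#" + current.id : "";
        parts.unshift(current.tagName.toLowerCase() + id);
        current = current.parentElement;
      }
      return parts.join(" > ");
    }

Such a mapping is only a starting point; comparing attention across users additionally requires handling content that changes between or during sessions, which is exactly the kind of problem the track invites work on.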

Furthermore, eye tracking research would also benefit Web users with different abilities. Web accessibility means that people with disabilities can perceive, understand, navigate, and interact with the Web, and that they can contribute to it. However, a restricted capability to use conventional input devices (mouse, keyboard, touch, etc.) limits their ability to interact with the Web and thus excludes them from digital information spaces. Applications of eye tracking can break this interaction barrier and improve the quality of life of those with limited ability to communicate. However, most graphical user interfaces for Web access are not designed for use with eye tracking devices, which often have limited accuracy or may require unconventional selection techniques that interfere with access to information. In that regard, we encourage submissions that explore adaptation mechanisms of Web interfaces for gaze interaction, i.e., using gaze signals obtained from eye tracking devices to control the Web application environment.
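As a simple, hypothetical illustration of gaze interaction with a Web interface (again not prescribed by the call), the sketch below implements dwell-time selection: when incoming gaze samples rest on the same interactive element for a configurable duration, a click is emitted on that element. The sample format, the 1000 ms dwell threshold, and the choice of elements treated as interactive are assumptions made for this example.

    // Illustrative sketch only: dwell-time selection for gaze-controlled Web
    // interfaces. Sample format and thresholds are assumptions.
    const DWELL_TIME_MS = 1000; // how long gaze must rest on an element to select it

    let dwellTarget: Element | null = null;
    let dwellStart = 0;

    // Feed this function with gaze samples in viewport coordinates
    // (e.g., from a remote eye tracker or a webcam-based estimator).
    function onGazeSample(x: number, y: number, timestamp: number): void {
      const element = document.elementFromPoint(x, y);
      const interactive = element
        ? element.closest("a, button, input, [role='button']")
        : null;

      if (interactive !== dwellTarget) {
        // Gaze moved to a different (or no) interactive element: restart the timer.
        dwellTarget = interactive;
        dwellStart = timestamp;
        return;
      }

      if (dwellTarget && timestamp - dwellStart >= DWELL_TIME_MS) {
        // Gaze rested long enough: trigger the element and reset.
        (dwellTarget as HTMLElement).click();
        dwellTarget = null;
      }
    }

Dwell time is only one of several possible selection techniques; the accuracy limits and unconventional selection methods mentioned above are precisely why adapted interfaces, e.g., with larger targets or confirmation steps, remain an open research topic.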

Topics

ETWEB will cover topics related to the Web (interface semantics extraction, interaction adaptation, etc.) and to eye tracking (attention visualization, crowdsourcing, etc.). We particularly welcome submissions that address the following topics with an emphasis on the relationship between eye tracking and the Web:

  • Novel methods for eye tracking in challenging Web scenarios
  • Website usability analysis techniques using eye movement data
  • Enabling usability optimization with eye tracking on dynamic Web content
  • Enhancing the user experience on the Web through an easy and complete understanding of user behavior
  • Understanding eye-tracking data for re-engineering Web pages
  • Eye tracking scanpath analysis techniques on Web pages
  • Analyzing Web search and browsing behaviors through gaze patterns
  • Social media browsing behavior analysis
  • Correlating mouse clicks and gaze data with Web browsing behavior
  • Gaze-based Web usability studies via crowdsourcing approaches
  • Standardized metrics for evaluating interactions and usability
  • Corpus of eye tracking ground truth data on Web pages
  • Eye tracking interaction techniques to assist people with disabilities
  • Multimodal interaction with the Web (gaze, mouse, voice, touch, EEG, etc.)
  • Interactive annotation of gaze and Web stimulus data
  • Techniques to integrate eye gaze as an input element in Web development
  • Reports of eye tracking studies evaluating Web accessibility and usability

Submission

Authors are invited to submit their work complying with the ETRA short and long paper formats. Long papers (8 pages) are intended for more mature research with evaluation results; short papers (4 pages) are intended for work in progress, as well as position papers identifying the challenges of gaze-based Web interaction and analysis. Papers should be submitted electronically in PDF format via the ETRA submission system; please select the ETWEB track. Accepted papers are considered regular ETRA publications and will be part of the ETRA proceedings (ACM Digital Library). A footnote on the first page will indicate that the paper was part of ETWEB. The submission process follows that of ETRA (http://etra.acm.org/2019/authors.html), but with a separate program committee. At least one author of each accepted ETWEB paper must register for the ETRA conference. Participants are free to attend all ETRA tracks.

ETWEB Important Dates

  • February 22, 2019 – Full papers and short papers due
  • March 22, 2019 – Author notifications
  • March 29, 2019 – Camera-ready papers due

Organizers

  • Chandan Kumar, Institute WeST, University of Koblenz, Germany
  • Raphael Menges, Institute WeST, University of Koblenz, Germany
  • Sukru Eraslan, METU Northern Cyprus Campus

Program Committee

  • Alexandra Papoutsaki, Pomona College, USA
  • Jacek Gwizdka, University of Texas, USA
  • Scott MacKenzie, York University, Canada
  • Simon Harper, University of Manchester, UK
  • Caroline Jay, University of Manchester, UK
  • Victoria Yaneva, University of Wolverhampton, UK
  • Yeliz Yeşilada, METU Northern Cyprus Campus
  • Marco Porta, University of Pavia, Italy
  • Spiros Nikolopoulos, CERTH ITI, Greece
  • Korok Sengupta, University of Koblenz, Germany
  • Steffen Staab, University of Koblenz, Germany

Contact

Chandan Kumar, Institute for Web Science and Technologies, University of Koblenz, kumar@uni-koblenz.de
http://chandankumar.net
