COGAIN 2019

Submission deadline

  • Abstracts due: Jan 25th, 2019 (extended)       
  • Papers due: Jan 25th, 2019 (extended)   
  • Feedback: Feb 18th, 2019   
  • Rebuttals: Feb 25th, 2019   
  • Decisions: Mar 4th, 2019   
  • Camera-ready due: Mar 29th, 2019

CALL FOR PAPERS   

The Symposium on Communication by Gaze Interaction organized by the COGAIN Association (http://cogain.org) will be co-located with ETRA 2019, the ACM Symposium on Eye Tracking Research & Applications. ETRA 2019 will take place in Denver, Colorado, June 25-28.   

Following the successful concept used at the COGAIN symposium 2018, COGAIN 2019 will again be organized as a “special session” at ETRA. By combining our efforts with ETRA, we hope to encourage a broader exchange of knowledge and experiences among the communities of researchers, developers, manufacturers, and users of eye trackers.   

We invite authors to prepare and submit papers following ETRA’s ACM format (http://etra.acm.org/2019/authors.html). Long papers are up to 8 pages (+ 2 additional pages for references); short papers are up to 4 pages (+ 2 additional pages for references). During the submission process to ETRA 2019, you will be asked whether you would like your paper to be presented at the COGAIN Symposium. All papers accepted for the COGAIN Symposium will be published as part of the ETRA 2019 ACM Proceedings.   

The COGAIN Symposium focuses on all aspects of gaze interaction, with special emphasis on eye-controlled assistive technology. The symposium will present advances in these areas, leading to new capabilities in gaze interaction, gaze-enhanced applications, gaze-contingent devices, etc. Topics of interest include all aspects of gaze interaction and communication by gaze, including but not limited to:

  • Eye-controlled assistive technology
  • Eye-typing
  • Gaze-contingent devices
  • Gaze-enhanced games
  • Gaze-controlled robots and vehicles
  • Gaze interaction with mobile devices
  • Gaze-controlled smart-home devices
  • Gaze interfaces for wearable computing
  • Gaze interaction in 3D (VR/AR/MR & real world)
  • Gaze interaction paradigms
  • Usability and UX evaluation of gaze-based interfaces
  • User context estimation from eye movements
  • Gaze-supported multimodal interaction (gaze with multitouch, mouse, gesture, etc.)   

The Program Committee will select the COGAIN 2019 best paper.

ORGANIZATION   

General co-chairs      

  • John Paulin Hansen   [Technical University of Denmark, Denmark]
  • Päivi Majaranta      [Tampere University, Finland]     

Program co-chairs

  • Diako Mardanbegi     [Lancaster University, United Kingdom]
  • Ken Pfeuffer         [Bundeswehr University Munich, Germany]

==============================================================
Please visit http://cogain2019.cogain.org for more information
or contact us by email to cogain2019@cogain.org
==============================================================

ETWEB – Eye Tracking for The Web @ ETRA 2019

ETWEB – Eye Tracking for The Web is a conference track at ETRA 2019, the ACM Symposium on Eye Tracking Research & Applications, June 25-28, 2019, in Denver, Colorado, USA: http://etra.acm.org/2019/etweb.html.

The Web offers rich information and services, which users mostly access through interaction with the graphical interfaces defined by Web page documents. The design of Web pages and the interaction with them thus have a considerable impact on our daily lives. Both the usability and the accessibility of Web pages are therefore relevant areas of research for making the Web more useful.

Eye gaze is a strong indicator of attention: it provides insights into how a user perceives an interface and helps analysts assess the user experience. Researchers and companies are interested in assessing attention on certain portions of a Web page, e.g., which sections are read, glanced at, or skipped by users, and in Web page usability in general. Such analysis requires an accurate association between the coordinates of the recorded gaze data and a representation of the Web page as a stimulus. The content of a Web page may be a dynamic stimulus, which is difficult to synchronize across multiple users because of its interactive nature. Hence, the focus of ETWEB is to encourage research on accurate stimulus representations of dynamic Web pages, on mapping and visualization methods for gaze data to analyze the usability of Web pages, and on understanding the Web browsing behavior of end users.
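
As a minimal illustration of the gaze-to-stimulus association described above, the following sketch maps recorded gaze samples onto areas of interest (AOIs) of a static page snapshot. The AOI names, coordinates, and sample points are hypothetical; a real pipeline would derive AOI boxes from the rendered DOM and handle scrolling and dynamic content.

```python
# Illustrative sketch: assign gaze samples (in page coordinates) to AOIs.
# AOIs and samples below are made-up example data, not tied to any real page.

def map_gaze_to_aois(gaze_samples, aois):
    """Count gaze samples per AOI.

    gaze_samples: list of (x, y) points in page coordinates.
    aois: list of (name, left, top, width, height) bounding boxes.
    """
    counts = {name: 0 for name, *_ in aois}
    for x, y in gaze_samples:
        for name, left, top, w, h in aois:
            if left <= x < left + w and top <= y < top + h:
                counts[name] += 1
                break  # assign each sample to at most one AOI
    return counts

# Hypothetical page layout and recorded samples:
aois = [("header", 0, 0, 800, 100),
        ("article", 0, 100, 600, 700),
        ("sidebar", 600, 100, 200, 700)]
samples = [(400, 50), (300, 400), (700, 300), (320, 420)]
print(map_gaze_to_aois(samples, aois))  # {'header': 1, 'article': 2, 'sidebar': 1}
```

Per-AOI sample counts (or dwell times, if samples carry timestamps) are the basis for the "read, glanced at, or skipped" distinctions mentioned above; the hard part in practice is keeping the AOI boxes in sync with a dynamic, scrollable page, which is exactly the research gap ETWEB targets.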

Furthermore, eye tracking research can also benefit Web users with different abilities. Web accessibility means that people with disabilities can perceive, understand, navigate, and interact with the Web, and that they can contribute to the Web. However, a restricted capability to use conventional input (mouse, keyboard, touch, etc.) limits such users' ability to interact with the Web and thus excludes them from digital information spaces. Applications of eye tracking can break this interaction barrier and improve the quality of life of those with a limited ability to communicate. However, most graphical user interfaces for Web access are not designed for use with eye tracking devices, which often have limited accuracy or may require unconventional selection techniques that interfere with access to information. In that regard, we encourage submissions that explore adaptation mechanisms of Web interfaces for gaze interaction (i.e., using gaze signals obtained from eye tracking devices to control the Web application environment).
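
One widely used selection technique in gaze-controlled interfaces is dwell-time selection: an element is activated once gaze has rested on it for a threshold duration. The sketch below is a simplified illustration of the idea; the threshold, element ids, and sample stream are assumptions, not part of any particular system.

```python
# Illustrative sketch of dwell-based selection. Each gaze sample is a
# (timestamp_ms, element_id) pair, where element_id is whatever element the
# gaze point currently hits (None if no element). Values are hypothetical.

DWELL_MS = 800  # assumed dwell threshold; real systems tune this per user/task

def detect_dwell_selections(samples, dwell_ms=DWELL_MS):
    """Return element ids selected by dwelling, in order of selection."""
    selections = []
    current, start = None, None
    for t, elem in samples:
        if elem != current:
            current, start = elem, t          # gaze moved to a new element
        elif elem is not None and t - start >= dwell_ms:
            selections.append(elem)           # dwell threshold reached
            current, start = None, None       # reset: one dwell = one selection
    return selections

# Hypothetical gaze stream: a dwell on link_a, then a brief glance at link_b.
stream = [(0, "link_a"), (400, "link_a"), (900, "link_a"),
          (1000, "link_b"), (1200, None), (1300, "link_b")]
print(detect_dwell_selections(stream))  # ['link_a']
```

The tension the paragraph above describes shows up directly here: a short threshold causes unintended selections (the "Midas touch" problem), while a long one slows interaction, which is why adaptive interface layouts for gaze input are an open research topic.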

Topics

ETWEB will cover topics that are related to Web (interface semantics extraction, interaction adaptation, etc.) and eye tracking (attention visualization, crowdsourcing, etc.). We particularly welcome submissions that address the following topics with an emphasis on the relationship between eye tracking and the Web:

  • Novel methods for eye tracking in challenging Web scenarios
  • Website usability analysis techniques using eye movement data
  • Enabling usability optimization with eye tracking on dynamic Web content
  • Enhancing the user experience on the Web by enabling an easy and complete understanding of user behavior
  • Understanding eye-tracking data for re-engineering Web pages
  • Eye Tracking scanpath analysis techniques on Web Pages
  • Analyzing Web search and browsing behaviors through gaze pattern
  • Social media browsing behavior analysis
  • Correlating mouse clicks and gaze data with Web browsing behavior
  • Gaze-based Web usability studies via crowdsourcing approaches
  • Standardized metrics for evaluating interactions and usability
  • Corpus of eye tracking ground truth data on Web pages
  • Eye tracking interaction techniques to assist people with disabilities
  • Multimodal interaction with Web (gaze, mouse, voice, touch, EEG etc.)
  • Interactive annotation of gaze and Web stimulus data
  • Techniques to integrate eye gaze as an input element in Web development
  • Reports of eye tracking studies evaluating Web accessibility and usability

Submission

Authors are invited to submit their work complying with the ETRA short and long paper formats. Long papers (8 pages) are intended for more mature research with evaluation results; short papers (4 pages) are intended for work in progress, as well as position papers identifying the challenges of gaze-based Web interaction and analysis. Papers should be submitted electronically in PDF format to ETWEB via the ETRA submission system; please select the ETWEB track. Accepted papers are considered regular ETRA publications and will be part of the ETRA proceedings (ACM Digital Library). A footnote on the first page will indicate that your paper was part of ETWEB. The submission process follows that of ETRA (http://etra.acm.org/2019/authors.html), but with a separate program committee. At least one author of each accepted ETWEB paper must register for the ETRA conference. Participants will be free to attend all ETRA tracks.

ETWEB Important Dates

  • February 22, 2019 – Full papers and short papers due
  • March 22, 2019 – Author notifications
  • March 29, 2019 – Camera-ready papers due

Organizers

  • Chandan Kumar, Institute WeST, University of Koblenz, Germany
  • Raphael Menges, Institute WeST, University of Koblenz, Germany
  • Sukru Eraslan, METU Northern Cyprus Campus

Program Committee

  • Alexandra Papoutsaki, Pomona College, USA
  • Jacek Gwizdka, University of Texas, USA
  • Scott MacKenzie, York University, Canada
  • Simon Harper, University of Manchester, UK
  • Caroline Jay, University of Manchester, UK
  • Victoria Yaneva, University of Wolverhampton, UK
  • Yeliz Yeşilada, METU Northern Cyprus Campus
  • Marco Porta, University of Pavia, Italy
  • Spiros Nikolopoulos, CERTH ITI, Greece
  • Korok Sengupta, University of Koblenz, Germany
  • Steffen Staab, University of Koblenz, Germany

Contact

Chandan Kumar, Institute for Web Science and Technologies, University of Koblenz, kumar@uni-koblenz.de
http://chandankumar.net

IEEE Virtual Reality – Joint Workshops on Perception-driven Graphics and Displays for VR and AR & Eye Tracking and Vision Augmentation

23-27 March 2019
Osaka, Japan

CALL FOR PAPERS

Important dates:
Workshop paper submission: January 25, 2019
Notification of acceptance: February 1, 2019
All deadlines are due at 9:00 pm Pacific Time (PDT/PST)

Website

https://sites.google.com/view/pergravarworkshop/home

Description

This year, we will be holding a series of workshops with the common topics of perception, graphics, and augmentation. As such, the PerGravAR and VisAug workshops will be held back-to-back, with several shared sessions and/or keynotes.
We solicit original research papers in the area of perception-driven graphics and perceptual displays.

The goal of the PerGravAR workshop is to create a better understanding of the various techniques and systems that exploit limitations of, or address the potential of, the human visual system to create a more intense or comprehensible visual experience in VR and AR. As display technology progresses, pixel densities and dynamic range increase; at the same time, refresh rates are getting higher and higher, and display latencies are continually being reduced. Novel technologies such as displays with a growing number of displayed views per pixel (ranging from stereo and multi-view to holographic or light-field displays) are currently advancing beyond the prototype stage. Likewise, multi-layered displays and adaptable lenses are being integrated into head-mounted devices, and prototypes for retinal displays and bionic contact lenses have begun to emerge. All these advances in display technology will tighten the requirements on image synthesis techniques. This workshop aims to bring together a group of experts to discuss perceptual findings, identify challenges, and present the latest research in the field of perception-driven rendering and computational displays.

The VisAug workshop is designed to cover both eye tracking and vision augmentation technologies as they pertain to Augmented and Virtual Reality. Eye tracking is closely tied to a number of AR applications such as diagnosis, interaction with content, and enhancement of human vision. The workshop will include a review of many of these technologies, along with a series of presentations on state-of-the-art research, ranging from visualization to optical modification to vision correction in the fields of MR/AR/VR. The workshop will also include a panel/discussion session in which participants can engage with leading experts in the field. 

For these workshops, we expect researchers to submit early work, such as initial analyses of user studies, perceptual findings, experimental rendering techniques, or sketches for novel devices. Position papers of several pages that summarize a range of previous approaches (literature reviews), perceptual findings, or experiences also fall within the scope of the workshops. Papers should be between 2 and 6 pages in length and may cover one or more of the following topics:

For PerGravAR

  • Novel display devices for VR and AR
  • Rendering and information visualization methods that exploit perceptual issues
  • Rendering and information visualization methods that target specific perceptual potentials
  • Gaze-contingent rendering and interaction techniques
  • Studies that provide insights into perception and cognition processes
  • Saliency and attention models and findings
  • Perception-driven image metrics
  • Perceptual issues of image synthesis techniques
  • Just noticeable differences, signal thresholds, and biases
  • Validation methodologies, benchmarks and measurement methods, including eye tracking
  • Novel measurement and processing techniques such as autorefractors or wavefront sensing
  • Experimental designs and techniques for conducting user studies
  • Systems, findings and general issues related to:
    • Depth-of-Field
    • Vergence-accommodation conflict
    • Stereo disparity manipulation
    • Wide Field-of-Views

For VisAug

  • Vision assistance, correction, training, and enhancement
  • Head-mounted display technologies
  • Oculography, eye tracking, and pupillometry
  • Extended vision/ Superhuman vision
  • Optics hardware and software

All submitted papers will go through a two-stage review process to guarantee the publication of high-quality papers. All accepted IEEE VR workshop papers will be published electronically through the IEEE Digital Library.

Submissions

Papers are to be submitted online through the Easychair system
PerGravAR
https://easychair.org/conferences/?conf=pergravar2019
VisAug
https://easychair.org/conferences/?conf=visaug2019

Best wishes,
Martin Weier, Kaan Aksit, Jason Orlosky, Yuta Itoh, Praneeth Chakravarthula and Chang Liu

PhD-course: Eye-tracking in social science research projects

The qualifications and skills obtained during master's programs often do not adequately prepare students to conduct eye-tracking studies, to avoid potential pitfalls when using eye-tracking equipment, or to analyze complex eye-tracking datasets. Especially at the beginning of a PhD project, these challenges can appear overwhelming. PhD students completing the course will gain an overview of research on bottom-up and top-down attentional processes and search in decision-making. We will give an overview of the latest developments in the field, including learning and contextual biases in decision sequences and the evaluation of decision theories. From a practical perspective, PhD students will gain insight into the process of setting up eye-tracking experiments, conduct a first empirical study on their own, and analyze an eye-tracking dataset. PhD students will have the opportunity to use remote eye-tracking devices together with their own laptops and to use the provided software to analyze their datasets. Based on this experience, students will be able to critically reflect on their experimental work and improve the planning of their own future experiments. Moreover, PhD students will learn about ways of analyzing eye-tracking data, for example using multi-level regression models.

*More information can be found at: http://tinyurl.com/phdcourse-eyetracking*

Call for Participation: PhD Course on Using Eye-tracking in Social Science Research Projects

Course Coordinator:

  • Associate Professor Dr. Martin Meißner, University of Southern Denmark, Department of Environmental and Business Economics, Esbjerg, Denmark

Lecturers:

  • Associate Professor Dr. Martin Meißner, University of Southern Denmark, Department of Environmental and Business Economics, Esbjerg, Denmark.
  • Assistant Professor Dr. Jacob Orquin, Aarhus University, Department of Business Administration, Aarhus, Denmark.
  • Assistant Professor Dr. Jella Pfeiffer, Karlsruhe Institute of Technology, Institute of Information Systems and Marketing, Karlsruhe, Germany.
  • Assistant Professor Dr. Thies Pfeiffer, Bielefeld University, Cognitive Interaction Technology Center of Excellence, Bielefeld, Germany.

Time:

  • Monday, September 14th to Friday, September 18th, 2015.

Description:

Much of the rapid growth of research on attention, and especially on eye-tracking, has been driven by fast technological development in recent years and a sharp decline in the cost of eye-tracking equipment. Remote, head-mounted, portable, and mobile devices can now be used in many PhD projects, making it possible to generate larger samples of respondents in new (decision) environments.

Eye-tracking makes it possible to track and study attentional processes in great detail, classically in front of computer screens but also in mobile contexts, for example when using digital devices like smartphones or smart glasses (Google Glass, EPSON Moverio) to study purchasing behavior in retail stores.

The qualifications and skills obtained during master's programs often do not adequately prepare students to conduct eye-tracking studies, to avoid potential pitfalls when using eye-tracking equipment, or to analyze complex eye-tracking datasets. Especially at the beginning of a PhD project, these challenges can appear overwhelming.

PhD students completing the course will gain an overview of research on bottom-up and top-down attentional processes and search in decision-making. We will give an overview of the latest developments in the field, including learning and contextual biases in decision sequences and the evaluation of decision theories. From a practical perspective, PhD students will gain insight into the process of setting up eye-tracking experiments, conduct a first empirical study on their own, and analyze an eye-tracking dataset. PhD students will have the opportunity to use remote eye-tracking devices together with their own laptops and to use the provided software to analyze their datasets. Based on this experience, students will be able to critically reflect on their experimental work and improve the planning of their own future experiments. Moreover, PhD students will learn about ways of analyzing eye-tracking data, for example using multi-level regression models.

Course Content:

The following topics will be part of the course:

  • Eye-tracking basics
  • Visual attention and search in decision making
  • Eye-tracking measures and their meaning (pupil dilation, fixation duration, eye blinks, saccadic distances)
  • Handling and management of eye-tracking data
  • Mobile eye-tracking equipment and annotation of fixations
  • OpenSource eye-tracking software
  • Alternative process-tracing techniques (Mouselab, Think aloud)
  • Analysis of eye-tracking data: An overview of different analytical approaches and examples for the use of more advanced (multi-level) methods
  • Hands-on experiment with portable eye-tracking equipment (SMI Smart Glasses): Setup of a small experiment using low-frequency, portable eye-trackers to record data, analysis of the dataset, presentation of first results in class.
  • Hands-on mobile eye-tracking equipment: Track a short sequence with the mobile equipment
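
Several of the measures listed above (fixation duration, saccadic distance) are derived from raw gaze samples by a fixation-detection step. As a minimal illustrative sketch, the following implements dispersion-based detection (the classic I-DT algorithm); the thresholds and sample data are assumptions for illustration, not course material.

```python
# Illustrative I-DT (dispersion-threshold) fixation detection: a window of
# samples counts as a fixation if its spatial dispersion stays small and it
# lasts long enough. Thresholds below are typical ballpark values, not canon.
from math import hypot

def idt_fixations(samples, max_dispersion=30.0, min_duration_ms=100):
    """samples: list of (t_ms, x, y), time-ordered.
    Returns fixations as (start_ms, end_ms, centroid_x, centroid_y)."""
    fixations = []
    i = 0
    while i < len(samples):
        j = i
        # Grow the window while dispersion (x-range + y-range) stays small.
        while j + 1 < len(samples):
            xs = [s[1] for s in samples[i:j + 2]]
            ys = [s[2] for s in samples[i:j + 2]]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        if samples[j][0] - samples[i][0] >= min_duration_ms:
            window = samples[i:j + 1]
            cx = sum(s[1] for s in window) / len(window)
            cy = sum(s[2] for s in window) / len(window)
            fixations.append((samples[i][0], samples[j][0], cx, cy))
            i = j + 1
        else:
            i += 1
    return fixations

def saccade_amplitudes(fixations):
    """Euclidean distances between consecutive fixation centroids."""
    return [hypot(b[2] - a[2], b[3] - a[3])
            for a, b in zip(fixations, fixations[1:])]

# Synthetic demo: 120 ms at (100, 100), then a saccade to (300, 100).
demo = ([(t, 100.0, 100.0) for t in range(0, 140, 20)] +
        [(t, 300.0, 100.0) for t in range(140, 280, 20)])
fx = idt_fixations(demo)
print(len(fx))                 # 2 fixations
print(saccade_amplitudes(fx))  # [200.0]
```

Fixation durations follow directly as end minus start time for each detected fixation; choices such as the dispersion metric and thresholds materially change the resulting measures, which is one of the analysis pitfalls the course addresses.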

Course Format:

The course has a lecture/discussion format and a hands-on experimental component. The interactive lectures will focus on the theoretical background of visual attention and search in the context of decision-making. In a hands-on practical exercise, PhD students will set up a small eye-tracking experiment and use eye-tracking equipment to record eye movements. Students can then use the provided open source software for analyzing the data, as well as other (open source) statistical software packages of their choice. Finally, they will present their first results in class. The practical part can take place in a classroom. PhD students will be able to use their own laptops in combination with a portable plug-in low-frequency eye-tracking device. Moreover, we will also bring mobile eye-tracking equipment to the class so that PhD students can get familiar with new mobile eye-tracking technologies, existing open source software, and the potential pitfalls of these new devices.

Learning Objectives:

After completing the course, PhD students will have:

  • an understanding of problems associated with conducting eye-tracking experiments using different sorts of equipment.
  • an understanding of the data generating process.
  • an ability to assess the prospects and limits of their own empirical research.
  • an ability to set up eye-tracking experiments on their own while avoiding serious pitfalls related to the use of eye-tracking technology.
  • an understanding of the various ways in which eye-tracking data can be analyzed.
  • an understanding of state-of-the-art theories of attention and search.

Prerequisites:

This PhD course is targeted at PhD students from business (particularly marketing), psychology, experimental economics, information systems, and other social sciences who are planning or starting an empirical research project using eye-tracking or other process-tracing approaches. Basic knowledge (master's level) of statistics as well as familiarity with statistical software packages like SPSS, SAS, Stata, R, or other programs is desirable but not a precondition.

Evaluation:

Certificates of completion will be issued based on class attendance and participation, the submitted assignments, and an oral presentation.

Each student must submit a description (max. 2,500 words) of the (potential) eye-tracking or process-tracing part of his/her PhD project. The description should include: (1) a short introduction; (2) (preliminary) research question(s); (3) a detailed description of the data or data collection process; (4) a detailed description of the planned experiments; and (5) key references. During the PhD course each student will be asked to present: (a) a short description of his/her research project; (b) the relation of the PhD project to existing eye-tracking research, the theoretical background, and the chosen or planned experiments; (c) arguments why the proposed methodology to analyze the data is appropriate.

ECTS: 5
Teaching language: English
Fee: none

Participation:

To apply to the course, please send an e-mail – no later than 21.08.2015 – to Martin Meißner (meissner@sam.sdu.dk).
Do you have questions about the course? Please contact Martin Meißner (meissner@sam.sdu.dk) or Jella Pfeiffer (jella.pfeiffer@kit.edu).