***********************************************************************
* Fifth Workshop on Eye Tracking and Visualization (ETVIS 2020)       *
* June 2-5, 2020, Stuttgart, Germany                                  *
* in conjunction with ACM ETRA 2020                                   *
* https://etra.acm.org/2020                                           *
***********************************************************************
* Contact: firstname.lastname@example.org                             *
***********************************************************************

________________
IMPORTANT DATES
________________

● Submission Due: Feb 10, 2020
● Notification: Mar 19, 2020
● Camera Ready: Apr 2, 2020

All accepted papers will be published by ACM as part of the Short Paper Proceedings of ETRA 2020.

______
SCOPE
______

Technological advances in computer vision algorithms and sensor hardware have greatly reduced the implementation and financial costs of eye tracking. It is therefore unsurprising to witness a significant increase in its use as a research tool in fields beyond the traditional domains of biological vision, psychology, and neuroscience, in particular in visualization and human-computer interaction research. One of the key challenges lies in the analysis, interaction, and visualization of complex spatio-temporal datasets of gaze behavior, which is further complicated by complementary datasets such as semantic labels, user interactions, and/or accompanying physiological sensor recordings. Ultimately, the research objective is to allow eye tracking data to be effectively interpreted in terms of the observer's decision-making and cognitive processes. To achieve this, it is necessary to draw upon our current understanding of gaze behavior across various related fields, from vision and cognition to visualization. Altogether, eye tracking is an important field to be understood, be it in the sense of data analysis and visualization, interaction, or user-based evaluation of visualization.
_______
TOPICS
_______

Manuscripts are solicited on the following topics, with an emphasis on the relationship between eye tracking and visualization, including but not limited to:

- Visualization techniques for eye-movement data (incl. spatio-temporal visualization, evolution of gaze patterns, visual analysis of individual behavior, 2D vs. 3D representations of eye-movement data)
- Visual analytics of gaze behavior (incl. visual data mining, aggregation, clustering techniques, and metrics for eye-movement data)
- Eye-movement data provenance
- Standardized metrics for evaluating gaze interactions with visualization
- Cognitive models for gaze interactions with visualizations
- Novel methods for eye tracking in challenging visualization scenarios
- Uncertainty visualization of gaze data
- Interactive annotation of gaze and stimulus data
- Systems for the visual exploration of eye-movement data
- Eye-tracking studies that evaluate visualization or visual analytics
- Eye tracking in non-WIMP visualization environments, including mobile eye tracking, mobile devices, and large displays
- Visualization of eye-tracking data in mixed and virtual reality (3DoF & 6DoF, 360°)
- Visualization applications that rely on eye tracking as an input parameter

____________
SUBMISSIONS
____________

Authors are invited to submit original work complying with the ETRA SHORT PAPER format (up to 4 pages + references). Papers should be submitted electronically in PDF format to ETVIS through the ETRA submission system: https://new.precisionconference.com/user/login?society=etra

Please also ensure that the Author Guidelines (https://www.siggraph.org//learn/instructions-authors for SIG-sponsored events [sigconf]) are met prior to submission.

__________
ORGANIZERS
__________

● Kuno Kurzhals, ETH Zurich, kunok[at]ethz.ch
● Sophie Stellmach, Microsoft Research, sostel[at]microsoft.com
● Vsevolod Peysakhovich, ISAE-SUPAERO, vsevolod.peysakhovich[at]isae-supaero.fr
ETWEB - Eye Tracking for the Web
A co-located event at the ACM Symposium on Eye Tracking Research & Applications, ETRA 2020, June 2-5, 2020 in Stuttgart, Germany.
http://etra.acm.org/2020/etweb.html

Important Dates
February 15 - Paper submission
March 10 - Feedback
March 15 - Rebuttals
March 22 - Decisions
April 2 - Camera Ready

Scope
The Web offers rich information and services that have a considerable impact on our daily lives. Enhancing the usability and accessibility of Web interaction are relevant areas of research to make the Web more useful for end users. ETWEB will cover topics related to the Web (interface semantics extraction, interaction adaptation, etc.) and eye tracking (attention visualization, crowdsourcing, etc.). We particularly welcome submissions that address the following topics, with an emphasis on the relationship between eye tracking and the Web:

- Web site usability analysis techniques using eye movement data
- Standardized metrics for evaluating interactions and usability
- Enabling usability optimization with eye tracking on highly dynamic Web content
- Enhancing the user experience on the Web by enabling an easy and complete understanding of a user and her behavior
- Understanding eye-tracking data for re-engineering Web pages
- Gaze data visualization on Web stimuli
- Analyzing Web search and browsing behaviors through gaze patterns
- Social media browsing behavior analysis
- Correlating mouse clicks and gaze data with Web browsing behavior
- Gaze-based Web usability studies via crowdsourcing approaches
- Corpora of eye tracking ground truth data on Web pages
- Eye tracking interaction techniques to assist people with disabilities
- Multimodal interaction with the Web (gaze, mouse, voice, touch, EEG, etc.)
- Interactive annotation of gaze and Web stimulus data
- Techniques to integrate eye gaze as an input element in Web development
- Reports of eye tracking studies evaluating Web accessibility and usability

Submission
Authors are invited to submit their work complying with the ETRA short and long paper format: http://etra.acm.org/2020/cfp.html. Long papers (6-8 pages) are intended for more mature research with evaluation results. Short papers (2-4 pages) are suitable for work in progress, demo papers, and position papers identifying the challenges of gaze-based Web interaction and analysis. References do not count toward the page limit. Papers should be submitted electronically in PDF format to ETWEB through the ETRA Precision Conference System (PCS): https://new.precisionconference.com/user/login?society=etra. At least one author of each accepted ETWEB paper must register for the ETRA conference.

Organizers
Chandan Kumar, University of Koblenz, Germany
Raphael Menges, University of Koblenz, Germany
Yeliz Yeşilada, METU Northern Cyprus Campus
*First Call for Papers*
*ET4S 2020 – Eye Tracking for Spatial Research 2020*
Co-located event at ETRA 2020, June 02-05, 2020, in Stuttgart, Germany
http://www.spatialeyetracking.org/
email@example.com

Eye tracking has become a popular method for investigating research questions related to geographic space and spatial data. This includes studies on how people interact with geographic information systems, studies on how space is perceived in decision situations, and using gaze as an input modality for spatial human-computer interaction. As an event co-located with ETRA 2020, Eye Tracking for Spatial Research (ET4S) aims to bring together researchers from different fields who share a common interest in using eye tracking for research questions related to visuospatial information processing and spatial decision-making. After four successful ET4S events in 2013, 2014, 2018, and 2019, the 5th edition of ET4S will be organized as a co-located event at ETRA 2020 (https://etra.acm.org/2020/index.html), the ACM Symposium on Eye Tracking Research & Applications.
Topics of interest include, but are not limited to:

* Gaze-based Interaction with Maps and Other Spatial Visualizations
* Evaluation of Cartographic and Other Spatial Visualizations with Eye Tracking
* Gaze-aware Mobile Assistance and Location-based Services
* Navigation Studies, Wayfinding, and Eye Tracking
* Eye Tracking in Traffic Research (e.g., Car Navigation, Public Transport, Aviation)
* Visual Perception and Exploration of (Indoor and Outdoor) Space
* Landscape Perception
* Visuospatial Cognition Research
* Gaze During Spatial and Spatio-temporal Decision Making
* Spatio-temporal Analysis and Visualization of Eye Tracking Data
* Individual and Group Differences in Visuospatial Information Processing
* Eye Tracking in XR (VR/MR/AR) for Spatial Research
* Eye Tracking in 3D Space
* Multi-user Eye Tracking for Collaborative Spatial Decision Making

*Submission Guidelines*

We call for regular and work-in-progress papers. Submissions should be prepared following the sigconf instructions. To prepare the content of the PDF file, SIGGRAPH encourages authors to use the LaTeX and Word templates available at https://www.acm.org/publications/authors/submissions. At least one author of each accepted ET4S paper must register for the ETRA conference.

/*Regular papers*/ should present original and novel work demonstrating advances in methodological, theoretical, experimental, or practical aspects of ET4S topics on up to 6 pages (+2 pages maximum for references only). Regular papers are expected to present fully analyzed data or accomplished work. All accepted regular papers are presented orally at the ET4S event.

/*Work-in-progress papers*/ are suitable for work on ET4S-related topics that has progressed beyond the planning stage, but where feedback and discussion at the ET4S event can still make a contribution (e.g., a pilot for a planned study has been performed, or a study has been conducted but not fully analyzed).
Work-in-progress papers can be up to 4 pages long (+2 pages maximum for references only). Work-in-progress papers will be presented at the ET4S event either with a short oral presentation or as a poster.

All papers should be submitted through Precision Conference (PCS, https://new.precisionconference.com/user/login?society=etra) and are subject to a single review round. In PCS, please make sure you choose the track "ETRA 2020 Eye Tracking for Spatial Research (ET4S)".

*Important Dates*

Submission deadline: February 21st, 2020
Reviews and notifications: March 26th, 2020
Camera-ready deadline: April 2nd, 2020

*ET4S Program Committee*

* Gennady Andrienko (Fraunhofer IAIS/City University London, Germany/UK)
* Roman Bednarik (University of Eastern Finland, Finland)
* Annina Brügger (University of Zurich, Switzerland)
* Florian Daiber (German Research Center for Artificial Intelligence (DFKI), Germany)
* Weihua Dong (Beijing Normal University, China)
* Fabian Göbel (ETH Zurich, Switzerland)
* Amy Griffin (Royal Melbourne Institute of Technology, Australia)
* Krzysztof Krejtz (SWPS University of Social Sciences and Humanities, Poland)
* Jakub Krukar (ifgi, University of Münster, Germany)
* Thomas Kübler (University of Tübingen, Germany)
* Tiffany C.K. Kwok (ETH Zurich, Switzerland)
* Bernd Ludwig (University of Regensburg, Germany)
* Vsevolod Peysakhovich (Institut supérieur de l'aéronautique et de l'espace (ISAE-SUPAERO), France)
* Ken Pfeuffer (Bundeswehr University Munich, Germany)
* Martin Raubal (ETH Zurich, Switzerland)
* Anthony Robinson (Pennsylvania State University, USA)
* David Rudi (ETH Zurich, Switzerland)
* Artemis Skarlatidou (University College London, UK)
* Yanxia Zhang (FX Palo Alto Laboratory, USA)

*ET4S Organizers*

Peter Kiefer, ETH Zurich, Switzerland
Arzu Çöltekin, University of Applied Sciences and Arts Northwestern Switzerland, Switzerland
Rul von Stülpnagel, University of Freiburg, Germany
Andrew T. Duchowski, Clemson University, SC, USA
Ioannis Giannopoulos, TU Vienna, Austria

*Contact*

firstname.lastname@example.org
The Seventh International Workshop on Eye Movements in Programming (EMIP 2020) will be held on Tuesday, 02 June 2020 in Stuttgart, Germany. It is co-located with the 12th ACM Symposium on Eye Tracking Research and Applications (ETRA 2020, http://etra.acm.org/2020/).

The study of eye gaze data has great potential for research in computer programming, computing education, and software engineering practice. EMIP 2020 will again focus on advancing the methodological, theoretical, and applied aspects of eye movements in programming. The goal of the workshop is to further develop the methodology of using eye gaze tracking for programming, both theoretically and in applications. What can gaze behavior tell us about cognitive processes during programming? This question enables us to understand the role of human factors involved in programming.

Website: http://emipws.org/emip-2020-call-for-papers/

Topics of Interest

We invite contributions analyzing gaze behavior in activities related to programming, such as code reading and debugging, social aspects, vision, and educational perspectives. These may include, but are not limited to, the role of emotions in programming, vision-based models, readability, and new theories of program comprehension. Contributions are expected to present implications for industrial programming practice or programming education.
Specific topics of interest include, but are not limited to:

- Practical methods of using eye tracking
- Identification and analysis of appropriate data abstractions
- Models of cognition about software development
- Effects of text-based, graphical, or diagram-based program representations
- Effects of syntax or language features, as well as programming paradigms
- Identification and analysis of behaviors and strategies of learners' reading, writing, and debugging code, acquiring new domains and skills, and longitudinal growth
- Challenges for learners or software engineers (e.g., obstacles to learning or accomplishing tasks)
- Applications for eye tracking, e.g., software engineering tasks such as program comprehension, debugging, requirements traceability, and change tracking
- Development and evaluation of tools and processes for working with eye tracking
- Development and evaluation of visualizations for static and dynamic program execution
- Applications offering programming assistance or accessibility using eye tracking devices, data, and analyses
- Combinations of eye tracking with other sensing modalities, such as fMRI, EEG, or fNIRS
- Multi-person eye tracking, e.g., during pair programming or collaborative problem solving
- Eye gaze datasets and source code amenable to eye gaze studies
- Analyses of pre-existing eye gaze datasets
- Development of platforms, tools, and methods that enable reproducible experiments

Submissions and Presentations

One half of the workshop will be devoted to presenting new research results. The other will focus on facilitating discussion, teaching practical skills, and growing the community. Furthermore, we will have a hands-on demo session in which participants can use eye trackers, explore promising analytical pipelines, and see potential outcomes of eye tracking studies.

We invite short paper contributions (up to 4 pages, not counting references). Submissions must be written in English.
We use ETRA's PCS system for submission handling. To submit a paper, please visit https://new.precisionconference.com/user/login?society=etra and select "Society: ETRA, Conference: ETRA2020, Track: ETRA 2020 Workshop - Eye Movements in Programming EMIP". Submissions should be prepared following the sigconf instructions (https://www.siggraph.org/learn/instructions-authors/). You can find detailed information for Word users on ETRA's submission process page (http://etra.acm.org/2020/submissionprocess.html).

Each submission will be reviewed by at least two members of the program committee. All accepted papers will be published in ETRA's short paper proceedings in the ACM Digital Library. If a submission is accepted, at least one author of the paper is required to attend the workshop and present the paper in person.

Important Dates

- Deadline for papers: February 20th, 2020
- Notification to authors: March 20th, 2020
- Camera-ready deadline: April 1st, 2020
- Workshop: Tuesday, June 2nd, 2020

Workshop Organizers

- Bonita Sharif, University of Nebraska - Lincoln
- Norman Peitek, Leibniz Institute for Neurobiology
- Marjaana Puurtinen, University of Turku
ET-MM: Workshop on Eye Tracking for Quality of Experience in Multimedia
June 2, 2020, University of Stuttgart, Germany
Co-located with ETRA 2020
URL: http://www.mmsp.uni-konstanz.de/et-mm/

Multimedia applications often strive to deliver high perceptual quality to their users. Fundamentally, the perceptual quality of multimedia is the fidelity with which the multimedia supports users in their task objectives. This can vary immensely across individuals and tasks. Nonetheless, there are physiological constraints on human vision, and eye tracking provides unprecedented insight into how users seek out and process visual information. This is particularly relevant to, but not limited to, visual media such as images and videos. Thus, eye tracking provides a tool to design and evaluate multimedia in a way that accommodates user perception and, hence, improves the user experience. In fact, eye trackers are increasingly integrated into mobile computing systems, from laptops to smartphones to smart TVs. We therefore expect eye tracking to deepen our understanding of how humans perceive and interact with existing multimedia systems, which could lead to innovative multimedia applications that adapt to their users' gaze.

This workshop explores how eye tracking can help to quantify and analyze perceptual aspects of visual multimedia, such as saliency and quality. ET-MM focuses on data-driven algorithms and technology, including gaze-based interfaces for the control of multimedia applications. We seek contributions on, but not limited to, the following topics:

* Eye tracking for visual quality and aesthetics prediction.
* Eye tracking for image and video saliency.
* Multimedia datasets which include eye tracking information.
* Impact of quality degradations on gaze paths.
* Eye tracking based perceptual image/video coding.
* Eye movement in multimedia learning.
* Gaze-based interfaces and application control.
* Eye tracking to assess reliability in user studies.
* Multi-user interaction (but not for games)

Important Dates

* March 06, 2020 - Paper submission deadline
* March 19, 2020 - Notification of acceptance
* April 02, 2020 - Camera-ready deadline

Support

* SFB-TRR 161 Quantitative Methods for Visual Computing

Organizers

Dietmar Saupe (University of Konstanz, DE)
Hantao Liu (Cardiff University, Wales, UK)
Lewis Chuang (LMU Munich, DE)
ESSEM brings together internationally renowned researchers to teach students of all levels the scientific foundations of eye movement research and the design and analysis of eye movement studies in basic (psychology, neuroscience), clinical (e.g., psychiatry, neurology), and industrial (e.g., economics, human-computer interfaces) settings.
ESSEM 2020 is organised by Christoph Klein (Freiburg/Cologne) and Ulrich Ettinger (Bonn) and will be held at the Department of Child and Adolescent Psychiatry, University of Freiburg, Germany, from 7th to 12th September 2020.
The course fee of €370 includes participation in ESSEM as well as catering (lunch, tea/coffee) during the summer school.
Applications including your CV, a letter of motivation, and a 300-word abstract of the research you will present during the ESSEM poster session should be sent in one document to email@example.com. The deadline for applications is 28th February 2020, 23:59 CET.
For further information, please see the attached flyer, visit www.essem.info, or email us at firstname.lastname@example.org.
Please feel free to forward this call to students or colleagues who may be interested in participating!
Christoph Klein & Ulrich Ettinger
- Website: http://cogain2019.cogain.org
- Where : Denver, Colorado
- When : June 25-28th, 2019 as part of ETRA 2019 — http://etra.acm.org/2019/
- Abstracts due: Jan 25th, 2019 (extended)
- Papers due: Jan 25th, 2019 (extended)
- Feedback: Feb 18th, 2019
- Rebuttals: Feb 25th, 2019
- Decisions: Mar 4th, 2019
- Camera-ready due: Mar 29th, 2019
CALL FOR PAPERS
The Symposium on Communication by Gaze Interaction organized by the COGAIN Association (http://cogain.org) will be co-located with ETRA 2019, the ACM Symposium on Eye Tracking Research & Applications. ETRA 2019 will take place in Denver, Colorado, June 25-28.
Following the successful concept used at the COGAIN Symposium 2018, COGAIN 2019 will also be organized as a "special session" at ETRA. By combining our efforts with ETRA, we hope to encourage a broader exchange of knowledge and experiences among the communities of researchers, developers, manufacturers, and users of eye trackers.
We invite authors to prepare and submit short papers following ETRA's ACM format (http://etra.acm.org/2019/authors.html). Long papers are up to 8 pages (+ 2 additional pages for references). Short papers are up to 4 pages (+ 2 additional pages for references). During the submission process to ETRA 2019, you will be asked whether you would like your paper to be presented at the COGAIN Symposium. All accepted papers for the COGAIN Symposium will be published as part of the ETRA 2019 ACM Proceedings.
The COGAIN Symposium focuses on all aspects of gaze interaction, with special emphasis on eye-controlled assistive technology. The symposium will present advances in these areas, leading to new capabilities in gaze interaction, gaze-enhanced applications, gaze-contingent devices, etc. Topics of interest include all aspects of gaze interaction and communication by gaze including, but not limited to:
- Eye-controlled assistive technology
- Gaze-contingent devices
- Gaze-enhanced games
- Gaze-controlled robots and vehicles
- Gaze interaction with mobile devices
- Gaze-controlled smart-home devices
- Gaze interfaces for wearable computing
- Gaze interaction in 3D (VR/AR/MR & real world)
- Gaze interaction paradigms
- Usability and UX evaluation of gaze-based interfaces
- User context estimation from eye movements
- Gaze-supported multimodal interaction (gaze with multitouch, mouse, gesture, etc.)
The Program Committee will select the COGAIN 2019 best paper.
- John Paulin Hansen [Technical University of Denmark, Denmark]
- Päivi Majaranta [Tampere University, Finland]
- Diako Mardanbegi [Lancaster University, United Kingdom]
- Ken Pfeuffer [Bundeswehr University Munich, Germany]
==============================================================
Please visit http://cogain2019.cogain.org for more information
or contact us by email at email@example.com
==============================================================
ETWEB – Eye Tracking for The Web, as a conference track at ETRA 2019, the ACM Symposium on Eye Tracking Research & Applications, June 25-28, 2019 in Denver, Colorado, USA http://etra.acm.org/2019/etweb.html.
The Web offers rich information and services. Mostly, users access these on Web sites through interaction with graphical interfaces defined by Web page documents. The design of, and the interaction with, Web pages thus have a considerable impact on our daily lives. Therefore, both the usability and the accessibility of Web pages are relevant areas of research to make the Web more useful.
Eye gaze is a strong indicator of attention: it provides insight into how a user perceives an interface and helps analysts assess the user experience. Researchers and companies are interested in assessing the attention on certain portions of a Web page, e.g., which sections are read, glanced at, or skipped by users, and the usability of the Web page in general. The analysis requires an accurate association between the coordinates of the recorded gaze data and a representation of the Web page as a stimulus. The content of a Web page may be a dynamic stimulus that is difficult to synchronize between multiple users because of its interactive nature. Hence, the focus of ETWEB is to encourage research on accurate stimulus representations of the dynamic Web, on the mapping of gaze data, and on visualization methods to analyze the usability of Web pages and to understand the Web browsing behavior of end users.
Furthermore, eye tracking research can also benefit Web users with different abilities. For instance, Web accessibility means that people with disabilities can perceive, understand, navigate, and interact with the Web, and that they can contribute to the Web. However, a restricted capability to use conventional input (mouse/keyboard/touch, etc.) limits their ability to interact with the Web and thus excludes them from digital information spaces. Applications of eye tracking can break this interaction barrier and improve the quality of life of those with limited ability to communicate. However, most graphical user interfaces for Web access are not designed for use with eye tracking devices, which often have limited accuracy or may require unconventional selection techniques that interfere with access to information. In that regard, we encourage submissions that explore adaptation mechanisms of Web interfaces for gaze interaction (i.e., using gaze signals obtained from eye tracking devices to control the Web application environment).
ETWEB will cover topics that are related to Web (interface semantics extraction, interaction adaptation, etc.) and eye tracking (attention visualization, crowdsourcing, etc.). We particularly welcome submissions that address the following topics with an emphasis on the relationship between eye tracking and the Web:
- Novel methods for eye tracking in challenging Web scenarios
- Website usability analysis techniques using eye movement data
- Enable usability optimization with eye tracking on dynamic Web content
- Enhance the user experience in the Web by enabling an easy and complete understanding of user behavior
- Understanding eye-tracking data for re-engineering Web pages
- Eye tracking scanpath analysis techniques on Web pages
- Analyzing Web search and browsing behaviors through gaze patterns
- Social media browsing behavior analysis
- Correlating mouse clicks and gaze data with Web browsing behavior
- Gaze-based Web usability studies via crowdsourcing approaches
- Standardized metrics for evaluating interactions and usability
- Corpus of eye tracking ground truth data on Web pages
- Eye tracking interaction techniques to assist people with disabilities
- Multimodal interaction with Web (gaze, mouse, voice, touch, EEG etc.)
- Interactive annotation of gaze and Web stimulus data
- Techniques to integrate eye gaze as an input element in Web development
- Reports of eye tracking studies evaluating Web accessibility and usability
Authors are invited to submit their work complying with the ETRA short and long paper format. Long papers (8 pages) are intended for more mature research with evaluation results. Short papers (4 pages) are suitable for work in progress and position papers identifying the challenges of gaze-based Web interaction and analysis. Papers should be submitted electronically in PDF format to ETWEB through the ETRA submission system. Please select the ETWEB track. Accepted papers are considered regular ETRA publications and will be part of the ETRA proceedings (ACM Digital Library). A footnote on the first page will indicate that your paper was part of ETWEB. The submission process follows that of ETRA (http://etra.acm.org/2019/authors.html), but with a separate program committee. At least one author of each accepted ETWEB paper must register for the ETRA conference. Participants will be free to attend all ETRA tracks.
ETWEB Important Dates
- February 22, 2019 – Full papers and short papers due
- March 22, 2019 – Author notifications
- March 29, 2019 – Camera-ready papers due
- Chandan Kumar, Institute WeST, University of Koblenz, Germany
- Raphael Menges, Institute WeST, University of Koblenz, Germany
- Sukru Eraslan, METU Northern Cyprus Campus

Program Committee
- Alexandra Papoutsaki, Pomona College, USA
- Jacek Gwizdka, University of Texas, USA
- Scott MacKenzie, York University, Canada
- Simon Harper, University of Manchester, UK
- Caroline Jay, University of Manchester, UK
- Victoria Yaneva, University of Wolverhampton, UK
- Yeliz Yeşilada, METU Northern Cyprus Campus
- Marco Porta, University of Pavia, Italy
- Spiros Nikolopoulos, CERTH ITI, Greece
- Korok Sengupta, University of Koblenz, Germany
- Steffen Staab, University of Koblenz, Germany
23-27 March 2019
CALL FOR PAPERS
Workshop paper submission: January 25, 2019
Notification of acceptance: February 1, 2019
All deadlines are due at 9:00 pm Pacific Time (PDT/PST)
This year, we will be holding a series of workshops with the common topics of perception, graphics, and augmentation. As such, the PerGraVAR and VisAug workshops will be held back-to-back, with several shared sessions and/or keynotes.
We solicit original research papers in the area of perception-driven graphics and perceptual displays.
The goal of the PerGraVAR workshop is to create a better understanding of the various techniques and systems that exploit limitations of, or address the potentials of, the human visual system to create a more intense or comprehensible visual experience in VR and AR. As display technology progresses, pixel densities and dynamic range increase. At the same time, refresh rates are getting higher and display latencies are continually reduced. Currently, novel technologies such as displays with a growing number of displayed views per pixel (ranging from stereo and multi-view to holographic or light-field displays) are advancing beyond the prototype stage. Likewise, multi-layered displays and adaptable lenses are being integrated into head-mounted devices. Prototypes for retinal displays and bionic contact lenses have begun to emerge. All these advances in display technologies will tighten the requirements on image synthesis techniques. This workshop aims to bring together a group of experts to discuss perceptual findings, identify challenges, and present the latest research in the field of perception-driven rendering and computational displays.
The VisAug workshop is designed to cover both eye tracking and vision augmentation technologies as they pertain to Augmented and Virtual Reality. Eye tracking is closely tied to a number of AR applications such as diagnosis, interaction with content, and enhancement of human vision. The workshop will include a review of many of these technologies, along with a series of presentations on state-of-the-art research, ranging from visualization to optical modification to vision correction in the fields of MR/AR/VR. The workshop will also include a panel/discussion session in which participants can engage with leading experts in the field.
For these workshops, we expect researchers to submit early work, such as initial analyses of user studies, perceptual findings, experimental rendering techniques, or sketches for novel devices. Position papers of several pages that summarize a range of previous approaches (literature reviews), perceptual findings, or experiences also fall within the scope of the workshops. Papers should be between 2 and 6 pages in length and may cover one or more of the following topics:
- Novel display devices for VR and AR
- Rendering and information visualization methods that exploit perceptual issues
- Rendering and information visualization methods that target specific perceptual potentials
- Gaze-contingent rendering and interaction techniques
- Studies that provide insights into perception and cognition processes
- Saliency and attention models and findings
- Perception-driven image metrics
- Perceptual issues of image synthesis techniques
- Just noticeable differences, signal thresholds, and biases
- Validation methodologies, benchmarks and measurement methods, including eye tracking
- Novel measurement and processing techniques such as autorefractors or wavefront sensing
- Experimental designs and techniques for conducting user studies
- Systems, findings and general issues related to:
- Vergence-accommodation conflict
- Stereo disparity manipulation
- Wide fields of view
- Vision assistance, correction, training, and enhancement
- Head-mounted display technologies
- Oculography, eye tracking, and pupillometry
- Extended vision / superhuman vision
- Optics hardware and software
All submitted papers will go through a two-stage review process to guarantee the publication of high-quality papers. All accepted IEEE VR workshop papers will be published electronically through the IEEE Digital Library.
Papers are to be submitted online through the EasyChair system.
Best wishes,
Martin Weier, Kaan Aksit, Jason Orlosky, Yuta Itoh, Praneeth Chakravarthula, and Chang Liu
The qualifications and skills obtained during master's programs often do not adequately prepare students to conduct eye-tracking studies, to avoid potential pitfalls when using eye-tracking equipment, or to analyze complex eye-tracking datasets. Especially at the beginning of a PhD project, these challenges can appear overwhelming. PhD students completing the course will gain an overview of research in the field of bottom-up and top-down attentional processes and search in decision-making. We will give an overview of the latest developments in the field, including learning and contextual biases in decision sequences and the evaluation of decision theories. From a practical perspective, PhD students will gain insight into the process of setting up eye-tracking experiments, conduct a first empirical study on their own, and analyze an eye-tracking dataset. PhD students will have the opportunity to use remote eye-tracking devices together with their own laptops and to use the provided software to analyze their datasets. Based on this experience, students will be able to critically reflect on their experimental work and improve the planning of their own future experiments. Moreover, PhD students will learn about ways of analyzing eye-tracking data, for example using multi-level regression models.
*More information can be found at: http://tinyurl.com/phdcourse-eyetracking*