23-27 March 2019
CALL FOR PAPERS
Workshop paper submission: January 25, 2019
Notification of acceptance: February 1, 2019
All deadlines are at 9:00 pm Pacific Time (PDT/PST).
This year, we will be holding a series of workshops with the common topics of perception, graphics, and augmentation. As such, the PerGraVAR and VisAug workshops will be held back-to-back, with several shared sessions and/or keynotes.
We solicit original research papers in the area of perception-driven graphics and perceptual displays.
The goal of the PerGraVAR workshop is to foster a better understanding of the various techniques and systems that exploit limitations, or address the potential, of the human visual system to create a more intense or comprehensible visual experience in VR and AR. As display technology progresses, pixel densities and dynamic range increase. At the same time, refresh rates continue to rise, and display latencies are continually reduced. Novel technologies offering a growing number of displayed views per pixel (ranging from stereo and multi-view to holographic and lightfield displays) are currently advancing beyond the prototype stage. Likewise, multi-layered displays and adaptable lenses are being integrated into head-mounted devices, and prototypes for retinal displays and bionic contact lenses have begun to emerge. All of these advances in display technology will tighten the requirements on image synthesis techniques. This workshop aims to bring together a group of experts to discuss perceptual findings, identify challenges, and present the latest research in the field of perception-driven rendering and computational displays.
The VisAug workshop is designed to cover both eye tracking and vision augmentation technologies as they pertain to Augmented and Virtual Reality. Eye tracking is closely tied to a number of AR applications such as diagnosis, interaction with content, and enhancement of human vision. The workshop will include a review of many of these technologies, along with a series of presentations on state-of-the-art research, ranging from visualization to optical modification to vision correction in the fields of MR/AR/VR. The workshop will also include a panel/discussion session in which participants can engage with leading experts in the field.
For these workshops, we expect researchers to submit early work, such as initial analyses of user studies, perceptual findings, experimental rendering techniques, or sketches for novel devices. Position papers of several pages that summarize a range of previous approaches (literature reviews), perceptual findings, or experiences also fall within the scope of the workshop. Papers should be between 2 and 6 pages in length and may cover one or more of the following topics:
- Novel display devices for VR and AR
- Rendering and information visualization methods that exploit perceptual issues
- Rendering and information visualization methods that target specific perceptual potentials
- Gaze-contingent rendering and interaction techniques
- Studies that provide insights into perception and cognition processes
- Saliency and attention models and findings
- Perception-driven image metrics
- Perceptual issues of image synthesis techniques
- Just noticeable differences, signal thresholds, and biases
- Validation methodologies, benchmarks and measurement methods, including eye tracking
- Novel measurement and processing techniques such as autorefractors or wavefront sensing
- Experimental designs and techniques for conducting user studies
- Systems, findings and general issues related to:
- Vergence-accommodation conflict
- Stereo disparity manipulation
- Wide fields of view
- Vision assistance, correction, training, and enhancement
- Head-mounted display technologies
- Oculography, eye tracking, and pupillometry
- Extended vision / superhuman vision
- Optics hardware and software
All submitted papers will go through a two-stage review process to guarantee the publication of high-quality papers. All accepted IEEE VR workshop papers will be published electronically through the IEEE Digital Library.
Papers are to be submitted online through the EasyChair system.
Best wishes,
Martin Weier, Kaan Aksit, Jason Orlosky, Yuta Itoh, Praneeth Chakravarthula, and Chang Liu