Call for Papers: Eye Movement Data Processing and Analysis

Pawel Kasprowski sent the following call for papers, which I happily redistribute:

Dear Colleagues and Friends,
 
I would like to encourage you to submit papers concerning 
eye movement analysis to our 
 
*Eye Movement Data Processing and Analysis* 
http://www.emdpa.pl
 
session organized during the 
 
9th International KES Conference on Intelligent Decision Technologies
Vilamoura, Portugal, 21-23 June 2017
 
Please consider contributing, and forward this to colleagues who might
be interested in submitting to the above-mentioned event.
 
The aim of the event is to summarize the current state of the art 
in eye movement data analysis and to enable prospective researchers 
to present their new ideas on this subject.
 
The scope of the EMDPA includes, but is not limited to:
 
- Collecting eye movement data
- Data precision and accuracy
- Calibration of the eye movement signal
- Event detection (fixations and saccades)
- Gaze-based user interfaces
- Eye movement modelling
- Eye movement signal visualization
- Data mining of the eye movement signal
- Improving human-machine interaction for people with disabilities
- Eye movement applications in testing interface usability
- Eye movements in security systems
- Use of the eye movement signal to study cognitive processes
 
The conference proceedings will be published by Springer as book chapters 
in a volume of the KES Smart Innovation Systems and Technologies series, 
submitted for indexing in Scopus, Thomson-Reuters Conference Proceedings 
Citation Index (CPCI) and the Web of Science.
 
Important Dates:
Submission of papers:       15 January 2017
Notification of acceptance: 15 February 2017
Final versions:             10 March 2017
Conference:                 21-23 June 2017
 
All other details: http://www.emdpa.pl
--
 Kind Regards,
 Pawel Kasprowski, PhD
 Institute of Informatics (www.inf.polsl.pl)
 Silesian University of Technology
 ul.Akademicka 16
 44-100 Gliwice, Poland
 tel. +48-32-237-13-39
 mob. +48-601-411-611

Call for Papers – ECCV Workshop on Assistive Computer Vision and Robotics (ACVR) 2016

********************************************************
CFP – Apologies for multiple copies
********************************************************
ECCV International Workshop on Assistive Computer Vision and Robotics (ACVR) 2016
Amsterdam, The Netherlands – one day among 8, 9, and 16 October 2016
In conjunction with ECCV 2016

Web: http://iplab.dmi.unict.it/acvr2016/
Contact: ACVR.workshop@gmail.com
CFP: http://iplab.dmi.unict.it/acvr2016/ECCV2016-ACVR_CALL_for_PAPERS.pdf
********************************************************

_________________
IMPORTANT DATES
_________________

Full Paper Submission: June 20th 2016
Notification of Acceptance: July 20th 2016
Camera-Ready Paper Due: July 25th 2016

________________
CALL FOR PAPERS
________________

In recent decades there has been a tremendous increase in demand for assistive technologies that help individuals overcome functional limitations and improve their quality of life. Novel tools have been successfully commercialised, bringing Computer Vision and Robotics research from theory to applications and then to market (e.g., OrCam, Jibo).

Assistive technologies provide a set of advanced tools that can improve the quality of life not only for disabled people, patients, and the elderly, but also for healthy people struggling with everyday tasks (e.g., stress monitoring). After a period of slow but steady progress, this scientific area seems ready for new research and application breakthroughs.

The rapid progress in the development of integrated micro-mechatronic and computer vision tools has accelerated this process. Interest in this applied field is growing thanks to the possibility of exploiting advanced technologies developed for traditional Computer Vision problems (such as face analysis, tracking, detection and recognition, and human behaviour analysis). However, many problems remain open, especially regarding environment perception and the interaction of these technological tools with people.

The main scope of ACVR 2016 is to bring together researchers from the diverse fields of engineering, computer science, and social and biomedical science who work in Computer Vision and Robotics, to discuss the current and next generation of Assistive Technologies.

Researchers will present their latest progress and discuss novel ideas in the field. Beyond the technologies themselves, emphasis will be given to precise problem definitions, available benchmark databases, and the need for evaluation protocols and procedures in the context of Assistive Technologies.

Papers are solicited in, but not limited to, the following TOPICS:

* Augmented and Alternative Communication
* Human-Robot Interaction
* Mobility Aids
* Rehabilitation Aids
* Home Healthcare
* Technology for Cognition
* Automatic Emotional Hearing and Understanding
* Activity Monitoring Systems
* Manipulation Aids
* Scene Understanding
* Life-logging
* Visual Attention and Visual Saliency
* Smart Environments
* Safety and Security
* Ambient Assistive Living
* Quality of Life Technologies
* Navigation Systems
* Sensory Substitution
* Mobile and Wearable Systems
* Applications for the Visually Impaired
* Sign language recognition and applications for hearing impaired
* Applications for the Ageing Society
* Datasets and Evaluation Procedures
* Personalised Monitoring
* Video summarization
* Egocentric and First-Person Vision
* Applications to improve health and wellbeing of children and elderly
* Food Understanding

_________________
INVITED SPEAKERS
http://iplab.dmi.unict.it/acvr2016/?page=invited-speakers
_________________

Barbara Caputo, University of Rome La Sapienza, IT
Home Page: http://www.idiap.ch/~bcaputo/
Talk Title: Life long learning in computer and robot vision

Devi Parikh, Virginia Tech, US
Home Page: https://filebox.ece.vt.edu/~parikh/
Talk Title: Visual Question Answering

____________
SUBMISSION AND REVISION
____________

All submissions will be handled electronically via the conference’s CMT Website (link will be provided soon).

The format for paper submission is the same as the ECCV main conference. The paper length should match that intended for final publication. Papers are limited to 14 pages, including figures and tables. One additional page containing only cited references is allowed.

See template at the following link:
http://iplab.dmi.unict.it/acvr2016/index.php?page=submission

ACVR reviewing will be double-blind. Each submission will be reviewed by at least three reviewers for originality, significance, clarity, soundness, relevance, and technical content.

Papers that are not anonymized, do not use the template, or exceed 14 pages (excluding references) will be rejected without review.

Submission Deadline: June 20th 2016

_____________________
JOURNAL SPECIAL ISSUE
_____________________

The authors of accepted papers may be invited to submit an extended version (at least 40% different from the workshop version) of their papers to a Journal Special Issue (with an open call and peer review). More information will be available soon.

_________________
WORKSHOP CHAIRS
_________________

Giovanni Maria Farinella, University of Catania, IT

Marco Leo, CNR-Institute of Applied Sciences and Intelligent Systems, IT

Gerard G. Medioni, University of Southern California, US

Mohan Trivedi, University of California San Diego, US

_________________
ENDORSERS
_________________

GIRPR – Gruppo Italiano Ricercatori in Pattern Recognition
http://girpr.tk/

ORCAM
http://www.orcam.com/

STMicroelectronics
http://www.st.com/

Toshiba Research Europe Ltd
http://www.toshiba.eu/eu/Toshiba-Research-Europe/

_________
CONTACTS
_________

Email: ACVR.workshop@gmail.com
Website: http://iplab.dmi.unict.it/acvr2016

Call for papers for the workshop “Inferring user action with mobile gaze tracking”

Call for papers for the workshop “Inferring user action with mobile gaze tracking”

Scope of the workshop

Gaze tracking in psychological, cognitive, and user interaction studies has recently evolved toward mobile solutions, as they make it possible to directly assess users’ visual attention in natural environments. In addition to attention, gaze can provide information about users’ actions and intentions: what the user is doing and what they will do next. Gaze behavior in natural, unstructured task environments is quite complex and cannot be satisfactorily explained by the “simple” behavioral models induced through typical stimulus-response testing in controlled laboratory environments. To advance the inference of gaze-action behavior in natural environments, a more holistic approach is needed, bringing together a number of disciplines from the cognitive sciences to machine learning.

The goal of the workshop is to bring together a cross-domain group of individuals to (i) discuss and contribute to the problem of using mobile gaze tracking for inferring user action, (ii) advance the sharing of data and analysis algorithms as well as device solutions, and (iii) increase understanding of behavioral aspects of gaze-action sequences. The workshop proposes an interdisciplinary gathering for recognizing potential synergies, mapping solved and unsolved problems, and creating a research roadmap for the future applying the strengths of each contributing field.

Topic of interest

We seek papers related, but not limited, to the following topics:

  • Cognitive aspects of mobile gaze tracking
  • Computational methods to infer user action from gaze data
  • Technical solutions for mobile gaze tracking
  • Applications and potential use of gaze data

Important dates

  • Submission Deadline: May 27th 2016, 5pm PST
  • Notifications Sent: June 10th 2016
  • Camera-ready submissions: June 29th 2016
  • Workshop day: September 6th 2016

Other information

The workshop will be held in conjunction with the MobileHCI 2016 conference (http://mobilehci.acm.org/), to be held in Florence (Tuscany), Italy, September 6th – 9th, 2016. Instructions for submitting papers can be found at http://www.ttl.fi/gaze2016. Papers will be peer-reviewed by at least two independent reviewers. At least one author of each accepted paper needs to register for the workshop and for the conference itself. Accepted workshop papers will be included in the MobileHCI 2016 Adjunct Proceedings.

Contact: miika.toivanen@ttl.fi; kristian.lukander@ttl.fi

Abstract deadline for Scandinavian Workshop on Applied Eye Tracking approaching

Raymond Bertram distributed the following message over the eyemovement mailing list:

The abstract submission deadline for SWAET 2016 (Scandinavian
Workshop on Applied Eye Tracking), to be held in Turku on 19-21 June 2016,
is approaching. Abstracts should be submitted by Friday, 26 February 2016.
Abstracts can be submitted via the abstract submission page,
see http://swaet2016.utu.fi/abstract.html

SWAET is an interdisciplinary meeting place for graduate students, researchers,
industry, and other people using eye tracking as a measurement tool.
The program includes 2 keynote lectures, several sessions with 3-4 talks
of 20 minutes each, and 1 or 2 poster sessions. Several eye tracker
companies will present their latest products. The keynote speakers
are Professor Simon Liversedge from the University of Southampton, UK
(Reading Comprehension), and Dr. Halszka Jarodzka from the University
of Heerlen, The Netherlands (Learning & Instruction).

The welcome reception will be held on Sunday evening the 19th of June
and the conference dinner on Monday the 20th of June.

The participation fee will be in the range of 60 to 100 € for regular
participants and free of charge (0€) for graduate students.

For more information, see http://swaet2016.utu.fi/
Contact information: SWAET2016@utu.fi

Important dates:
Abstract submission open: January 18, 2016
Abstract deadline: February 26, 2016
Notification of acceptance: March 18, 2016
Registration deadline: May 1, 2016
Conference: June 19-21, 2016

On behalf of the organizing committee, welcome to Turku!

Raymond Bertram

Organizing committee:
Raymond Bertram (Chair); Fred Andersson; Johanna Kaakinen; Tuomo Häikiö; Henri Olkoniemi; Seppo Vainio; Jukka Hyönä; Marjaana Puurtinen; Aki Kyröläinen; Suvi Holm; Kenneth Holmqvist

Call for Papers: Eye Tracking South Africa 2016

Tanya Beelders just posted the new call for the ETSA 2016:

Eye Tracking South Africa, an international conference aimed specifically at eye tracking research, will be held in Stellenbosch (near Cape Town), South Africa. The conference will take place from 5-7 October 2016.

Call for papers

The conference will include poster sessions, presentations of short papers (3-4 pages) and full papers (8-10 pages). It will be organised around several tracks and sessions to accommodate delegates with various interests. Besides the academic presentations, we also welcome industry to present their products and services in non-academic workshops and demonstrations. In this context “industry” does not refer to manufacturers of eye trackers only but also to users thereof, for example market researchers, usability analysts, graphic designers, educators, radiographers, occupational and speech therapists, people with physical disabilities, cognitive psychologists, neurologists, ophthalmologists, etc. We therefore invite the submission of case studies and proposals in these areas.

ETSA 2016 has been approved for in-cooperation status with SIGCHI. The proceedings will be published in the ACM Digital Library and will have an ISBN.

Authors are invited to submit original research papers related to eye movement and the application thereof. Conference tracks will include but are not limited to the following topics:

· Usability
· Visualisation
· Gaze interaction
· Reading research
· Eye control for people with disabilities
· Visual attention
· Systems, tools and methods
· Eye movements
· Technical aspects of eye tracking, e.g. pupil detection, calibration, mapping, event detection, data quality

Full and short papers

Full papers should be 8-10 A4 pages while short papers should be 3-4 A4 pages in length. All papers should follow the ACM Proceedings format (templates will shortly be available on the ETSA website). Submissions should be in pdf format with all fonts embedded in the document. All tables and figures should be included in the paper.

Poster presentations

Poster presentations will be evaluated on a 300-400 word abstract. Upon acceptance, long papers will be allowed 30 minutes for presentation and discussion while short papers will be limited to 20 minutes. Posters should be in A0 format for presentation and A4 format for inclusion in the proceedings.

SUBMISSION INSTRUCTIONS AND REVIEW PROCESS

The submitted paper must be anonymous, i.e. it should not include any information that can identify the authors. Authors’ names should be removed from the submission, with “Author” and year used in the bibliography and footnotes instead of the authors’ names, paper titles, etc. In addition, all identifying information should be removed from the file properties.

Submission instructions will be available shortly.

Each paper will be anonymously reviewed by at least three reviewers. The program committee will consider the following criteria when evaluating submitted papers: originality of contribution, relevance to the conference, technical/scientific merit, and presentation and clarity.

SUBMISSIONS: SPECIAL INTEREST / CASE STUDY SESSIONS FOR PRACTITIONERS AND INDUSTRY

ETSA 2016 invites practitioners from any related discipline (Occupational Therapy, Psychology, Speech Therapy, Ophthalmology etc.) to present case study based sessions on their use of eye tracking with patients/clients. These need not represent formal research, and should be submitted in the format of a short paper under the Special Interest category. Submissions should describe the case study and the findings which will be presented.

Additionally, industry is invited to submit in this category for consideration of non-academic workshops or demonstrations. These submissions should be in the form of a proposal for the envisaged workshop, detailing the contents of the workshop or demonstration and how it will be presented.

Deadline for full papers: 3 July 2016.

Deadline for short papers and posters (abstracts): 3 July 2016

Deadline for case studies: 3 July 2016

Please visit http://www.eyetrackingsa.com/ for more information.

ETSA Organising Committee

Tanya Beelders

Pieter Blignaut

University of the Free State

Department of Computer Science and Informatics




Research Assistant Position: Gaze-Tracking Project at QMUL

Isabelle Mareschal posted a job offer on CVNET which might be of interest to you:

Research Assistant position for gaze-tracking project at QMUL

We are recruiting a Research Assistant for the four-month project 
'Evaluation of commodity gaze-trackers for large-scale 
neuropsychological studies'. The objective is to characterize the 
performance of new gaze-tracking systems (e.g. EyeTribe), for 
clinical and psychophysical applications. This will involve 
user experiments, as well as computational modelling.

This is a joint project, supervised by Miles Hansard (EECS) 
and Isabelle Mareschal (Experimental Psychology), and funded 
by a Wellcome Trust grant to the QMUL Life Sciences Institute. 
The successful candidate will join a research group working on 
3D vision, eye-movements, and scene-modelling. There will be 
opportunities to work with a range of new devices, including 
head-mounted displays (Oculus Rift).

Candidates should have previous experience of Matlab and/or 
JavaScript programming. Knowledge of signal processing and 
data analysis is also essential. Previous experience of 
eye-tracking, web-based experiments, or computer vision would be useful.

Candidates should have a PhD in computer science, engineering, 
experimental psychology, or a related field. Candidates in the 
final stages of PhD submission (or having an MSc plus relevant 
experience) will also be considered.

The post is full-time and available immediately for 4 months, 
to start no later than 01/04/2016. The salary will be in the 
range £32,052-£35,672 per annum. Applicants *must already* have 
permission to work in the UK, for the duration of the post.

Applications should consist of a CV (including publications, 
and the email addresses of two referees), as well as a one-page 
research statement (including details of programming experience). 
These PDFs should be emailed, with the subject 'Gaze RA Application', 
to miles.hansard@qmul.ac.uk before the deadline of 14/02/2016.

Enquiries should be addressed to miles.hansard@qmul.ac.uk
For further information, please see:
http://www.eecs.qmul.ac.uk/~milesh/
http://isabelle-mareschal.squarespace.com/
http://www.eecs.qmul.ac.uk/

Call for Participation: SAGA Workshop 2015 – Early Bird ends August 30th

Call for Participation at SAGA Workshop 2015 – Early Bird ends August 30th

2nd International Workshop on Vision and Eye Tracking in Natural Environments and the Technical Implementation of Suitable Analysis Methodologies (SAGA 2015)

Where: CITEC Research Building, Bielefeld University, Germany
When: September, 29-30, 2015
WWW: http://saga.eyemovementresearch.com

As a follow-up to the successful SAGA 2013 workshop,
we are providing a forum for researchers to discuss techniques and
applications that go beyond classical eye tracking and stationary
eye-based interaction.


Keynote speakers:

  • Jacob Lund Orquin (Denmark): “What eye tracking researchers (dis)agree about reporting”
  • Maria Staudte (Germany): “Studying gaze in spoken interaction”
  • Mark Williams (United Kingdom): “Visual search behaviour and expertise in high-performance environments”

The full program with all the talks can be found online:


Registration:


 

SAGA 2015 Workshop Organising Committee:

Workshop Organisers:
Thies Pfeiffer, Kai Essig, Pia Knoeferle, Helge Ritter, and Thomas Schack. All from Bielefeld University, Germany
Scientific Board: Thomas Schack, Helge Ritter and Pia Knoeferle

Please visit the website periodically for updates (http://saga.eyemovementresearch.com/about-saga/). For additional questions, please contact: saga@eyemovementresearch.com

We look forward to welcoming you to Bielefeld in September, 2015!

Thies Pfeiffer, Kai Essig & Pia Knoeferle

PhD-course: Eye-tracking in social science research projects

The qualifications and skills obtained during master’s programs often leave students poorly prepared to conduct eye-tracking studies, to avoid potential pitfalls when using eye-tracking equipment, and to analyze complex eye-tracking datasets. Especially at the beginning of a PhD project, these challenges can appear overwhelming.

PhD students completing the course will gain an overview of research on bottom-up and top-down attentional processes and search in decision-making. We will give an overview of the latest developments in the field, including learning and contextual biases in decision sequences and the evaluation of decision theories.

From a practical perspective, PhD students will gain insight into the process of setting up eye-tracking experiments, conduct a first empirical study on their own, and analyze an eye-tracking dataset. PhD students will have the opportunity to use remote eye-tracking devices together with their own laptops and to use the provided software to analyze their datasets. Based on this experience, students will be able to critically reflect on their experimental work and improve the planning of their own future experiments. Moreover, PhD students will learn about ways of analyzing eye-tracking data, for example using multi-level regression models.
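As an illustration of that last point (this is not part of the course materials), a multi-level regression on eye-tracking data typically treats fixations as nested within participants, so each participant gets their own random intercept. A minimal sketch in Python with statsmodels, using entirely synthetic data and hypothetical variable names, might look like this:

```python
# Illustrative sketch only: random-intercept model for fixation durations,
# with fixations nested within participants. Data are synthetic; variable
# names ('condition', 'fixation_duration') are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for subj in range(20):
    subj_offset = rng.normal(0, 30)      # participant-specific baseline shift
    for trial in range(50):
        condition = trial % 2            # 0 = easy, 1 = hard (true effect: +40 ms)
        duration = 220 + 40 * condition + subj_offset + rng.normal(0, 25)
        rows.append({"subject": subj, "condition": condition,
                     "fixation_duration": duration})
df = pd.DataFrame(rows)

# Mixed model: fixed effect of condition, random intercept per subject.
model = smf.mixedlm("fixation_duration ~ condition", df, groups=df["subject"])
result = model.fit()
print(result.params["condition"])  # estimated condition effect, close to 40
```

The point of the multi-level structure is that repeated fixations from the same participant are not independent; pooling them naively would understate the uncertainty of the condition effect.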

*More information can be found at: http://tinyurl.com/phdcourse-eyetracking*

First Call for Papers: 5th International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction (PETMEI 2015)

Where: September 7, 2015 in Osaka, Japan

in conjunction with the
2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 2015)

Eye tracking technology is becoming increasingly available for mobile and pervasive settings. The availability of eye tracking beyond the desktop calls for new interaction concepts, novel applications, and an understanding of the broader implications of pervasive eye tracking for humans. PETMEI 2015 focuses on pervasive eye tracking as a trailblazer for mobile eye-based interaction. The goal of the workshop is to bring together members of the ubiquitous computing, context-aware computing, computer vision, machine learning, and eye tracking communities to exchange ideas and to discuss different techniques and applications for pervasive eye tracking.

PETMEI 2015 will be a one-day workshop featuring presentations, interactive demos, and group discussions. We solicit papers describing original research related to, or visionary of, pervasive eye tracking research addressing computational methods, new applications and use cases, as well as technology for pervasive eye tracking and mobile eye-based interaction.

Topics of interest include, but are not limited to:

Methods
– Tools for face, eye, and pupil detection as well as tracking
– Devices for wearable and ambient eye tracking
– Eye tracking technologies on mobile devices
– Head-mounted or remote gaze estimation
– Gaze and eye movement analysis methods
– Fusion of gaze with other modalities
– Integration of pervasive eye tracking and context-aware computing
– User studies on pervasive eye tracking

Applications
– Pervasive eye-based interaction
– Mobile attentive user interfaces
– Eye-based activity and context recognition
– Security and privacy for pervasive eye-tracking systems
– Eye tracking for specialized application areas
– Cognition-aware systems and user interfaces
– Human factors in mobile eye-based interaction
– Eye tracking for pervasive displays
– Gaze-based interaction with outdoor spaces

Submission Guidelines

We accept submissions with a length of between 6 and 10 pages in the SIGCHI Extended Abstract format. Refer to the workshop website for Word and LaTeX templates (http://2015.petmei.org/submissions/). In addition to research papers, we explicitly invite submissions of position papers and papers that describe work in progress. Submissions will be peer-reviewed by at least two members of the technical program committee with respect to novelty, significance, technical quality, and their potential to spark interesting discussions. Please note that all submissions must be anonymized for double-blind review.

Accepted papers will be published in the UbiComp 2015 supplemental proceedings and in the ACM Digital Library. At least one author for each accepted paper is required to attend the workshop and present the paper.

Submit your paper via EasyChair: https://easychair.org/conferences/?conf=petmei2015
Important Dates
– June 5, 2015 Paper submission
– July 3, 2015 Notification of acceptance
– July 10, 2015 Camera-ready due
– September 7, 2015 Workshop

Organizers
– Peter Kiefer, ETH Zürich, Switzerland
– Yanxia Zhang, Lancaster University, U.K.
– Andreas Bulling, Max Planck Institute for Informatics, Germany

Contact petmei2015@gmail.com

PhD Position: Gaze-Based Interaction with Urban Environments

At ETH Zürich, the Institute of Cartography and Geoinformation is looking for a highly motivated PhD candidate for a research project at the Chair of Geoinformation Engineering, starting at the earliest possible date (and no later than 1 August 2015).

The main objective of the 3-year project Location-Aware Mobile Eye Tracking for Tourist Assistance, funded through an ETH Zurich Research Grant, is the investigation of novel gaze-based interaction methods for pedestrians in urban environments, with a focus on a tourist scenario. The project envisions mobile assistance systems that trigger information services based on a user’s gaze on 3D-objects in a real-world urban environment.

The ideal candidate will have an academic degree in Computer Science, Information Science, Geomatics, or a related field, as well as a strong research interest in human-computer interaction. Knowledge of methods in machine learning, computer vision, and inferential statistics, as well as strong programming skills (e.g., Java or C++), are required. Background or experience in eye tracking, virtual environments, geographic information science, or related topics is a significant plus. The candidate must have good communication skills in English (oral and written), be team-oriented, and be willing to work in an international environment.

If you want to apply, please visit the website: https://pub.refline.ch/845721/3790/++publications++/1/index.html