Call for Papers: European Conference on Eye Movements 2015

Florian Goller posted this call for papers:

European Conference on Eye Movements 2015

Call for Abstracts

We are very pleased to announce this call for abstracts for the 18th European Conference on Eye Movements (ECEM). As in previous years, we look forward to an exciting conference. As the largest conference on eye movements worldwide, ECEM aims to promote cooperation and communication between researchers and research fields, as well as to exchange information on the state of the art of research, equipment, and software in the field of eye movements. ECEM therefore brings together basic as well as more applied researchers from different fields, including neurophysiologists, psychologists, neuropsychologists, clinicians, linguists, computational and applied scientists, engineers, and manufacturers who are interested in eye movements.

ECEM 2015 will be held from August 16 to 21, 2015 in Vienna, Austria. The conference chairs are Ulrich Ansorge, Thomas Ditye, Arnd Florack, and Helmut Leder from the Faculty of Psychology of the University of Vienna.

This year’s guest lectures will be given by Laurent Itti (University of Southern California, Los Angeles), Jukka Hyönä (University of Turku, Finland), Peter König (University of Osnabrück, Germany), Tirin Moore (Stanford University, CA), John K. Tsotsos (York University, Canada) and Robin Walker (Royal Holloway, UK).

ECEM 2015 offers the possibility to submit *symposia*, as well as individual *talks* and *posters*. Please read all instructions to ensure that your contribution can be considered for ECEM 2015.

The call for abstracts can also be downloaded here: http://goo.gl/njN1Nq

For further information and for the submission of your abstracts please visit the conference website: http://ecem2015.univie.ac.at

Please feel free to distribute and forward this call to your colleagues and/or members of your institution/mailing-list.

We look forward to welcoming you to Vienna!

Contact

Please send any queries, open questions or organisational concerns to: ecem2015@univie.ac.at

Call for Papers: 7th International KES Conference on INTELLIGENT DECISION TECHNOLOGIES


Call for Papers

7th International KES Conference on INTELLIGENT DECISION TECHNOLOGIES

Special Session
*Intelligent Methods for Eye Movement Data Processing and Analysis*
http://www.kasprowski.pl/emdpa

Sorrento, Bay of Naples, Italy 17-19 June 2015


The aim of the session is to summarize the current state of the art in eye movement data analysis and to enable prospective researchers to present their new ideas on this subject.

The scope of the session includes, but is not limited to:

– Collecting eye movement data
– Data precision and accuracy
– Calibration of the eye movement signal
– Event detection (fixations and saccades)
– Gaze-based user interfaces
– Eye movement modelling
– Data mining of eye movement signals
– Eye movement-based identification
– Improving human-machine interaction for people with disabilities
– Eye movement applications in testing interface usability
– Eye movements in security systems
– Eye movements in problem solving
– Eye movement signals in the study of cognitive processes
– Methods for improving the quality of eye movement signals
– Recognition of people’s intentions based on their eye movements

The conference proceedings will be published by Springer as book chapters in a volume of the KES Smart Innovation Systems and Technologies series, submitted for indexing in Scopus and Thomson-Reuters Conference Proceedings Citation Index (CPCI) and the Web of Science.

Important Dates:
Submission of papers: 12 January 2015
Notification of acceptance: 16 February 2015
Final versions: 2 March 2015
Conference: 17-19 June 2015

Email & Contact Details:
Paweł Kasprowski (pawel.kasprowski@polsl.pl)
Katarzyna Harężlak (katarzyna.harezlak@polsl.pl)
Institute of Informatics, Silesian University of Technology, Poland

Please consider contributing, and/or forward this call to colleagues who might be interested in submitting to the above-mentioned event.

2nd Call for Papers: 4th International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction (PETMEI 2014)

Call for Papers
===============

4th International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction (PETMEI 2014)
– in conjunction with UbiComp 2014

You are cordially invited to submit original work to the PETMEI 2014 Workshop. The workshop will be held in Seattle on September 13th, 2014.

Location: Seattle, United States
Date: September 13th, 2014

IMPORTANT DATES
– Abstract Submission: June 3, 2014
– Paper Submission: June 10, 2014
– Notification of Acceptance: June 24, 2014
– Camera-ready due: July 1, 2014
– Workshop: September 13, 2014

VISION AND GOALS
Despite considerable advances over the last decades, previous work on eye tracking and eye-based human-computer interfaces has mainly explored use of the eyes in traditional desktop settings. The latest developments in remote and head-mounted eye tracking equipment and automated eye movement analysis point the way toward unobtrusive eye-based human-computer interfaces that will become pervasively usable in everyday life. With growing interest in smart glasses and low-cost eye trackers, gaze-based techniques for mobile computing have become increasingly important in recent years. We call this new paradigm pervasive eye tracking – continuous eye monitoring and analysis 24/7.

The potential applications for the ability to track and analyse eye movements anywhere and at any time call for new research to further develop and understand visual behaviour and eye-based interaction in daily-life settings. PETMEI 2014 will focus on pervasive eye tracking as a trailblazer for mobile eye-based interaction and eye-based context-awareness. We provide a forum for researchers from human-computer interaction, context-aware computing, egocentric computer vision and eye tracking to discuss techniques and applications that go beyond classical eye tracking and stationary eye-based interaction. We want to stimulate and explore the creativity of these communities with respect to the implications, key research challenges, and new applications for pervasive eye tracking in ubiquitous computing. The long-term goal is to create a strong interdisciplinary research community linking these fields together and to establish the workshop as the premier forum for research on pervasive eye tracking.

TOPICS
Topics of interest cover computational methods, new applications and use cases, as well as eye tracking technology for pervasive eye tracking and mobile eye-based interaction. Topics of interest include, but are not limited to:

Methods
We invite participants to reflect on the specific characteristics of pervasive eye tracking systems and to contrast them with classical methods for eye tracking, eye movement analysis, eye-based interaction, and evaluation. We welcome contributions reporting on methodological advances in all components of mobile eye tracking systems, and the workshop will also cover the latest technological advances in mobile eye tracking equipment.
– Eye tracking technologies for mobile devices
– Tools for face, eye detection and tracking
– Gaze and eye movement analysis methods
– Integration of pervasive eye tracking and context-aware computing
– Multi-modal sensor fusion
– User studies on pervasive eye tracking
– Devices for portable, wearable and ambient eye tracking

Applications
In addition to contributions reporting on methodological advances, we also want to attract submissions that explore innovative applications of pervasive eye tracking and mobile eye-based interaction. We particularly invite presentations on egocentric vision systems and gaze-related computer vision applications that can potentially extend the possibilities of current mobile gaze interaction.
– Pervasive eye-based interaction
– Mobile attentive user interfaces
– Eye-based activity and context recognition
– Security and privacy for pervasive eye-tracking systems
– Eye tracking for specialized application areas
– Eye-based human-robot and human-agent interaction
– Cognition-aware systems and user interfaces
– Human factors in mobile eye-based interaction
– Egocentric computer-vision systems and applications

SUBMISSION GUIDELINES
Prospective authors should submit papers with a length of 6-12 pages in the SIGCHI non-archival (Extended Abstracts) format. In addition to research papers we explicitly invite submissions of position papers and papers that describe preliminary results or work-in-progress. Manuscripts will be reviewed by at least two reviewers. Accepted papers will be published in the UbiComp 2014 adjunct proceedings.

Templates
The format for submissions has changed; please use the SIGCHI Extended Abstracts template:
– Latex http://www.sigchi.org/publications/chipubform/sigchi-extended-abstracts-latex-template/view
– Word http://www.sigchi.org/publications/chipubform/sigchi-extended-abstracts-word-template/view

Submission Website
Please visit our website http://2014.petmei.org/submissions/ for regular updates.

ORGANIZERS
– Thies Pfeiffer, Center of Excellence Cognitive Interaction Technology, Bielefeld University, Germany
– Sophie Stellmach, Microsoft Corporation, USA
– Yusuke Sugano, The University of Tokyo, JP

PROGRAM COMMITTEE
PETMEI 2014 is supported by the following program committee members:
– Andreas Bulling, Max Planck Institute for Informatics, DE
– Andrew T. Duchowski, Clemson University, USA
– Alireza Fathi, Stanford University, USA
– Dan Witzner Hansen, IT University of Copenhagen, DK
– Kris M. Kitani, Carnegie Mellon University, USA
– Päivi Majaranta, University of Tampere, FI
– Lucas Paletta, Joanneum, AT
– Pernilla Qvarfordt, FX Palo Alto Laboratory, US
– Lech Swirski, University of Cambridge, UK
– Takumi Toyama, DFKI, DE

CONTACT AND FURTHER INFORMATION
For further information, please visit our website or send us an email:
– Official PETMEI 2014 E-Mail: petmei2014@gmail.com
– PETMEI 2014 Website: http://2014.petmei.org/

Best regards
Thies, Sophie and Yusuke

Call for Application: European Summer School on Eye Movements (ESSEM) 2014

This call came in over the EM_LIST:

This is the first call for the European Summer School on Eye Movements (ESSEM), funded by the Boehringer-Ingelheim-Stiftung and organised by Christoph Klein (Freiburg/Bangor) and Ulrich Ettinger (Bonn). ESSEM 2014 will be held at the Department of Child and Adolescent Psychiatry of the University of Freiburg, Germany, from 8th to 13th September 2014.

ESSEM brings together internationally renowned researchers to teach students the theories, neural bases, experimental designs, and statistical analysis of eye movement studies in basic science as well as applied settings. Please see below for a preliminary list of speakers.

We are pleased to now invite applications from students and researchers at the PhD/MD or early post-doctoral level. Financial support is available for participants’ travel and accommodation expenses. The course fee is €250 and includes catering (lunch, coffee) throughout the summer school.

Participants are expected to present a poster of a study they have carried out (not necessarily involving oculographic methods) on the first day of ESSEM.

To apply, please send a cover letter with brief personal statement on why you wish to attend ESSEM, your current CV and the abstract for your poster presentation to essem@uni-bonn.de. The deadline for applications is 31st May 2014.

For further information please visit http://www.psychologie.uni-bonn.de/units/cognitive-psychology?set_language=en or contact essem@uni-bonn.de

With best wishes,

Christoph Klein & Ulrich Ettinger

Preliminary List of Speakers, ESSEM 2014:

Giuseppe Boccignone (Italy)
Ulrich Ettinger (Germany)
Tom Foulsham (UK)
Mark Greenlee (Germany)
Sam Hutton (UK)
Jukka Hyönä (Finland)
Alan Kingstone (Canada)
Christoph Klein (Germany/UK)
Bruno Laeng (Norway)
Rebekka Lencer (Germany)
Susana Martinez-Conde (USA)
René Müri (Switzerland)
Nadine Petrovsky (Germany)
Pierre Pouget (France)
Nikolaos Smyrnis (Greece)
Werner Sommer (Germany)

Call for Papers: 4th International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction (PETMEI 2014)

Call for Papers
===============

4th International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction (PETMEI 2014)
– in conjunction with UbiComp 2014

You are cordially invited to submit original work to the PETMEI 2014 Workshop. The workshop will be held in Seattle on September 13th, 2014.

Location: Seattle, United States
Date: September 13th, 2014

IMPORTANT DATES
– Abstract Submission: June 3, 2014
– Paper Submission: June 10, 2014
– Notification of Acceptance: June 24, 2014
– Camera-ready due: July 1, 2014
– Workshop: September 13, 2014

VISION AND GOALS
Despite considerable advances over the last decades, previous work on eye tracking and eye-based human-computer interfaces has mainly explored use of the eyes in traditional desktop settings. The latest developments in remote and head-mounted eye tracking equipment and automated eye movement analysis point the way toward unobtrusive eye-based human-computer interfaces that will become pervasively usable in everyday life. With growing interest in smart glasses and low-cost eye trackers, gaze-based techniques for mobile computing have become increasingly important in recent years. We call this new paradigm pervasive eye tracking – continuous eye monitoring and analysis 24/7.

The potential applications for the ability to track and analyse eye movements anywhere and at any time call for new research to further develop and understand visual behaviour and eye-based interaction in daily-life settings. PETMEI 2014 will focus on pervasive eye tracking as a trailblazer for mobile eye-based interaction and eye-based context-awareness. We provide a forum for researchers from human-computer interaction, context-aware computing, egocentric computer vision and eye tracking to discuss techniques and applications that go beyond classical eye tracking and stationary eye-based interaction. We want to stimulate and explore the creativity of these communities with respect to the implications, key research challenges, and new applications for pervasive eye tracking in ubiquitous computing. The long-term goal is to create a strong interdisciplinary research community linking these fields together and to establish the workshop as the premier forum for research on pervasive eye tracking.

TOPICS
Topics of interest cover computational methods, new applications and use cases, as well as eye tracking technology for pervasive eye tracking and mobile eye-based interaction. Topics of interest include, but are not limited to:

Methods
We invite participants to reflect on the specific characteristics of pervasive eye tracking systems and to contrast them with classical methods for eye tracking, eye movement analysis, eye-based interaction, and evaluation. We welcome contributions reporting on methodological advances in all components of mobile eye tracking systems, and the workshop will also cover the latest technological advances in mobile eye tracking equipment.
– Eye tracking technologies for mobile devices
– Tools for face, eye detection and tracking
– Gaze and eye movement analysis methods
– Integration of pervasive eye tracking and context-aware computing
– Multi-modal sensor fusion
– User studies on pervasive eye tracking
– Devices for portable, wearable and ambient eye tracking

Applications
In addition to contributions reporting on methodological advances, we also want to attract submissions that explore innovative applications of pervasive eye tracking and mobile eye-based interaction. We particularly invite presentations on egocentric vision systems and gaze-related computer vision applications that can potentially extend the possibilities of current mobile gaze interaction.
– Pervasive eye-based interaction
– Mobile attentive user interfaces
– Eye-based activity and context recognition
– Security and privacy for pervasive eye-tracking systems
– Eye tracking for specialized application areas
– Eye-based human-robot and human-agent interaction
– Cognition-aware systems and user interfaces
– Human factors in mobile eye-based interaction
– Egocentric computer-vision systems and applications

SUBMISSION GUIDELINES
Prospective authors should submit notes with a maximum length of four pages or full papers with a maximum length of six pages. In addition to research papers we explicitly invite submissions of position papers and papers that describe preliminary results or work-in-progress. All submissions should be prepared according to the SIGCHI archival format (double column, PDF). Manuscripts will be reviewed by at least two reviewers. Accepted papers will be published in the UbiComp 2014 supplemental proceedings. In addition, printed proceedings will be distributed to the participants during the workshop. We also plan to publish extended versions of selected papers in an edited book or a special issue of a journal or magazine.

Templates
– Latex http://www.sigchi.org/publications/chipubform/sigchi-papers-latex-template/at_download/file
– Word http://www.sigchi.org/publications/chipubform/sigchi-papers-word-template/at_download/file

Submission Website
Please visit our website http://2014.petmei.org/ for regular updates.

ORGANIZERS
– Thies Pfeiffer, Center of Excellence Cognitive Interaction Technology, Bielefeld University, Germany
– Sophie Stellmach, Microsoft Corporation, USA
– Yusuke Sugano, The University of Tokyo, JP

PROGRAM COMMITTEE
PETMEI 2014 is supported by the following program committee members:
– Andreas Bulling, Max Planck Institute for Informatics, DE
– Andrew T. Duchowski, Clemson University, USA
– Alireza Fathi, Stanford University, USA
– Dan Witzner Hansen, IT University of Copenhagen, DK
– Kris M. Kitani, Carnegie Mellon University, USA
– Päivi Majaranta, University of Tampere, FI
– Lucas Paletta, Joanneum, AT
– Pernilla Qvarfordt, FX Palo Alto Laboratory, US
– Lech Swirski, University of Cambridge, UK
– Takumi Toyama, DFKI, DE

More committee members to be announced soon.

CONTACT AND FURTHER INFORMATION
For further information, please visit our website or send us an email:
– Official PETMEI 2014 E-Mail: petmei2014@gmail.com
– PETMEI 2014 Website: http://2014.petmei.org/

Best regards
Thies, Sophie and Yusuke

Call for Participation at SAGA Workshop 2013

Call for Participation at SAGA Workshop 2013

1st International Workshop on Solutions for Automatic Gaze Data Analysis and Eyetracking Studies in Natural Environments

Where: Bielefeld University, Germany
When: October 24th – 25th, 2013

Why you would want to participate:

If you are an experimental researcher:
– Does manual annotation impede your research?
– Do you want to analyse mobile eye tracking data?
– Do you want to move from desktop-based to more natural interaction scenarios?

If you are a computer scientist:
– Are you interested in gaze-based interaction?
– Are you an expert in tracking objects or reconstructing scenes?
– Are you seeking an interesting field of application?

Then you should not miss the SAGA Workshop, held on the 24th and 25th of October 2013 at the new CITEC facilities at Bielefeld University, Germany.

The aim of the workshop is to build a bridge between basic academic research and applied research, particularly in the fields of visual image analysis and scene representation (object recognition and tracking), as well as the online analysis and interpretation of attention in the context of mobile studies on natural scene perception.

We are providing a forum for researchers from human-computer interaction, context-aware computing, robotics, computer vision and image processing, psychology, sport science, eye tracking and industry to discuss techniques and applications that go beyond classical eye tracking and stationary eye-based interaction. We want to stimulate and explore the creativity of these communities with respect to the implications, key research challenges, new techniques and application areas. The long-term goal is to create a strong interdisciplinary research community linking these fields together and to establish the workshop as the premier forum for research on automatic annotation of gaze videos and use of eye tracking in natural environment studies.

Talks:

Several researchers will present their work on solutions for the (semi-)automatic annotation of gaze videos and on eye movement studies in natural environments as a trailblazer for gaze analysis in natural environments, mobile eye-based interaction and eye-based context-awareness.

Keynote speakers:

Several outstanding keynote speakers from academia and industry, spanning eye movement research in natural environments, usability services, and assistive technologies in professional training processes for people with disabilities, have already confirmed their participation in the SAGA 2013 workshop:

– Marc Pomplun, UMASS Boston, United States of America
– Ben Tatler, University of Dundee, Scotland
– Michael Schiessl, Managing Director of EyeSquare (User & Brand Research) in Berlin, Germany
– Ellen Schack, v. Bodelschwinghian Foundations of Bethel in Bielefeld, Germany

Live Demo Sessions:

As a particular highlight, several technical solutions (commercial or research-in-progress) for automatic gaze analysis will be demonstrated between sessions:
– VideoGazer – A Modular Approach Towards Automatic Annotation of Gaze Videos
– Location-based Online Identification of Objects in the Centre of Visual Attention using Eye Tracking
– Various object recognition and tracking solutions from the Robotics Group of the Center of Excellence “Cognitive Interaction Technology”, Bielefeld University, Germany
– BeGaze from SensoMotoric Instruments (SMI)
– Mobile Eye Tracking paired with a mobile EEG solution
– … and more to come

During the live session, researchers will provide an in-depth demonstration of their solutions and will be pleased to answer any further questions you may have.

For more information and registration, please visit the workshop website at http://saga.eyemovementresearch.com/

We look forward to meeting you at SAGA!

Kai Essig & Thies Pfeiffer

2nd Call for Papers: 1st International Workshop on Solutions for Automatic Gaze Data Analysis (CITEC/Bielefeld University)

SAGA 2013: 1st INTERNATIONAL WORKSHOP ON SOLUTIONS FOR AUTOMATIC GAZE DATA ANALYSIS –
uniting academics and industry.

24-26 October 2013 Bielefeld University, Germany
Workshop Website: http://saga.eyemovementresearch.com/

The SAGA 2013 workshop is accepting abstracts for two tracks: Challenge Contributions, and Oral Presentations or Posters.
We are currently pursuing options for publication as a special issue of a journal or as an edited volume.


Important Dates:

1. Oral presentation / poster call:

August 23, 2013: Deadline for abstract submissions.
September 6, 2013: Notification of acceptance for talks and posters.

2. Challenge:

August 23, 2013: Deadline for 2-page preliminary abstract sketching your approach.
September 6, 2013: Notification of acceptance for the challenge.
October 2, 2013: Submission of final abstracts and final results.

October 24-26, 2013: Workshop takes place at Bielefeld University, Germany.


Invited keynote speakers from academia and industry:

Marc Pomplun, UMASS Boston, United States of America
Ben Tatler, University of Dundee, Scotland
Michael Schiessl, Managing Director of EyeSquare (User & Brand Research) in Berlin, Germany
Andreas Enslin, Head of Miele Design Centre in Gütersloh, Germany
Ellen Schack, v. Bodelschwinghian Foundations of Bethel in Bielefeld, Germany


We are very pleased to publish this second call for challenge contributions and abstracts for SAGA 2013, the 1st International Workshop on Solutions for Automatic Gaze Data Analysis. SAGA 2013 will focus on solutions for the automatic annotation of gaze videos and on eye movement analysis in natural environments as a trailblazer for mobile eye-based interaction and eye-based context-awareness.

We are providing a forum for researchers from human-computer interaction, context-aware computing, robotics, computer vision and image processing, psychology, sport science, eye tracking and industry to discuss techniques and applications that go beyond classical eye tracking and stationary eye-based interaction.

We want to stimulate and explore the creativity of these communities with respect to the implications, key research challenges, new techniques and application areas. The long-term goal is to create a strong interdisciplinary research community linking these fields together and to establish the workshop as the premier forum for research on automatic annotation of gaze videos.


Submissions:

Abstracts will be peer-reviewed by at least two members of an international program committee. Word and LaTeX templates for submissions are now available on the workshop website, and registration is open.

########################################
# 1. Oral presentation / poster call   #
########################################

We are calling for 500-word abstracts on topics related to real-world eye tracking and eye movement analysis. Possible topics include, but are not limited to, eye tracking in human-machine interaction, visual search, language processing, eye-hand coordination, marketing, automated tasks, and decision making.

Please note: Authors of all accepted contributions must register for the workshop.

################
# 2. Challenge #
################

In order to drive research on software solutions for the automatic annotation of gaze videos, we offer a special challenge on this topic. The purpose of the challenge is to encourage the community to work on a set of specific software solutions and research questions and to continuously improve on earlier results for these problems over the years.

We are providing a set of test videos (duration 2-3 minutes) on the workshop website, together with separate text files containing the corresponding gaze data. The task is to write semi- or fully-automatic software that recognises and tracks objects over the whole video sequence. The software should provide the coordinates of the tracked objects and use this information to automatically calculate object-specific gaze data, such as the number of fixations and cumulative fixation durations. There are no restrictions on how the relevant objects are marked or on the techniques used to track them.

The only constraint is that your software solution must be able to read and process the provided videos and report object-specific gaze data for the selected objects, either as a text file (which can serve as input for a statistical program such as SPSS, Matlab, R, or MS Excel) or as some kind of visualization.
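To make the expected output concrete, here is a minimal Python sketch of such a pipeline. The gaze-log format (timestamp in ms, x, y), the static bounding boxes, and the dispersion-threshold fixation heuristic with its thresholds are all illustrative assumptions, not part of the challenge specification:

```python
# Illustrative sketch only: formats, thresholds, and static bounding
# boxes are assumptions, not the SAGA challenge specification.

def detect_fixations(samples, max_dispersion=30.0, min_duration=100.0):
    """Group gaze samples (t_ms, x, y) into fixations using a simple
    dispersion-threshold (I-DT-style) heuristic; returns tuples
    (start_ms, end_ms, centroid_x, centroid_y)."""
    fixations, window = [], []

    def close(win):
        # Keep the window as a fixation only if it lasted long enough.
        if win and win[-1][0] - win[0][0] >= min_duration:
            n = len(win)
            fixations.append((win[0][0], win[-1][0],
                              sum(p[1] for p in win) / n,
                              sum(p[2] for p in win) / n))

    for s in samples:
        window.append(s)
        xs = [p[1] for p in window]
        ys = [p[2] for p in window]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
            close(window[:-1])  # the new sample broke the fixation
            window = [s]
    close(window)
    return fixations

def object_stats(fixations, objects):
    """objects maps a name to a bounding box (x0, y0, x1, y1); returns
    name -> (fixation_count, cumulative_fixation_duration_ms)."""
    stats = {name: [0, 0.0] for name in objects}
    for start, end, cx, cy in fixations:
        for name, (x0, y0, x1, y1) in objects.items():
            if x0 <= cx <= x1 and y0 <= cy <= y1:
                stats[name][0] += 1
                stats[name][1] += end - start
    return {name: tuple(v) for name, v in stats.items()}
```

The resulting dictionary could then be written out as a tab-separated text file for import into SPSS, R, or Excel; a real challenge entry would of course derive the object regions per frame from the video via recognition and tracking rather than using fixed boxes.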


Detailed information on participation and a description of all necessary steps can be found on the workshop website (see: http://saga.eyemovementresearch.com/challenge/howto-participate-in-the-challenge/). Additionally, you can now find an explanation of the manual annotation procedures which will be used to evaluate the submitted software solutions for the challenge (http://saga.eyemovementresearch.com/challenge/videomaterial/).

In order to access this page, you first must register for the challenge (see: http://saga.eyemovementresearch.com/challenge/register-for-the-saga-challenge/).


All submissions will be evaluated by an independent jury according to the evaluation criteria (see the workshop website). Additionally, a live session is scheduled for the third day, in which all selected solutions can be demonstrated to interested workshop participants. The three best solutions will receive an award.

Prize money:

1st Prize: €1,000
2nd Prize: €500
3rd Prize: €250

We would like to thank our premium sponsor SensoMotoric Instruments (SMI) for the contribution of prize money and test videos recorded with SMI’s mobile eye tracking glasses (www.eyetracking-glasses.com).

We would also like to thank our sponsor Tobii Technologies for supporting the live demo workshop session and for providing test videos recorded with the Tobii Glasses (http://www.tobii.com/en/eye-tracking-research/global/products/hardware/tobii-glasses-eye-tracker/).

Please Note: All challenge participants must register separately at http://saga.eyemovementresearch.com/challenge/register-for-the-saga-challenge/ for access to the challenge material and the video download.


SAGA 2013 Workshop Organising Committee:

Workshop Organisers: Kai Essig, Thies Pfeiffer, Pia Knoeferle, Helge Ritter, Thomas Schack, and Werner Schneider, all from Bielefeld University, Germany
Scientific Board: Thomas Schack, Helge Ritter and Werner Schneider
Jury of the Challenge: Kai Essig, Thies Pfeiffer, Pia Knoeferle and Denis Williams (SensoMotoric Instruments, SMI).

Please visit the website periodically for updates (http://saga.eyemovementresearch.com/about-saga/).
For additional questions, please contact: saga@eyemovementresearch.com

We look forward to receiving your submissions and to welcoming you to Bielefeld in October, 2013!

Call for Papers: The 6th Workshop on Eye Gaze in Intelligent Human Machine Interaction

Roman Bednarik posted the following call for papers on the Eye-Movement mailing list:

Call for Papers:

The 6th Workshop on Eye Gaze in Intelligent Human Machine Interaction
at ACM ICMI 2013, Sydney, Australia

December 13, 2013
Papers deadline: August 31, 2013
www: http://cs.uef.fi/gazein2013

Invited speaker: Julien Epps – University of New South Wales, Sydney, Australia

Eye gaze is one of the most important aspects in understanding and modeling human-human communication, and it also has great potential for improving human-machine and robot interaction. In human face-to-face communication, eye gaze plays an important role in floor and turn management, grounding, and engagement in conversation. In human-computer interaction research, social gaze (gaze directed at an interaction partner) has been a subject of increased attention.

This is the sixth workshop on Eye Gaze in Intelligent Human Machine Interaction. In the past we have discussed a wide range of issues relevant to eye gaze in multimodal interaction: technologies for sensing human attentional behaviors; the role of attentional behaviors such as social gaze in human-human and human-machine/robot interaction; attentional behaviors in problem solving and task performance; gaze-based intelligent user interfaces; and the evaluation of gaze-based UIs. In addition to these topics, this workshop will focus on eye gaze in multimodal communication, interpretation, and generation. Since eye gaze is one of the primary communication modalities, gaze information can be combined with other modalities to complement the meaning of utterances or to serve as a stand-alone communication signal.

GazeIn’13 aims to continue along these lines and to explore the growing area of gaze in intelligent interaction research by bringing together researchers from the domains of human sensing, multimodal processing, humanoid interfaces, intelligent user interfaces, and communication science. We will exchange ideas to develop and improve methodologies for this research area, with the long-term goal of establishing a strong interdisciplinary research community in “attention aware interactive systems”.

This workshop solicits papers that address the following topics (including but not limited to):

• Technologies and methods for sensing and interpretation of gaze and human attentional behaviors
• Eye gaze in multimodal generation and behavior production in conversational humanoids
• Empirical studies of attentional behaviors
• New directions for gaze in multimodal interaction
• Evaluation and design issues for using eye gaze in multimodal interfaces

Please see the online CfP for a full list of topics (http://cs.uef.fi/gazein2013/call-for-papers)

SUBMISSION INFORMATION
There are two categories of paper submissions.
Long paper: The maximum length is 6 pages.
Short paper: The maximum length is 3 pages.

At least three members of the program committee will review each submission. Accepted papers will be published in the workshop proceedings, and the best papers will be selected for inclusion in a special journal issue. Submitted papers should conform to the ACM publication format. For templates and examples, follow the link: http://www.acm.org/sigs/pubs/proceed/template.html

Please submit your papers using https://precisionconference.com/~icmi13j

IMPORTANT DATES
Paper submission due: August 31, 2013
Notification of acceptance: September 20, 2013
Camera-ready due: October 10, 2013
Workshop date: December 13, 2013

ORGANIZERS
Roman Bednarik – University of Eastern Finland, Finland
Hung-Hsuan Huang – Ritsumeikan University, Japan
Kristiina Jokinen – University of Helsinki, Finland
Yukiko Nakano – Seikei University, Japan


————————————————————————
Roman Bednarik http://cs.uef.fi/~rbednari
School of Computing, University of Eastern Finland
————————————————————————

Call for Papers: ETRA 2014

The call for the ETRA 2014 is out:

EYE TRACKING RESEARCH & APPLICATIONS SYMPOSIUM – ETRA 2014

http://www.etra2014.org

March 26th – 28th, 2014, Safety Harbor Resort & Spa, Safety Harbor,
FL, USA

1ST CALL FOR PAPERS

The eighth ACM Symposium on Eye Tracking Research & Applications (ETRA
2014) will be held in Safety Harbor, Florida, on March 26th-28th,
2014. The ETRA conference series focuses on eye movement research and
applications across a wide range of disciplines. The symposium
presents research that advances the state-of-the-art in these areas,
leading to new capabilities in gaze tracking systems, gaze aware
applications, gaze based interaction, eye movement data analysis, etc.
For ETRA 2014, we invite papers in all areas of eye tracking research
and applications.

IMPORTANT DATES

20 Sept 2013: Paper abstracts due
4 Oct 2013: Full & short papers due
8 Nov 2013: Paper acceptance
29 Nov 2013: Paper revisions due
20 Dec 2013: Final paper acceptance due
6 Jan 2014: Doctoral Symposium submission due; Video & Demo submission due
24 Jan 2014: Doctoral Symposium, video & demo acceptance due
31 Jan 2014: Camera ready papers due

RESEARCH AREAS OF INTEREST

*Eye Tracking Technology* Advances in eye tracking hardware, software
and algorithms such as: 2D and 3D eye tracking systems, calibration,
low cost eye tracking, natural light eye tracking, predictive models,
etc.

*Eye Tracking Data Analysis* Methods, procedures and analysis tools
for processing raw gaze data as well as fixations and gaze patterns.
Example topics are: scan path analysis, fixation detection algorithms,
and visualization techniques of gaze data.

*Visual Attention and Eye Movement Control* Applied and experimental
studies investigating visual attention and eye movements to gain
insight in eye movement control, cognition and attention, or for
design evaluation of visual stimuli. Examples are: usability and web
studies using eye tracking, and eye movement behavior in everyday
activities such as driving and reading.

*Eye Tracking Applications* Eye tracking as a human-computer input
method, either as a replacement to traditional input methods or as a
complement. Examples are: assistive technologies, gaze enhanced
interaction and interfaces, multimodal interaction, gaze in augmented
and mixed reality systems, gaze-contingent displays and gaze-based
biometric applications.

SUBMISSION CATEGORIES

*Research papers:* Authors are invited to submit original work in the
formats of Full paper (8 pages) and Short paper (4 pages). The papers
will undergo a rigorous review process assessing the originality and
quality of the work as well as the relevance for eye tracking research
and applications. Papers presented at ETRA 2014 will be available in
the ACM digital library. Submission formats and instructions are
available at the conference web site.

IMPORTANT NOTE: The submission process differs from past years in
that there is a single deadline for both long and short papers.
Depending on the outcome of the first review round, papers may be
invited to resubmit for an additional review. In some cases, authors
of full papers may be offered the option to resubmit their paper as a
short paper.

*Doctoral Symposium (NEW to ETRA 2014):* ETRA 2014 is proud to
introduce the ETRA Doctoral Symposium, where graduate students get an
opportunity to meet other students and experienced researchers to get
feedback on their research in a friendly environment. We invite
doctoral students, who have a defined topic in the area of eye
tracking research and applications, but whose work is still in a phase
where it can be influenced by the feedback received in the symposium.
Participants will be selected based on a 2-page extended abstract
describing the thesis work and its current status.

*Demo/video track (NEW to ETRA 2014):* Have a gaze interaction
technique to share? Or want to show off how your new eye tracking
method works? At ETRA 2014, we are adding a demo/video session where
researchers give demonstrations of their research or show videos of
their work. To take part in this session, we request a 2-page extended
abstract to be submitted. If authors have a full or short paper
accepted, no extended abstract is needed. If submitting for a video
presentation, the video is required.

CONFERENCE VENUE

ETRA 2014 will be held at the historic Safety Harbor Resort and Spa in
Safety Harbor, Florida, a resort hotel sitting on top of three natural
springs facing Tampa Bay, on Florida's beautiful Gulf Coast.

SPONSORSHIP

ETRA 2014 is co-sponsored by the ACM Special Interest Group in
Computer-Human Interaction (SIGCHI), and the ACM Special Interest
Group on Computer Graphics and Interactive Techniques (SIGGRAPH).

CONFERENCE CO-CHAIRS Dan Witzner Hansen, IT University Denmark
Pernilla Qvarfordt, FX Palo Alto Laboratory, Inc.

PROGRAM CO-CHAIRS Joe Goldberg – Oracle, Inc. Jeffrey B. Mulligan –
NASA, USA

DOCTORAL SYMPOSIUM CO-CHAIRS Päivi Majaranta, University of Tampere,
Finland Jeff B. Pelz, University of Rochester, USA

DEMO/VIDEO CHAIR Oleg Komogortsev, Texas State University, USA

Call for Contributions: Challenge on Automatic Object Identification (AOI) and Tracking

1. Call for Challenge on Automatic Object Identification (AOI) and Tracking

as part of the 

SAGA 2013:
1st INTERNATIONAL WORKSHOP ON SOLUTIONS FOR AUTOMATIC GAZE DATA ANALYSIS
 - uniting academics and industry.

24-26 October 2013 Bielefeld University, Germany
Cognitive Interaction Technology Center of Excellence 

Workshop Website:
http://saga.eyemovementresearch.com/challenge/

===========================================================================

Important Dates:

August 15th, 2013:   Deadline for 2-page abstract sketching your
                     approach.
September 2nd, 2013: Notification of acceptance for challenge.
October 2nd, 2013:   Submission of the final abstracts and final 
                     results.

October 24-26, 2013: Challenge results presentation takes place at the 
                     SAGA 2013 Workshop at Bielefeld University, 
                     Germany.

===========================================================================

We are very pleased to publish this call for challenge contributions
as part of the SAGA 2013 1st International Workshop on Solutions for
Automatic Gaze Data Analysis. The challenge will focus on software
solutions for automatic object recognition as a trailblazer for
vision-based object and person tracking algorithms. The automatic
object or person recognition and tracking in video sequences (in real-
time) is a key condition for many application fields, such as mobile
service robotics, Human-Robot Interaction (HRI), Computer Vision,
Digital Image Processing, autonomous assistance and surveillance
systems (e.g., driver assistance systems) and Eye Tracking.
Applications vary from tracking of objects (e.g., manipulating or
recognition of objects in dynamic scenes), body parts (e.g., head or
hand tracking for mimic and gesture classification), and persons
(e.g., person reidentification or visual following).

Although many efficient tracking methods have been introduced for
different tasks in recent years, they are mostly restricted to
particular environmental settings and therefore cannot be applied to
general application fields. This is due to a range of factors:
1.) Often, underlying assumptions about the environment cannot be met,
including a static background, no changes in lighting, and homogeneous
or invariant appearances. These idealized conditions are usually
absent in highly dynamic environments, such as those common in mobile
scenarios. 2.) Object models cannot be applied because of the high
variance in the appearance of tracked persons or objects. 3.) Most
algorithms are computationally quite expensive (large systems often
impose hard computational restrictions on the algorithms used).

===========================================================================

Details on the SAGA 2013 
CHALLENGE on Automatic Object Identification (AOI) and Tracking:

In order to drive research on software solutions for the automatic
annotation of videos we offer a special challenge on this topic.
The purpose of the challenge is to encourage the community to work on a
set of specific software solutions and research questions and to
continuously improve on earlier results obtained for these problems over
the years. This will hopefully not only push the field as a whole and
increase the impact of work published in it, but also contribute open
source hardware, methods and data analysis software back to the
community. 

For the challenge we address this topic on the basis of eye-tracking
data. Therefore, we are providing a set of test videos (duration 2-3
minutes) and separate text files with the corresponding gaze data on
the workshop website for which solutions should be written. These gaze
videos, recorded by a scene camera attached to an eye-tracking system,
show people when they look at objects or interact with them in mobile
applications. The gaze data contains a time-stamped list of x- and y-
positions of the gaze points (in the coordinate system of the scene
video). For selected videos, frame counter information will also be
available to assist with synchronization of the video and the gaze
data.
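The exact file layout is not specified in this call, but gaze data of the kind described above is commonly exported as plain text with one "timestamp x y" record per line. As an illustration only (this is an assumed format, not the official SAGA one), a minimal Python loader might look like:

```python
# Minimal sketch for loading gaze data as described in the call.
# Assumes a plain text file with one whitespace-separated
# "timestamp x y" record per line -- an assumption for illustration,
# not the official SAGA challenge format.

def load_gaze_data(path):
    """Return a list of (timestamp, x, y) tuples."""
    samples = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # skip blank lines and comment lines
            t, x, y = line.split()[:3]
            samples.append((float(t), float(x), float(y)))
    return samples
```

Real eye-tracker exports often carry extra columns (pupil size, event labels); taking only the first three fields keeps the sketch tolerant of that.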

For the challenge we are looking for semi- and fully-automatic
software solutions for the recognition and tracking of objects over
the whole video sequence. The software should provide the coordinates
for the tracked objects and use this information to automatically
calculate object specific gaze data, such as number of fixations and
cumulative fixation durations, by using the time-stamped list of 2D
gaze coordinates in the eye-tracking file. There are no restrictions
on the way in which the relevant objects are marked or on the kind of
techniques used to track the objects. The only constraint is that
your software solution can read and process the provided videos and
report gaze-specific data for the selected objects, either as a text
file (which can serve as input for a statistical program such as
SPSS, Matlab, R, or MS Excel) or by providing some kind of
visualization.
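The aggregation step requested above (fixation count and cumulative fixation duration per object) can be sketched as follows. This is a hypothetical illustration, not the challenge's required format: it assumes fixations have already been detected (e.g. by a dispersion- or velocity-based algorithm) and that each tracked object is represented, for simplicity, by a single axis-aligned bounding box.

```python
# Hypothetical sketch of the object-specific gaze statistics requested
# in the call: count fixations landing inside an object's bounding box
# and sum their durations. Fixation detection itself is assumed to
# have been done already; real solutions must track per-frame boxes.

def object_gaze_stats(fixations, boxes):
    """
    fixations: list of (t_start, t_end, x, y) detected fixations
    boxes:     dict mapping object name -> (x_min, y_min, x_max, y_max)
    Returns a dict: object name -> (fixation_count, total_duration).
    """
    stats = {name: [0, 0.0] for name in boxes}
    for t0, t1, x, y in fixations:
        for name, (x0, y0, x1, y1) in boxes.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                stats[name][0] += 1           # one more fixation on object
                stats[name][1] += t1 - t0     # add its duration
    return {name: tuple(v) for name, v in stats.items()}
```

The resulting per-object counts and durations could then be written out as a text file for import into SPSS, Matlab, R, or Excel, as the call suggests.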

All submissions will be evaluated by an independent jury according to
the evaluation criteria (see below). Additionally, there is a live
session scheduled for the third day in which all selected solutions
can be demonstrated to the interested workshop participants. The three
best solutions will receive an award.

Prize money:

1st Prize: 1.000,- €
2nd Prize:   500,- €
3rd Prize:   250,- €

We would like to thank our premium sponsor SensoMotoric Instruments
(SMI) for the contribution of the prize money.

The SAGA challenge features test videos recorded with different devices
from
- SensoMotoric Instruments (SMI) [SMI EyeTracking Glasses]
- Tobii Technologies [Tobii Glasses]
- Applied Science Laboratories (ASL)
  / Engineering Systems Technologies (EST) [ASL Mobile Eye-XG]

===========================================================================

Submissions:

In order to allow for more time for the implementation process for the
challenge a two-step submission procedure has been devised. The decision
for acceptance to the challenge will be on a preliminary submitted
abstract. The final evaluation and ranking of the software solutions
will be based on the final abstract and the final results for a
test set of videos, including videos similar to those on the website:

a) Preliminary submissions should consist of a 2 page abstract
describing the implementation details of your proposed software solution
including the following:

- description of the underlying techniques and implementations
- description of object selection and tracking processes

b) Final submissions shall extend the preliminary submission to a
3-page paper by adding the following details:

- number of fixations and cumulative fixation duration details for the
  specified objects
- performance data (such as computation time, number of selected
  objects, parallel tracking of several objects in the scene)
- snapshot of the results

We will use results based on manual annotation to evaluate the submitted
results. The following evaluation criteria will be applied:

- quality of the automated benchmark results (region and pixel based)
  compared to the results given by manual annotation
- conceptual innovation
- performance (such as computation time, number of selected objects,
  parallel tracking of several objects in the scene)
- robustness (such as tracking performance and the general scope of
  the application)
- usability

The test videos and a corresponding description of them can be found on
the workshop website. Additionally, you can find a detailed description
of how we perform the manual annotation. The exact description for the
challenge, including the evaluation criteria and the required format for
the results, will appear on the workshop website within the next 3
weeks. Please check the website regularly for updates.

Abstracts will be peer-reviewed by at least two members of an
international program committee. We will provide templates on the
workshop website. We are currently pursuing possible options for
publication of a special issue in a journal or as an edited volume.

Please Note: All challenge participants must register separately for
access to the challenge material and the video download.

===========================================================================

We would like to thank our commercial sponsors:

Premium Sponsors
- SensoMotoric Instruments (SMI) [challenge]
  / SMI Eye Tracking Glasses (www.eyetracking-glasses.com)

Sponsors
- Tobii Technologies [live demo workshop session]
  / Tobii Glasses (http://www.tobii.com/en/eye-tracking-
  research/global/products/hardware/tobii- glasses-eye-tracker/)

===========================================================================

Challenge Organising Committee:

Workshop Organisers:
- Kai Essig
- Thies Pfeiffer
- Pia Knoeferle
- Helge Ritter
- Thomas Schack
- Werner Schneider

All from the
Cognitive Interaction Technology Center of Excellence
at Bielefeld University

Scientific Board:
- Thomas Schack
- Helge Ritter
- Werner Schneider

Jury of the Challenge:
- Kai Essig
- Thies Pfeiffer
- Pia Knoeferle
- Denis Williams (Sensomotoric Instruments, SMI)

Please visit the website periodically for updates:
http://saga.eyemovementresearch.com/about-saga/

For additional questions, please contact: saga@eyemovementresearch.com

We look forward to receiving your submissions and to welcoming you to
Bielefeld in October, 2013!

On behalf of the organisers

Thies Pfeiffer