Call for Papers: 4th International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction (PETMEI 2014)

Call for Papers
===============

4th International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction (PETMEI 2014)
– in conjunction with UbiComp 2014

You are cordially invited to submit original work to the PETMEI 2014 workshop. The workshop will be held in Seattle on September 13th, 2014.

Location: Seattle, United States
Date: September 13th, 2014

IMPORTANT DATES
– Abstract Submission: June 3, 2014
– Paper Submission: June 10, 2014
– Notification of Acceptance: June 24, 2014
– Camera-ready due: July 1, 2014
– Workshop: September 13, 2014

VISION AND GOALS
Despite considerable advances over the last decades, previous work on eye tracking and eye-based human-computer interfaces has mainly developed the use of the eyes in traditional desktop settings. The latest developments in remote and head-mounted eye tracking equipment and automated eye movement analysis point the way toward unobtrusive eye-based human-computer interfaces that will become pervasively usable in everyday life. With growing interest in smart glasses and low-cost eye trackers, gaze-based techniques for mobile computing have become increasingly important in recent years. We call this new paradigm pervasive eye tracking – continuous eye monitoring and analysis 24/7.

The potential applications for the ability to track and analyse eye movements anywhere and at any time call for new research to further develop and understand visual behaviour and eye-based interaction in daily life settings. PETMEI 2014 will focus on pervasive eye tracking as a trailblazer for mobile eye-based interaction and eye-based context-awareness. We provide a forum for researchers from human-computer interaction, context-aware computing, egocentric computer vision and eye tracking to discuss techniques and applications that go beyond classical eye tracking and stationary eye-based interaction. We want to stimulate and explore the creativity of these communities with respect to the implications, key research challenges, and new applications for pervasive eye tracking in ubiquitous computing. The long-term goal is to create a strong interdisciplinary research community linking these fields together and to establish the workshop as the premier forum for research on pervasive eye tracking.

TOPICS
Topics of interest cover computational methods, new applications and use cases, as well as eye tracking technology for pervasive eye tracking and mobile eye-based interaction. Topics of interest include, but are not limited to:

Methods
We invite participants to reflect on the specific characteristics of pervasive eye tracking systems and to contrast them with classical methods for eye tracking, eye movement analysis, eye-based interaction, and evaluation. We welcome contributions reporting on methodological advances in all components of mobile eye tracking systems, and the workshop will also cover the latest technological advances in mobile eye tracking equipment.
– Eye tracking technologies for mobile devices
– Tools for face and eye detection and tracking
– Gaze and eye movement analysis methods
– Integration of pervasive eye tracking and context-aware computing
– Multi-modal sensor fusion
– User studies on pervasive eye tracking
– Devices for portable, wearable and ambient eye tracking

Applications
In addition to contributions reporting on methodological advances, we also want to attract submissions that explore innovative applications of pervasive eye tracking and mobile eye-based interaction. We particularly invite presentations on egocentric vision systems and gaze-related computer vision applications that can potentially extend the possibilities of current mobile gaze interaction.
– Pervasive eye-based interaction
– Mobile attentive user interfaces
– Eye-based activity and context recognition
– Security and privacy for pervasive eye-tracking systems
– Eye tracking for specialized application areas
– Eye-based human-robot and human-agent interaction
– Cognition-aware systems and user interfaces
– Human factors in mobile eye-based interaction
– Egocentric computer-vision systems and applications

SUBMISSION GUIDELINES
Prospective authors should submit notes with a maximum length of four pages or full papers with a maximum length of six pages. In addition to research papers we explicitly invite submissions of position papers and papers that describe preliminary results or work-in-progress. All submissions should be prepared according to the SIGCHI archival format (double column, PDF). Manuscripts will be reviewed by at least two reviewers. Accepted papers will be published in the UbiComp 2014 supplemental proceedings. In addition, printed proceedings will be distributed to the participants during the workshop. We also plan to publish extended versions of selected papers in an edited book or a special issue of a journal or magazine.

Templates
– Latex http://www.sigchi.org/publications/chipubform/sigchi-papers-latex-template/at_download/file
– Word http://www.sigchi.org/publications/chipubform/sigchi-papers-word-template/at_download/file

Submission Website
Please visit our website http://2014.petmei.org/ for regular updates.

ORGANIZERS
– Thies Pfeiffer, Center of Excellence Cognitive Interaction Technology, Bielefeld University, Germany
– Sophie Stellmach, Microsoft Corporation, USA
– Yusuke Sugano, The University of Tokyo, JP

PROGRAM COMMITTEE
PETMEI 2014 is supported by the following program committee members:
– Andreas Bulling, Max Planck Institute for Informatics, DE
– Andrew T. Duchowski, Clemson University, USA
– Alireza Fathi, Stanford University, USA
– Dan Witzner Hansen, IT University of Copenhagen, DK
– Kris M. Kitani, Carnegie Mellon University, USA
– Päivi Majaranta, University of Tampere, FI
– Lucas Paletta, Joanneum Research, AT
– Pernilla Qvarfordt, FX Palo Alto Laboratory, US
– Lech Swirski, University of Cambridge, UK
– Takumi Toyama, DFKI, DE

More committee members to be announced soon.

CONTACT AND FURTHER INFORMATION
For further information, please visit our website or send us an email:
– Official PETMEI 2014 E-Mail: petmei2014@gmail.com
– PETMEI 2014 Website: http://2014.petmei.org/

Best regards
Thies, Sophie and Yusuke

Call for Participation at SAGA Workshop 2013

Call for Participation at SAGA Workshop 2013

1st International Workshop on Solutions for Automatic Gaze Data Analysis and Eyetracking Studies in Natural Environments

Where: Bielefeld University, Germany
When: October 24th – 25th, 2013

Why you would want to participate:

If you are an experimental researcher:
– Does manual annotation impede your research?
– Do you want to analyse mobile eye tracking data?
– Do you want to move from desktop-based to more
natural interaction scenarios?

If you are a computer scientist:
– Are you interested in gaze-based interaction?
– Are you an expert in tracking objects or reconstructing scenes?
– Are you seeking an interesting field of application?

Then you should not miss the SAGA Workshop on 24th to 25th of October 2013 at the new CITEC facilities at Bielefeld University, Germany.

The aim of the workshop is to build a bridge between basic academic research and applied research, particularly in the fields of visual image analysis and scene representation (object recognition and tracking), as well as the online analysis and interpretation of attention in the context of mobile studies on natural scene perception.

We are providing a forum for researchers from human-computer interaction, context-aware computing, robotics, computer vision and image processing, psychology, sport science, eye tracking and industry to discuss techniques and applications that go beyond classical eye tracking and stationary eye-based interaction. We want to stimulate and explore the creativity of these communities with respect to the implications, key research challenges, new techniques and application areas. The long-term goal is to create a strong interdisciplinary research community linking these fields together and to establish the workshop as the premier forum for research on automatic annotation of gaze videos and use of eye tracking in natural environment studies.

Talks:

Several researchers will present their works on solutions for the (semi-) automatic annotation of gaze videos and on eye movement studies in natural environments as a trailblazer for gaze analysis in natural environments, mobile eye-based interaction and eye-based context-awareness.

Keynote speakers:

Several outstanding keynote speakers from academia and industry, covering eye movement research in natural environments, usability services, and assistive technologies in professional training for people with disabilities, have already confirmed their participation in the SAGA 2013 workshop:

– Marc Pomplun, UMASS Boston, United States of America
– Ben Tatler, University of Dundee, Scotland
– Michael Schiessl, Managing Director of EyeSquare
(User & Brand Research) in Berlin, Germany
– Ellen Schack, v. Bodelschwinghian Foundations of Bethel in Bielefeld,
Germany

Live Demo Sessions:

As a particular highlight, several technical solutions (commercial or research-in-progress) for automatic gaze analysis will be demonstrated in between sessions:
– VideoGazer – A Modular Approach Towards Automatic Annotation
of Gaze Videos
– Location-based Online Identification of Objects in the
Centre of Visual Attention using Eye Tracking
– Various object recognition and tracking solutions from the
Robotics Group of the Center of Excellence “Cognitive Interaction
Technology”, Bielefeld University, Germany
– BeGaze from Sensomotoric Instruments (SMI)
– Mobile Eye Tracking paired with a mobile EEG solution
– … and more to come

During the live session, researchers will provide an in-depth demonstration of their solutions and will be pleased to answer any further questions you may have.

For more information and registration, please visit the workshop website at http://saga.eyemovementresearch.com/

We are looking forward to meeting you at SAGA!

Kai Essig & Thies Pfeiffer

2nd Call for Papers: 1st International Workshop on Solutions for Automatic Gaze Data Analysis (CITEC/Bielefeld University)

SAGA 2013: 1st INTERNATIONAL WORKSHOP ON SOLUTIONS FOR AUTOMATIC GAZE DATA ANALYSIS –
uniting academics and industry.

24-26 October 2013
Bielefeld University, Germany
Workshop Website: http://saga.eyemovementresearch.com/

The SAGA 2013 workshop is accepting abstracts for two calls: Challenge Contributions, and Oral Presentations or Posters.
We are currently pursuing options for publication as a special issue of a journal or as an edited volume.


Important Dates:

1. Oral presentation / poster call:

August 23, 2013: Deadline for abstract submissions.
September 6, 2013: Notification of acceptance for talks and posters.

2. Challenge:

August 23, 2013: Deadline for 2-page preliminary abstract sketching your approach.
September 6, 2013: Notification of acceptance for challenge.
October 2, 2013: Submission of the final abstracts and final results.

October 24-26, 2013: Workshop takes place at Bielefeld University, Germany.


Invited keynote speakers from academia and industry:

Marc Pomplun, UMASS Boston, United States of America
Ben Tatler, University of Dundee, Scotland
Michael Schiessl, Managing Director of EyeSquare (User & Brand Research) in Berlin, Germany
Andreas Enslin, Head of Miele Design Centre in Gütersloh, Germany
Ellen Schack, v. Bodelschwinghian Foundations of Bethel in Bielefeld, Germany


We are very pleased to publish this second call for challenge contributions and abstracts for SAGA 2013, the 1st International Workshop on Solutions for Automatic Gaze Data Analysis. SAGA 2013 will focus on solutions for the automatic annotation of gaze videos and on eye movement analysis in natural environments as a trailblazer for mobile eye-based interaction and eye-based context-awareness.

We are providing a forum for researchers from human-computer interaction, context-aware computing, robotics, computer vision and image processing, psychology, sport science, eye tracking and industry to discuss techniques and applications that go beyond classical eye tracking and stationary eye-based interaction.

We want to stimulate and explore the creativity of these communities with respect to the implications, key research challenges, new techniques and application areas. The long-term goal is to create a strong interdisciplinary research community linking these fields together and to establish the workshop as the premier forum for research on automatic annotation of gaze videos.


Submissions:

Abstracts will be peer-reviewed by at least two members of an international program committee. Word and LaTeX templates for the submissions are now available on the workshop website, and registration is open.

########################################
# 1. Oral presentation / poster call:  #
########################################

We are calling for 500-word abstracts on topics related to real-world eye tracking and eye movement analyses. Possible topics include, but are not limited to, eye tracking in human-machine interaction, visual search, language processing, eye-hand coordination, marketing, automatized tasks, and decision making.

Please note: Authors of all accepted contributions must register for the workshop.

#################
# 2. Challenge: #
#################

In order to drive research on software solutions for the automatic annotation of gaze videos we offer a special challenge on this topic. The purpose of the challenge is to encourage the community to work on a set of specific software solutions and research questions and to continuously improve on earlier results obtained for these problems over the years.

We are providing a set of test videos (duration 2-3 minutes) and separate text files with the corresponding gaze data on the workshop website. Participants shall write semi- or fully-automatic software solutions for the recognition and tracking of objects over the whole video sequence. The software should provide the coordinates for the tracked objects and use this information to automatically calculate object-specific gaze data, such as the number of fixations and cumulative fixation durations. There are no restrictions on the way in which the relevant objects are marked or on the kind of techniques that can be used to track the objects.

The only constraint is that your software solution can read and process the provided videos and report gaze-specific data for the selected objects either as a text file (which can serve as input for a statistical program such as SPSS, Matlab, R, or MS Excel) or by providing some kind of visualization.
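As a rough illustration of the kind of output expected, the Python sketch below shows one possible post-processing step: aggregating gaze samples into per-object fixation counts and cumulative dwell times. The data structures (time-stamped gaze samples, per-frame bounding boxes from a tracker, and the simple run-based fixation criterion) are hypothetical illustrations, not the formats actually provided for the challenge.

```python
# Hypothetical sketch: aggregating gaze samples per tracked object.
# Assumes a tracker has already produced per-frame bounding boxes;
# all field names and thresholds here are illustrative.
from collections import defaultdict

def contains(box, x, y):
    """Return True if gaze point (x, y) falls inside box = (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = box
    return x0 <= x <= x1 and y0 <= y <= y1

def gaze_stats(samples, boxes_per_frame, min_fixation_ms=100):
    """samples: list of (timestamp_ms, frame, x, y) gaze records.
    boxes_per_frame: dict frame -> {object_name: (x0, y0, x1, y1)}.
    Treats an unbroken run of samples on one object as a fixation and
    returns per-object fixation counts and cumulative dwell time (ms)."""
    stats = defaultdict(lambda: {"fixations": 0, "dwell_ms": 0})
    current, start, last = None, None, None
    for t, frame, x, y in samples:
        # Which object (if any) does this gaze sample land on?
        hit = next((name for name, box in boxes_per_frame.get(frame, {}).items()
                    if contains(box, x, y)), None)
        if hit != current:
            # Gaze moved to a new object: close out the previous run.
            if current is not None and last - start >= min_fixation_ms:
                stats[current]["fixations"] += 1
                stats[current]["dwell_ms"] += last - start
            current, start = hit, t
        last = t
    # Close out the final run.
    if current is not None and last - start >= min_fixation_ms:
        stats[current]["fixations"] += 1
        stats[current]["dwell_ms"] += last - start
    return dict(stats)
```

The resulting dictionary could then be written out as a text file for import into SPSS, Matlab, R, or MS Excel, as the call requires.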


Detailed information on participation and a description of all necessary steps can be found on the workshop website (see: http://saga.eyemovementresearch.com/challenge/howto-participate-in-the-challenge/). Additionally, you can now find an explanation of the manual annotation procedures which will be used to evaluate the submitted software solutions for the challenge (http://saga.eyemovementresearch.com/challenge/videomaterial/).

In order to access this page, you first must register for the challenge (see: http://saga.eyemovementresearch.com/challenge/register-for-the-saga-challenge/).


All submissions will be evaluated by an independent jury according to the evaluation criteria (see Workshop Website). Additionally, there is a live session scheduled for the third day in which all selected solutions can be demonstrated to the interested workshop participants. The three best solutions will receive an award.

Prize money:

1st Prize: 1,000 €
2nd Prize: 500 €
3rd Prize: 250 €

We would like to thank our premium sponsor SensoMotoric Instruments (SMI) for the contribution of prize money and test videos recorded with SMI’s mobile eye tracking glasses (www.eyetracking-glasses.com).

We would also like to thank our sponsor Tobii Technologies for supporting the live demo workshop session and for providing test videos recorded with the Tobii Glasses (http://www.tobii.com/en/eye-tracking-research/global/products/hardware/tobii-glasses-eye-tracker/).

Please Note: All challenge participants must register separately at http://saga.eyemovementresearch.com/challenge/register-for-the-saga-challenge/ for access to the challenge material and the video download.


SAGA 2013 Workshop Organising Committee:

Workshop Organisers: Kai Essig, Thies Pfeiffer, Pia Knoeferle, Helge Ritter, Thomas Schack and Werner Schneider, all from Bielefeld University, Germany
Scientific Board: Thomas Schack, Helge Ritter and Werner Schneider
Jury of the Challenge: Kai Essig, Thies Pfeiffer, Pia Knoeferle and Denis Williams (SensoMotoric Instruments, SMI).

Please visit the website periodically for updates (http://saga.eyemovementresearch.com/about-saga/).
For additional questions, please contact: saga@eyemovementresearch.com

We look forward to receiving your submissions and to welcoming you to Bielefeld in October, 2013!

Call for Papers: The 6th Workshop on Eye Gaze in Intelligent Human Machine Interaction

Roman Bednarik posted the following call for papers on the Eye-Movement mailing list:

Call for Papers:

The 6th Workshop on Eye Gaze in Intelligent Human Machine Interaction
at ACM ICMI 2013, Sydney, Australia

December 13, 2013
Papers deadline: August 31, 2013
www: http://cs.uef.fi/gazein2013

Invited speaker: Julien Epps – University of New South Wales, Sydney, Australia

Eye gaze is one of the most important aspects in understanding and modeling human-human communication, and it has great potential also in improving human-machine and robot interaction. In human face-to-face communication, eye gaze plays an important role in floor and turn management, grounding, and engagement in conversation. In human-computer interaction research, social gaze, gaze directed at an interaction partner, has been a subject of increased attention.

This is the sixth workshop on Eye Gaze in Intelligent Human Machine Interaction. In the past we have discussed a wide range of issues for eye gaze relevant to multimodal interaction: technologies for sensing human attentional behaviors; the roles of attentional behaviors as social gaze in human-human and human-machine/robot interaction; attentional behaviors in problem-solving and task-performing; gaze-based intelligent user interfaces; and the evaluation of gaze-based UIs. In addition to these topics, this workshop will focus on eye gaze in multimodal communication, interpretation and generation. Since eye gaze is one of the primary communication modalities, gaze information can be combined with other modalities to complement the meaning of utterances, or it can serve as a stand-alone communication signal.

GazeIn’13 aims to continue in these lines and explore the growing area of gaze in intelligent interaction research by bringing together researchers from domains of human sensing, multimodal processing, humanoid interfaces, intelligent user interfaces, and communication science. We will exchange ideas to develop and improve methodologies for this research area with the long-term goal of establishing a strong interdisciplinary research community in “attention aware interactive systems”.

This workshop solicits papers that address the following topics (among others):

• Technologies and methods for sensing and interpretation of gaze and human attentional behaviors
• Eye gaze in multimodal generation and behavior production in conversational humanoids
• Empirical studies of attentional behaviors
• New directions for gaze in multimodal interaction
• Evaluation and design issues for using eye gaze in multimodal interfaces

Please see the online CfP for a full list of topics (http://cs.uef.fi/gazein2013/call-for-papers)

SUBMISSION INFORMATION
There are two categories of paper submissions.
Long paper: The maximum length is 6 pages.
Short paper: The maximum length is 3 pages.

At least three members of the program committee will review each submission. The accepted papers will be published in the workshop proceedings. Best papers will be selected for inclusion in a special issue of a journal. Submitted papers should conform to the ACM publication format. For templates and examples, follow this link: http://www.acm.org/sigs/pubs/proceed/template.html

Please submit your papers using https://precisionconference.com/~icmi13j

IMPORTANT DATES
Paper submission due: August 31, 2013
Notification of acceptance: September 20, 2013
Camera-ready due: October 10, 2013
Workshop date: December 13, 2013

ORGANIZERS
Roman Bednarik – University of Eastern Finland, Finland
Hung-Hsuan Huang – Ritsumeikan University, Japan
Kristiina Jokinen – University of Helsinki, Finland
Yukiko Nakano – Seikei University, Japan


————————————————————————
Roman Bednarik http://cs.uef.fi/~rbednari
School of Computing, University of Eastern Finland
————————————————————————

Call for Papers: ETRA 2014

The call for ETRA 2014 is out:

EYE TRACKING RESEARCH & APPLICATIONS SYMPOSIUM – ETRA 2014

http://www.etra2014.org

March 26th-28th, 2014
Safety Harbor Resort & Spa, Safety Harbor, FL, USA

1ST CALL FOR PAPERS

The eighth ACM Symposium on Eye Tracking Research & Applications (ETRA
2014) will be held in Safety Harbor, Florida, on March 26th-28th,
2014. The ETRA conference series focuses on eye movement research and
applications across a wide range of disciplines. The symposium
presents research that advances the state-of-the-art in these areas,
leading to new capabilities in gaze tracking systems, gaze aware
applications, gaze based interaction, eye movement data analysis, etc.
For ETRA 2014, we invite papers in all areas of eye tracking research
and applications.

IMPORTANT DATES

20 Sept 2013: Paper abstracts due
4 Oct 2013: Full & short papers due
8 Nov 2013: Paper acceptance
29 Nov 2013: Paper revisions due
20 Dec 2013: Final paper acceptance due
6 Jan 2014: Doctoral Symposium submission due; Video & Demo submission due
24 Jan 2014: Doctoral Symposium, video & demo acceptance
31 Jan 2014: Camera ready papers due

RESEARCH AREAS OF INTEREST

*Eye Tracking Technology* Advances in eye tracking hardware, software
and algorithms such as: 2D and 3D eye tracking systems, calibration,
low cost eye tracking, natural light eye tracking, predictive models,
etc.

*Eye Tracking Data Analysis* Methods, procedures and analysis tools
for processing raw gaze data as well as fixations and gaze patterns.
Example topics are: scan path analysis, fixation detection algorithms,
and visualization techniques of gaze data.

*Visual Attention and Eye Movement Control* Applied and experimental
studies investigating visual attention and eye movements to gain
insight in eye movement control, cognition and attention, or for
design evaluation of visual stimuli. Examples are: usability and web
studies using eye tracking, and eye movement behavior in everyday
activities such as driving and reading.

*Eye Tracking Applications* Eye tracking as a human-computer input
method, either as a replacement to traditional input methods or as a
complement. Examples are: assistive technologies, gaze enhanced
interaction and interfaces, multimodal interaction, gaze in augmented
and mixed reality systems, gaze-contingent displays and gaze-based
biometric applications.

SUBMISSION CATEGORIES

*Research papers:* Authors are invited to submit original work in the
formats of Full paper (8 pages) and Short paper (4 pages). The papers
will undergo a rigorous review process assessing the originality and
quality of the work as well as the relevance for eye tracking research
and applications. Papers presented at ETRA 2014 will be available in
the ACM digital library. Submission formats and instructions are
available at the conference web site.

IMPORTANT NOTE: The submission process is different from past years,
in that there is a single deadline for both long and short papers.
Depending on the outcome of the first review process, papers may be
invited to resubmit for an additional review. In some cases, authors
of full papers may be offered the chance to resubmit their paper as a
short paper.

*Doctoral Symposium (NEW to ETRA 2014):* ETRA 2014 is proud to
introduce the ETRA Doctoral Symposium, where graduate students get an
opportunity to meet other students and experienced researchers to get
feedback on their research in a friendly environment. We invite
doctoral students, who have a defined topic in the area of eye
tracking research and applications, but whose work is still in a phase
where it can be influenced by the feedback received in the symposium.
Participants will be selected based on a 2-page extended abstract
describing the thesis work and its current status.

*Demo/video track (NEW to ETRA 2014):* Have a gaze interaction
technique to share? Or want to show off how your new eye tracking
method works? At ETRA 2014, we are adding a demo/video session where
researchers give demonstrations of their research or show videos of
their work. To take part in this session, we request a 2-page extended
abstract to be submitted. If authors have a full or short paper
accepted, no extended abstract is needed. If submitting for a video
presentation, the video is required.

CONFERENCE VENUE

ETRA 2014 will be held at the historic Safety Harbor Resort and Spa in
Safety Harbor, Florida, a resort hotel sitting on top of three natural
springs facing Tampa Bay, located on the beautiful Florida Gulf coast.

SPONSORSHIP

ETRA 2014 is co-sponsored by the ACM Special Interest Group in
Computer-Human Interaction (SIGCHI), and the ACM Special Interest
Group on Computer Graphics and Interactive Techniques (SIGGRAPH).

CONFERENCE CO-CHAIRS
Dan Witzner Hansen, IT University of Copenhagen, Denmark
Pernilla Qvarfordt, FX Palo Alto Laboratory, Inc.

PROGRAM CO-CHAIRS
Joe Goldberg, Oracle, Inc.
Jeffrey B. Mulligan, NASA, USA

DOCTORAL SYMPOSIUM CO-CHAIRS
Päivi Majaranta, University of Tampere, Finland
Jeff B. Pelz, Rochester Institute of Technology, USA

DEMO/VIDEO CHAIR
Oleg Komogortsev, Texas State University, USA

Call for Contributions: Challenge on Automatic Object Identification (AOI) and Tracking

Call for Challenge on Automatic Object Identification (AOI) and Tracking

as part of the 

SAGA 2013:
1st INTERNATIONAL WORKSHOP ON SOLUTIONS FOR AUTOMATIC GAZE DATA ANALYSIS
 - uniting academics and industry.

24-26 October 2013
Bielefeld University, Germany
Cognitive Interaction Technology Center of Excellence 

Workshop Website:
http://saga.eyemovementresearch.com/challenge/

===========================================================================

Important Dates:

August 15th, 2013:   Deadline for 2-page abstract sketching your
                     approach.
September 2nd, 2013: Notification of acceptance for challenge.
October 2nd, 2013:   Submission of the final abstracts and final 
                     results.

October 24-26, 2013: Challenge results presentation takes place at the 
                     SAGA 2013 Workshop at Bielefeld University, 
                     Germany.

===========================================================================

We are very pleased to publish this call for challenge contributions
as part of SAGA 2013, the 1st International Workshop on Solutions for
Automatic Gaze Data Analysis. The challenge will focus on software
solutions for automatic object recognition as a trailblazer for
vision-based object and person tracking algorithms. Automatic object
or person recognition and tracking in video sequences (in real time)
is a key prerequisite for many application fields, such as mobile
service robotics, Human-Robot Interaction (HRI), Computer Vision,
Digital Image Processing, autonomous assistance and surveillance
systems (e.g., driver assistance systems) and Eye Tracking.
Applications range from the tracking of objects (e.g., manipulation
or recognition of objects in dynamic scenes) and body parts (e.g.,
head or hand tracking for mimic and gesture classification) to
persons (e.g., person re-identification or visual following).

Although many efficient tracking methods have been introduced for
different tasks over the last years, they are mostly restricted to
particular environmental settings and therefore cannot be applied to
general application fields. This is due to a range of factors:
1.) Often, underlying assumptions about the environment cannot be
met, including a static background, no changes in lighting, and
homogeneous or invariant appearances. These idealized conditions are
usually missing for object tracking in highly dynamic environments,
which are common, for example, in mobile scenarios. 2.) Object models
cannot be applied because of the high variance in the appearance of
tracked persons or objects. 3.) Most algorithms are computationally
quite expensive (large systems often place hard computational
restrictions on the algorithms used).

===========================================================================

Details on the SAGA 2013 
CHALLENGE on Automatic Object Identification (AOI) and Tracking:

In order to drive research on software solutions for the automatic
annotation of videos we offer a special challenge on this topic.
The purpose of the challenge is to encourage the community to work on a
set of specific software solutions and research questions and to
continuously improve on earlier results obtained for these problems over
the years. This will hopefully not only push the field as a whole and
increase the impact of work published in it, but also contribute open
source hardware, methods and data analysis software back to the
community. 

For the challenge we address this topic on the basis of eye-tracking
data. We are providing a set of test videos (duration 2-3 minutes)
and separate text files with the corresponding gaze data on the
workshop website, for which solutions should be written. These gaze
videos, recorded by a scene camera attached to an eye-tracking
system, show people as they look at objects or interact with them in
mobile applications. The gaze data contains a time-stamped list of x-
and y-positions of the gaze points (in the coordinate system of the
scene video). For selected videos, frame counter information will
also be available to assist with synchronization of the video and the
gaze data.

For the challenge we are looking for semi- and fully-automatic
software solutions for the recognition and tracking of objects over
the whole video sequence. The software should provide the coordinates
for the tracked objects and use this information to automatically
calculate object specific gaze data, such as number of fixations and
cumulative fixation durations, by using the time-stamped list of 2D
gaze coordinates in the eye-tracking file. There are no restrictions
on the way in which the relevant objects are marked or on the kind of
techniques that can be used to track the objects. The only constraint
is that your software solution can read and process the provided
videos and report gaze-specific data for the selected objects either
as a text file (which can serve as input for a statistical program
such as SPSS, Matlab, R, or MS Excel) or by providing some kind of
visualization.

All submissions will be evaluated by an independent jury according to
the evaluation criteria (see below). Additionally, there is a live
session scheduled for the third day in which all selected solutions
can be demonstrated to the interested workshop participants. The three
best solutions will receive an award.

Prize money:

1st Prize: 1,000 €
2nd Prize:   500 €
3rd Prize:   250 €

We would like to thank our premium sponsor SensoMotoric Instruments
(SMI) for the contribution of the prize money.

The SAGA challenge features test videos recorded with different devices
from
- SensoMotoric Instruments (SMI) [SMI EyeTracking Glasses]
- Tobii Technologies [Tobii Glasses]
- Applied Science Laboratories (ASL)
  / Engineering Systems Technologies (EST) [ASL Mobile Eye-XG]

===========================================================================

Submissions:

To allow more time for implementation, a two-step submission
procedure has been devised. Acceptance to the challenge will be
decided on the basis of a preliminary abstract. The final evaluation
and ranking of the software solutions will be based on the final
abstract and the final results for a test set of videos, including
videos similar to those on the website:

a) Preliminary submissions should consist of a 2-page abstract
describing the implementation details of your proposed software
solution, including the following:

- description of the underlying techniques and implementations
- description of object selection and tracking processes

b) Final submissions shall extend the preliminary submission to a
3-page paper by adding the following details:

- number of fixations and cumulative fixation duration details for the
  specified objects
- performance data (such as computation time, number of selected
  objects, parallel tracking of several objects in the scene)
- snapshot of the results

We will use results based on manual annotation to evaluate the submitted
results. The following evaluation criteria will be applied:

- quality of the automated benchmark results (region and pixel based)
  compared to the results given by manual annotation
- conceptual innovation
- performance (such as computation time, number of selected objects,
  parallel tracking of several objects in the scene)
- robustness (such as tracking performance and the general scope of
  the application)
- usability

The test videos and a corresponding description of them can be found on
the workshop website. Additionally, you can find a detailed description
of how we perform the manual annotation. The exact description for the
challenge, including the evaluation criteria and the required format for
the results, will appear on the workshop website within the next 3
weeks. Please check the website regularly for updates.

Abstracts will be peer-reviewed by at least two members of an
international program committee. We will provide templates on the
workshop website. We are currently pursuing possible options for
publication of a special issue in a journal or as an edited volume.

Please Note: All challenge participants must register separately for
access to the challenge material and the video download.

===========================================================================

We would like to thank our commercial sponsors:

Premium Sponsors
- SensoMotoric Instruments (SMI) [challenge]
  / SMI Eye Tracking Glasses (www.eyetracking-glasses.com)

Sponsors
- Tobii Technologies [live demo workshop session]
  / Tobii Glasses (http://www.tobii.com/en/eye-tracking-research/global/products/hardware/tobii-glasses-eye-tracker/)

===========================================================================

Challenge Organising Committee:

Workshop Organisers:
- Kai Essig
- Thies Pfeiffer
- Pia Knoeferle
- Helge Ritter
- Thomas Schack
- Werner Schneider

All from the
Cognitive Interaction Technology Center of Excellence
at Bielefeld University

Scientific Board:
- Thomas Schack
- Helge Ritter
- Werner Schneider

Jury of the Challenge:
- Kai Essig
- Thies Pfeiffer
- Pia Knoeferle
- Denis Williams (SensoMotoric Instruments, SMI)

Please visit the website periodically for updates:
http://saga.eyemovementresearch.com/about-saga/

For additional questions, please contact: saga@eyemovementresearch.com

We look forward to receiving your submissions and to welcoming you to
Bielefeld in October 2013!

On behalf of the organisers

Thies Pfeiffer

Call for Papers: 1st International Workshop on Solutions for Automatic Gaze Data Analysis (CITEC/Bielefeld University)

1. Call for papers and challenge contributions:

SAGA 2013:
1st INTERNATIONAL WORKSHOP ON SOLUTIONS FOR AUTOMATIC GAZE DATA ANALYSIS
– uniting academics and industry.

24-26 October 2013 Bielefeld University, Germany
Cognitive Interaction Technology Center of Excellence Workshop Website:
http://saga.eyemovementresearch.com/

The SAGA 2013 workshop is accepting abstracts for two calls: Challenge
Contributions as well as Oral Presentation or Posters. We are currently
pursuing possible options for publication of a special issue in a
journal or as an edited volume.

=================================================================

Important Dates:

1. Oral presentation / poster call:

August, 15, 2013: Deadline for abstract submissions.
September, 2, 2013: Notification of acceptance for talks and posters.

2. Challenge:

August, 15, 2013: Deadline for 2-page abstract sketching your approach.
September, 2, 2013: Notification of acceptance for challenge.
October, 2, 2013: Submission of the final abstracts and final results.

October, 24-26, 2013: Workshop takes place at Bielefeld University,
Germany.

=================================================================

Invited keynote speakers from academia and industry:

– Marc Pomplun, UMASS Boston, United States of America
– Ben Tatler, University of Dundee, Scotland
– Michael Schiessl, Managing Director of EyeSquare in Berlin, Germany
– Andreas Enslin, Head of Miele Design Centre in Gütersloh, Germany
– Ellen Schack, v. Bodelschwinghian Foundations of Bethel in Bielefeld,
Germany

=================================================================

We are very pleased to publish this call for challenge contributions and abstracts for SAGA 2013, the 1st International Workshop on Solutions for Automatic Gaze Data Analysis. SAGA 2013 will focus on automatic solutions for gaze videos as a trailblazer for mobile eye-based interaction and eye-based context-awareness. We are providing a forum for researchers from human-computer interaction, context-aware computing, robotics, computer vision and image processing, psychology, sport science, eye tracking, and industry to discuss techniques and applications that go beyond classical eye tracking and stationary eye-based interaction. We want to stimulate and explore the creativity of these communities with respect to the implications, key research challenges, new techniques, and application areas. The long-term goal is to create a strong interdisciplinary research community linking these fields together and to establish the workshop as the premier forum for research on automatic annotation of gaze videos.

=================================================================

SAGA 2013 CHALLENGE:

In order to drive research on software solutions for the automatic annotation of gaze videos we offer a special challenge on this topic. The purpose of the challenge is to encourage the community to work on a set of specific software solutions and research questions and to continuously improve on earlier results obtained for these problems over the years. This will hopefully not only push the field as a whole and increase the impact of work published in it, but also contribute open source hardware, methods and gaze data analysis software back to the community. We are providing a set of test videos on the workshop website for which solutions should be written. All submissions will be evaluated by an independent jury according to the evaluation criteria (see below). Additionally, there is a live session scheduled for the third day in which all selected solutions can be demonstrated to the interested workshop participants. The three best solutions will receive an award.

Prize money:

1st Prize: €1,000
2nd Prize: €500
3rd Prize: €250

We would like to thank our premium sponsor SensoMotoric Instruments (SMI) for the contribution of the prize money.

The SAGA challenge features test videos recorded with different devices from

  • SensoMotoric Instruments (SMI) [SMI EyeTracking Glasses]
  • Tobii Technologies [Tobii Glasses]
  • Applied Science Laboratories (ASL) / Engineering Systems Technologies (EST) [ASL Mobile Eye-XG]

=================================================================

Submissions:

Abstracts will be peer-reviewed by at least two members of an international program committee. We will provide templates on the workshop website.

########################################
# 1. Oral presentation / poster call:
########################################

We are calling for 500-word abstracts on topics related to real-world eye tracking and eye movement analyses. Possible topics include, but are not limited to, eye tracking in human-machine interaction, visual search, language processing, eye-hand coordination, marketing, automatized tasks, and decision making.

Please note: Authors of all accepted contributions must register for the workshop.

################
# 2. Challenge:
################

We will provide test videos (duration 2-3 minutes) and separate text files with the corresponding gaze data on the workshop website. The gaze data consists of a time-stamped list of (x, y) gaze positions in the coordinate system of the scene video. For selected videos, frame counter information will also be available to assist with synchronization of the video and the gaze data.

For the challenge we are looking for semi- and fully automatic software solutions for recognizing and tracking objects over the whole video sequence. The software should provide the coordinates of the tracked objects and use this information to automatically calculate object-specific gaze data, such as the number of fixations and the cumulative fixation durations. There are no restrictions on the way in which the relevant objects are marked or on the techniques used to track them. The only constraint is that your software solution can read and process the provided videos and report gaze-specific data for the selected objects, either as a text file (which can serve as input for a statistical program such as SPSS, Matlab, R, or MS Excel) or through some kind of visualization.

To allow more time for implementation, a two-step submission procedure has been devised. Acceptance to the challenge will be decided on the basis of a preliminary abstract. The final evaluation and ranking of the software solutions will be based on the final abstract and the final results for a test set of videos, including videos similar to those on the website:

a) Preliminary submissions should consist of a 2 page abstract describing the implementation details of your proposed software solution including the following:

  • description of the underlying techniques and implementations
  • description of object selection and tracking processes

b) Final submissions shall extend the preliminary submission to a 3-page paper by adding the following details:

  • number of fixations and cumulative fixation duration details for the specified objects
  • performance data (such as computation time, number of selected objects, parallel tracking of several objects in the scene)
  • snapshot of the results

We will use results based on manual annotation to evaluate the submitted results. The following evaluation criteria will be applied:

  • quality of the automated benchmark results (region and pixel based) compared to the results given by manual annotation
  • conceptual innovation
  • performance (such as computation time, number of selected objects, parallel tracking of several objects in the scene)
  • robustness (such as tracking performance and the general scope of the application)
  • usability

The test videos and a corresponding description of them can be found on the workshop website. Additionally, you can find a detailed description of how we perform the manual annotation. The exact description for the challenge, including the evaluation criteria and the required format for the results, will appear on the workshop website within the next 3 weeks. Please check the website regularly for updates.

Please Note: All challenge participants must register separately for access to the challenge material and the video download.

=================================================================

Location:

The SAGA 2013 workshop will be held at the new CITEC research building ‘Interactive Intelligent Systems’, which is located close to the main building of Bielefeld University. Construction of the research building started in January 2011, and completion is expected in summer 2013. By the end of the year, the research building will host 17 research groups from various disciplines such as informatics, engineering, linguistics, psychology, and sports science. It will be complemented by a conference centre accommodating up to 200 participants, planned as an internationally visible hallmark of this high-profile research site (taken from: https://www.cit-ec.de/FBIIS).

Bielefeld University was founded in 1969 with an explicit research assignment and a mission to provide high-quality research-oriented teaching. Today it encompasses 13 faculties covering a broad spectrum of disciplines in the humanities, natural sciences, social sciences, and technology. With about 18,500 students in 80 degree courses and 2,600 staff (including approx. 1,480 academic staff), it is one of Germany’s best known medium-sized universities.

Bielefeld, the centre for science, is the economic and cultural capital of the East Westphalia economic area. The city of Bielefeld is one of the twenty largest cities in Germany, with a population of 325,000. This lively university city on the edge of the Teutoburg forest is the region’s cultural and intellectual hub. East Westphalia-Lippe is
Germany’s fifth-largest economic area and the region is home to two million people (taken from: http://www.campus-bielefeld.de/en/city-of-bielefeld/)

=================================================================

We would like to thank our commercial sponsors:

Premium Sponsors
- SensoMotoric Instruments (SMI)

Sponsors
- Tobii Technologies

=================================================================

SAGA 2013 Workshop Organising Committee:

Workshop Organisers:

  • Kai Essig
  • Thies Pfeiffer
  • Pia Knoeferle
  • Helge Ritter
  • Thomas Schack
  • Werner Schneider

All from the Cognitive Interaction Technology Center of Excellence at Bielefeld University

Scientific Board:

  • Thomas Schack
  • Helge Ritter
  • Werner Schneider

Jury of the Challenge:

  • Kai Essig
  • Thies Pfeiffer
  • Pia Knoeferle
  • Denis Williams (SensoMotoric Instruments, SMI)

Please visit the website periodically for updates:
http://saga.eyemovementresearch.com/about-saga/

For additional questions, please contact: saga@eyemovementresearch.com

We look forward to receiving your submissions and to welcoming you to
Bielefeld in October 2013!

On behalf of the workshop organisers
Thies Pfeiffer

European Summer School on Eye Movements – Call for Participation

Ulrich Ettinger posted the following invitation to the European Summer School on Eye Movements on the Eye-movement mailing list:

The European Summer School on Eye Movements (ESSEM), funded by the 
VolkswagenStiftung and organised by Ulrich Ettinger (Bonn) and 
Christoph Klein (Freiburg/Bangor), will be held at the Department 
of Psychology of the University of Bonn, Germany, 9th to 13th 
September 2013.

ESSEM brings together internationally renowned researchers to teach 
students the theories, neural bases, experimental designs and 
statistical analysis of eye movement studies in clinical settings 
(psychiatry and neurology), ergonomics and marketing research.

We are pleased to now invite applications from students and 
researchers at the PhD/MD or early post-doctoral level. Financial 
support is available for participants’ travel and accommodation 
expenses. The course fee is €200 and includes catering (lunch, 
coffee) throughout the summer school.

Participants are expected to present a poster of a study they have 
carried out (not necessarily involving oculographic methods) on the 
first day of ESSEM.

To apply, please send a cover letter with brief personal statement 
on why you wish to attend ESSEM, your current CV and the abstract 
for your poster presentation to essem.bonn@gmail.com. The deadline 
for applications is 31st May 2013.

Further information can be found at cognition.uni-bonn.de.

Best wishes,
Ulrich Ettinger (ulrich.ettinger@uni-bonn.de)
Christoph Klein (c.klein@bangor.ac.uk)

Call for Papers: Eye Tracking South Africa 2013

Tanya Beelders posted the following call for papers:

Eye Tracking South Africa, an international conference aimed 
specifically at eye-tracking research, will be launched this year 
in Cape Town, South Africa. The conference will take place
from 29-31 August 2013. Interact 2013 will follow directly after 
ETSA and will also be held in Cape Town. This gives delegates the 
opportunity to attend two international conferences in a single trip.

Call for papers

Authors are invited to submit papers that relate to the theme 
"Eyes on the Mountain", although this should not be seen as a 
restrictive requirement. Conference tracks will include, but are 
not limited to, the following topics:

    Usability
    Visualisation
    Gaze interaction
    Reading research
    Eye Control for people with disabilities
    Visual attention
    Systems, tools and methods
    Eye movements
    Technical aspects of eye tracking, e.g. pupil detection, 
        calibration, mapping, event detection, data quality, etc.

Deadline for full papers: 30 April 2013.

Deadline for short papers and posters (abstracts): 15 June 2013

Researchers and industry are also invited to present a symposium or 
workshop at the conference. A two-page abstract must be submitted 
by 15 June 2013.

Industry sponsors will be given the opportunity to present their 
products. The deadline for industry workshops and demonstrations 
is 30 June 2013.

Exhibition space is available for the duration of the conference.

Please visit http://www.eyetrackingsa.com for more information.

ETSA Organising Committee

Tanya Beelders

Pieter Blignaut

University of the Free State

Department of Computer Science and Informatics

ECVP 2013: Call for Abstracts

Dr. Wegener just posted the first call for abstracts for ECVP:

European Conference on Visual Perception (ECVP) 2013: 1st Call for 
Abstracts

The 36th European Conference on Visual Perception (ECVP) will take place 
in Bremen, Germany, from August 25th to August 29th 2013.

Herewith we call for your contributions to ECVP 2013. We invite you to 
submit an abstract about your recent work on visual perception and 
related topics, to be presented at the conference either as a talk or 
a poster.
All abstracts will be reviewed. Notification of acceptance will be sent 
by June.
The deadline for abstract submission is March 24, 2013.

This year's ECVP has a special focus on Computational Neuroscience.
We encourage submissions on work at the interface between Visual 
Perception and Computational Neuroscience, regarding techniques, 
methods, concepts, and models.

Please note that for submitting an abstract, you have to register for 
the main conference first (http://www.ecvp.uni-bremen.de/node/15).
After registering, you will receive a preliminary confirmation, and a 
link to the abstract submission system.
Registration opens on January 21, 2013 (next Monday).

Please note that there are two modifications this year regarding 
abstract submission:

 1. If you apply for an oral presentation (talk), you can optionally
    include an extended summary of at most one page (PDF or RTF) with
    additional information about your contribution.
 2. You are required to choose at least one topic keyword and one
    method keyword, so that all abstracts can be assigned to
    appropriate reviewers and program sessions. The list of available
    keywords can be found on the website (abstract guidelines).


On Sunday, August 25th, 2013 we offer two additional events you might 
want to attend:

 1. Bernstein Tutorials: these take place before the main conference
    and give students, postdocs, and experienced scientists alike an
    introduction to various important topics and state-of-the-art
    methods and techniques in psychophysics, data analysis, and
    computational neuroscience.
 2. Satellite Symposium at HWK: The satellite symposium "The Art of
    Perception - The Perception of Art" will also be held on the opening
    day of the ECVP 2013.


Registration for satellite events (Bernstein tutorials, Art symposium 
etc.) is subject to space limitations and will be done on a first-come, 
first-served basis.
You have to register for satellite events during the normal registration 
process.

You can find all important dates, fees, guidelines and additional 
information at http://www.ecvp.uni-bremen.de/

Best regards and awaiting many interesting contributions,

ECVP 2013 team,

Udo Ernst | Cathleen Grimsen | Detlef Wegener | Agnes Janssen

--
ECVP 2013 Organizing Committee
Udo Ernst | Cathleen Grimsen | Detlef Wegener | Agnes Janssen
Universitaet Bremen / University of Bremen
Zentrum fuer Kognitionswissenschaften / Center for Cognitive Sciences
Hochschulring 18, 28359 Bremen / Germany
Website: www.ecvp.uni-bremen.de
Facebook: www.facebook.com/EuropeanConferenceOnVisualPerception
Contact:
symp2013@ecvp.uni-bremen.de (organization and submission of symposia)
exhibition2013@ecvp.uni-bremen.de (queries regarding the exhibition)
contact2013@ecvp.uni-bremen.de (comments, questions or suggestions)