Call for Participation at SAGA Workshop 2013

1st International Workshop on Solutions for Automatic Gaze Data Analysis and Eyetracking Studies in Natural Environments

Where: Bielefeld University, Germany
When: October 24th – 25th, 2013

Why you would want to participate:

If you are an experimental researcher:
– Does manual annotation impede your research?
– Do you want to analyse mobile eye tracking data?
– Do you want to move from desktop-based to more
natural interaction scenarios?

If you are a computer scientist:
– Are you interested in gaze-based interaction?
– Are you an expert in tracking objects or reconstructing scenes?
– Are you looking for an interesting field of application?

Then you should not miss the SAGA Workshop on October 24th–25th, 2013, at the new CITEC facilities at Bielefeld University (Germany).

The aim of the workshop is to build a bridge between basic academic research and applied research, particularly in the fields of visual image analysis and scene representation (object recognition and tracking), as well as the online analysis and interpretation of attention in the context of mobile studies on natural scene perception.

We are providing a forum for researchers from human-computer interaction, context-aware computing, robotics, computer vision and image processing, psychology, sport science, eye tracking and industry to discuss techniques and applications that go beyond classical eye tracking and stationary eye-based interaction. We want to stimulate and explore the creativity of these communities with respect to the implications, key research challenges, new techniques and application areas. The long-term goal is to create a strong interdisciplinary research community linking these fields together and to establish the workshop as the premier forum for research on automatic annotation of gaze videos and use of eye tracking in natural environment studies.

Talks:

Several researchers will present their work on solutions for the (semi-)automatic annotation of gaze videos and on eye movement studies in natural environments, as a trailblazer for gaze analysis in natural environments, mobile eye-based interaction and eye-based context-awareness.

Keynote speakers:

Several outstanding keynote speakers from academia and industry, covering eye movement research in natural environments, usability services, and assistive technologies in professional training for people with disabilities, have already confirmed their participation in the SAGA 2013 workshop:

– Marc Pomplun, UMASS Boston, United States of America
– Ben Tatler, University of Dundee, Scotland
– Michael Schiessl, Managing Director of EyeSquare
(User & Brand Research) in Berlin, Germany
– Ellen Schack, v. Bodelschwinghian Foundations of Bethel in Bielefeld,
Germany

Live Demo Sessions:

As a particular highlight, several technical solutions (commercial or research-in-progress) for automatic gaze analysis will be demonstrated between sessions:
– VideoGazer – A Modular Approach Towards Automatic Annotation
of Gaze Videos
– Location-based Online Identification of Objects in the
Centre of Visual Attention using Eye Tracking
– Various object recognition and tracking solutions from the
Robotics Group of the Center of Excellence “Cognitive Interaction
Technology”, Bielefeld University, Germany
– BeGaze from Sensomotoric Instruments (SMI)
– Mobile Eye Tracking paired with a mobile EEG solution
– … and more to come

During the live session, researchers will provide an in-depth demonstration of their solutions and will be pleased to answer any further questions you may have.

For more information and registration, please visit the workshop website at http://saga.eyemovementresearch.com/

We look forward to meeting you at SAGA!

Kai Essig & Thies Pfeiffer

2nd Call for Papers: 1st International Workshop on Solutions for Automatic Gaze Data Analysis (CITEC/Bielefeld University)

SAGA 2013: 1st INTERNATIONAL WORKSHOP ON SOLUTIONS FOR AUTOMATIC GAZE DATA ANALYSIS –
uniting academics and industry.

24-26 October 2013, Bielefeld University, Germany
Workshop Website: http://saga.eyemovementresearch.com/

The SAGA 2013 workshop is accepting abstracts for two calls: Challenge Contributions, and Oral Presentations or Posters.
We are currently pursuing possible options for publication of a special issue in a journal or as an edited volume.


Important Dates:

1. Oral presentation / poster call:

August 23, 2013: Deadline for abstract submissions.
September 6, 2013: Notification of acceptance for talks and posters.

2. Challenge:

August 23, 2013: Deadline for 2-page preliminary abstract sketching your approach.
September 6, 2013: Notification of acceptance for challenge.
October 2, 2013: Submission of the final abstracts and final results.

October 24-26, 2013: Workshop takes place at Bielefeld University, Germany.


Invited keynote speakers from academia and industry:

Marc Pomplun, UMASS Boston, United States of America
Ben Tatler, University of Dundee, Scotland
Michael Schiessl, Managing Director of EyeSquare (User & Brand Research) in Berlin, Germany
Andreas Enslin, Head of Miele Design Centre in Gütersloh, Germany
Ellen Schack, v. Bodelschwinghian Foundations of Bethel in Bielefeld, Germany


We are very pleased to publish this second call for challenge contributions and abstracts for SAGA 2013, the 1st International Workshop on Automatic Annotation of Gaze Videos. SAGA 2013 will focus on solutions for the automatic annotation of gaze videos and on research work on eye movement analysis in natural environments as a trailblazer for mobile eye-based interaction and eye-based context-awareness.

We are providing a forum for researchers from human-computer interaction, context-aware computing, robotics, computer vision and image processing, psychology, sport science, eye tracking and industry to discuss techniques and applications that go beyond classical eye tracking and stationary eye-based interaction.

We want to stimulate and explore the creativity of these communities with respect to the implications, key research challenges, new techniques and application areas. The long-term goal is to create a strong interdisciplinary research community linking these fields together and to establish the workshop as the premier forum for research on automatic annotation of gaze videos.


Submissions:

Abstracts will be peer-reviewed by at least two members of an international program committee. Word and LaTeX templates for the submissions are now available on the workshop website, and registration is open.

########################################
1. Oral presentation / poster call: #
########################################

We are calling for 500-word abstracts on topics related to real-world eye tracking and eye movement analyses. Possible topics include, but are not limited to, eye tracking in human-machine interaction, visual search, language processing, eye-hand coordination, marketing, automatized tasks, and decision making.

Please note: All accepted contributions must register for the workshop.

################
2. Challenge: #
################

In order to drive research on software solutions for the automatic annotation of gaze videos we offer a special challenge on this topic. The purpose of the challenge is to encourage the community to work on a set of specific software solutions and research questions and to continuously improve on earlier results obtained for these problems over the years.

We are providing a set of test videos (duration 2-3 minutes) and separate text files with the corresponding gaze data on the workshop website. For these materials we are looking for semi- and fully-automatic software solutions for the recognition and tracking of objects over the whole video sequence. The software should provide the coordinates of the tracked objects and use this information to automatically calculate object-specific gaze data, such as the number of fixations and cumulative fixation durations. There are no restrictions on the way in which the relevant objects are marked or on the techniques used to track the objects.

The only constraint is that your software solution must be able to read and process the provided videos and report gaze-specific data for the selected objects, either as a text file (which can serve as input for a statistical program such as SPSS, Matlab, R or MS Excel) or as some kind of visualization.
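
To make the expected output more concrete, here is a minimal, hedged Python sketch of the final analysis step: detecting fixations from a time-stamped list of gaze points with a simple dispersion criterion, assigning each fixation to whichever tracked object's bounding box contains it, and writing per-object statistics to a tab-separated text file that R, SPSS or Excel can import. The gaze-sample layout, the dispersion and duration thresholds, and the way object boxes are obtained (here simply passed in as a lookup per timestamp) are all assumptions for illustration; the authoritative data formats and requirements are those published on the workshop website.

```python
"""Illustrative sketch only: per-object fixation statistics from mobile gaze data.

Assumed inputs (not the official challenge formats):
  * gaze samples as (timestamp_ms, x, y) tuples in scene-video coordinates
  * a callable returning {object_name: (x, y, w, h)} bounding boxes for a
    given timestamp, e.g. produced by whatever object tracker is used
"""
import csv
from collections import defaultdict


def _centroid(window):
    onset, end = window[0][0], window[-1][0]
    mean_x = sum(p[1] for p in window) / len(window)
    mean_y = sum(p[2] for p in window) / len(window)
    return onset, end - onset, mean_x, mean_y


def detect_fixations(samples, max_dispersion=30.0, min_duration_ms=100.0):
    """Very simple dispersion-threshold (I-DT style) fixation detection."""
    fixations, window = [], []
    for t, x, y in samples:
        window.append((t, x, y))
        xs, ys = [p[1] for p in window], [p[2] for p in window]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
            window.pop()  # dispersion exceeded: close the current window
            if window and window[-1][0] - window[0][0] >= min_duration_ms:
                fixations.append(_centroid(window))
            window = [(t, x, y)]
    if window and window[-1][0] - window[0][0] >= min_duration_ms:
        fixations.append(_centroid(window))
    return fixations  # list of (onset_ms, duration_ms, mean_x, mean_y)


def aggregate_by_object(fixations, boxes_at):
    """Assign each fixation to every object whose box contains it and sum up."""
    stats = defaultdict(lambda: {"n_fixations": 0, "total_duration_ms": 0.0})
    for onset, duration, x, y in fixations:
        for name, (bx, by, bw, bh) in boxes_at(onset).items():
            if bx <= x <= bx + bw and by <= y <= by + bh:
                stats[name]["n_fixations"] += 1
                stats[name]["total_duration_ms"] += duration
    return stats


def write_report(stats, path="object_gaze_report.txt"):
    """Tab-separated output that R, SPSS or Excel can import directly."""
    with open(path, "w", newline="") as fh:
        writer = csv.writer(fh, delimiter="\t")
        writer.writerow(["object", "n_fixations", "cumulative_fixation_duration_ms"])
        for name, s in sorted(stats.items()):
            writer.writerow([name, s["n_fixations"], round(s["total_duration_ms"], 1)])
```

A complete submission would, of course, have to couple such an analysis step with an actual object-marking and tracking component and with the real gaze-file format of the respective eye tracker.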


Detailed information on participation and a description of all necessary steps can be found on the workshop website (see: http://saga.eyemovementresearch.com/challenge/howto-participate-in-the-challenge/). Additionally, you can now find an explanation of the manual annotation procedures which will be used to evaluate the submitted software solutions for the challenge (http://saga.eyemovementresearch.com/challenge/videomaterial/).

To access this page, you must first register for the challenge (see: http://saga.eyemovementresearch.com/challenge/register-for-the-saga-challenge/).


All submissions will be evaluated by an independent jury according to the evaluation criteria (see Workshop Website). Additionally, there is a live session scheduled for the third day in which all selected solutions can be demonstrated to the interested workshop participants. The three best solutions will receive an award.

Prize money:

1st Prize: 1,000 €
2nd Prize: 500 €
3rd Prize: 250 €

We would like to thank our premium sponsor SensoMotoric Instruments (SMI) for the contribution of prize money and test videos recorded with SMI’s mobile eye tracking glasses (www.eyetracking-glasses.com).

We would also like to thank our sponsor Tobii Technologies for supporting the live demo workshop session and for providing test videos recorded with the Tobii Glasses (http://www.tobii.com/en/eye-tracking-research/global/products/hardware/tobii-glasses-eye-tracker/).

Please Note: All challenge participants must register separately at http://saga.eyemovementresearch.com/challenge/register-for-the-saga-challenge/ for access to the challenge material and the video download.


SAGA 2013 Workshop Organising Committee:

Workshop Organisers: Kai Essig, Thies Pfeiffer, Pia Knoeferle, Helge Ritter, Thomas Schack and Werner Schneider, all from Bielefeld University, Germany
Scientific Board: Thomas Schack, Helge Ritter and Werner Schneider
Jury of the Challenge: Kai Essig, Thies Pfeiffer, Pia Knoeferle and Denis Williams (SensoMotoric Instruments, SMI).

Please visit the website periodically for updates (http://saga.eyemovementresearch.com/about-saga/).
For additional questions, please contact: saga@eyemovementresearch.com

We look forward to receiving your submissions and to welcoming you to Bielefeld in October, 2013!

Call for Papers: The 6th Workshop on Eye Gaze in Intelligent Human Machine Interaction

Roman Bednarik posted the following call for papers on the Eye-Movement mailing list:

Call for Papers:

The 6th Workshop on Eye Gaze in Intelligent Human Machine Interaction
at ACM ICMI 2013, Sydney, Australia

December 13, 2013
Papers deadline: August 31, 2013
www: http://cs.uef.fi/gazein2013

Invited speaker: Julien Epps – University of New South Wales, Sydney, Australia

Eye gaze is one of the most important aspects in understanding and modeling human-human communication, and it has great potential also in improving human-machine and robot interaction. In human face-to-face communication, eye gaze plays an important role in floor and turn management, grounding, and engagement in conversation. In human-computer interaction research, social gaze, gaze directed at an interaction partner, has been a subject of increased attention.

This is the sixth workshop on Eye Gaze in Intelligent Human Machine Interaction, and in the past we have discussed a wide range of issues relevant to eye gaze in multimodal interaction: technologies for sensing human attentional behaviors, the role of attentional behaviors such as social gaze in human-human and human-machine/robot interaction, attentional behaviors in problem solving and task performance, gaze-based intelligent user interfaces, and evaluation of gaze-based UIs. In addition to these topics, this workshop will focus on eye gaze in multimodal communication, interpretation and generation. Since eye gaze is one of the primary communication modalities, gaze information can be combined with other modalities to complement the meaning of utterances or to serve as a stand-alone communication signal.

GazeIn’13 aims to continue in these lines and explore the growing area of gaze in intelligent interaction research by bringing together researchers from domains of human sensing, multimodal processing, humanoid interfaces, intelligent user interfaces, and communication science. We will exchange ideas to develop and improve methodologies for this research area with the long-term goal of establishing a strong interdisciplinary research community in “attention aware interactive systems”.

This workshop solicits papers that address topics including, but not limited to, the following:

• Technologies and methods for sensing and interpretation of gaze and human attentional behaviors
• Eye gaze in multimodal generation and behavior production in conversational humanoids
• Empirical studies of attentional behaviors
• New directions for gaze in Multimodal interaction
• Evaluation and design issues for using eye gaze in multimodal interfaces

Please see the online CfP for a full list of topics (http://cs.uef.fi/gazein2013/call-for-papers)

SUBMISSION INFORMATION
There are two categories of paper submissions.
Long paper: The maximum length is 6 pages.
Short paper: The maximum length is 3 pages.

At least three members of the program committee will review each submission. The accepted papers will be published in the workshop proceedings. Best papers will be selected for inclusion in a special issue of a journal. Submitted papers should conform to the ACM publication format. For templates and examples, follow this link: http://www.acm.org/sigs/pubs/proceed/template.html

Please submit your papers using https://precisionconference.com/~icmi13j

IMPORTANT DATES
Paper submission due: August 31, 2013
Notification of acceptance: September 20, 2013
Camera-ready due: October 10, 2013
Workshop date: December 13, 2013

ORGANIZERS
Roman Bednarik – University of Eastern Finland, Finland
Hung-Hsuan Huang – Ritsumeikan University, Japan
Kristiina Jokinen – University of Helsinki, Finland
Yukiko Nakano – Seikei University, Japan


————————————————————————
Roman Bednarik http://cs.uef.fi/~rbednari
School of Computing, University of Eastern Finland
————————————————————————

Call for Papers: ETRA 2014

The call for ETRA 2014 is out:

EYE TRACKING RESEARCH & APPLICATIONS SYMPOSIUM – ETRA 2014

http://www.etra2014.org

March 26th – 28th, 2014, Safety Harbor Resort & Spa, Safety Harbor, FL,
USA

1ST CALL FOR PAPERS

The eighth ACM Symposium on Eye Tracking Research & Applications (ETRA
2014) will be held in Safety Harbor, Florida, on March 26th-28th,
2014. The ETRA conference series focuses on eye movement research and
applications across a wide range of disciplines. The symposium
presents research that advances the state-of-the-art in these areas,
leading to new capabilities in gaze tracking systems, gaze aware
applications, gaze based interaction, eye movement data analysis, etc.
For ETRA 2014, we invite papers in all areas of eye tracking research
and applications.

IMPORTANT DATES

20 Sept 2013: Paper abstracts due
4 Oct 2013: Full & short papers due
8 Nov 2013: Paper acceptance
29 Nov 2013: Paper revisions due
20 Dec 2013: Final paper acceptance due
6 Jan 2014: Doctoral Symposium submission due; Video & Demo submission due
24 Jan 2014: Doctoral Symposium, video & demo acceptance due
31 Jan 2014: Camera ready papers due

RESEARCH AREAS OF INTEREST

*Eye Tracking Technology* Advances in eye tracking hardware, software
and algorithms such as: 2D and 3D eye tracking systems, calibration,
low cost eye tracking, natural light eye tracking, predictive models,
etc.

*Eye Tracking Data Analysis* Methods, procedures and analysis tools
for processing raw gaze data as well as fixations and gaze patterns.
Example topics are: scan path analysis, fixation detection algorithms,
and visualization techniques of gaze data.

*Visual Attention and Eye Movement Control* Applied and experimental
studies investigating visual attention and eye movements to gain
insight in eye movement control, cognition and attention, or for
design evaluation of visual stimuli. Examples are: usability and web
studies using eye tracking, and eye movement behavior in everyday
activities such as driving and reading.

*Eye Tracking Applications* Eye tracking as a human-computer input
method, either as a replacement to traditional input methods or as a
complement. Examples are: assistive technologies, gaze enhanced
interaction and interfaces, multimodal interaction, gaze in augmented
and mixed reality systems, gaze-contingent displays and gaze-based
biometric applications.

SUBMISSION CATEGORIES

*Research papers:* Authors are invited to submit original work in the
formats of Full paper (8 pages) and Short paper (4 pages). The papers
will undergo a rigorous review process assessing the originality and
quality of the work as well as the relevance for eye tracking research
and applications. Papers presented at ETRA 2014 will be available in
the ACM digital library. Submission formats and instructions are
available at the conference web site.

IMPORTANT NOTE: The submission process is different from past years,
in that there is a single deadline for both long and short papers.
Given the outcome of the first review process, papers may be invited
to resubmit for an additional review. In some cases, authors of full
papers may be offered the chance to resubmit their work as a short paper.

*Doctoral Symposium (NEW to ETRA 2014):* ETRA 2014 is proud to
introduce the ETRA Doctoral Symposium, where graduate students get an
opportunity to meet other students and experienced researchers to get
feedback on their research in a friendly environment. We invite
doctoral students, who have a defined topic in the area of eye
tracking research and applications, but whose work is still in a phase
where it can be influenced by the feedback received in the symposium.
Participants will be selected based on a 2-page extended abstract
describing the thesis work and its current status.

*Demo/video track (NEW to ETRA 2014):* Have a gaze interaction
technique to share? Or want to show off how your new eye tracking
method works? At ETRA 2014, we are adding a demo/video session where
researchers give demonstrations of their research or show videos of
their work. To take part in this session, we request that a 2-page
extended abstract be submitted. If authors have a full or short paper
accepted, no extended abstract is needed. If submitting for a video
presentation, the video is required.

CONFERENCE VENUE

ETRA 2014 will be held at the historic Safety Harbor Resort and Spa in
Safety Harbor, Florida, a resort hotel sitting on top of three natural
springs facing Tampa Bay, located on the beautiful Florida Gulf Coast.

SPONSORSHIP

ETRA 2014 is co-sponsored by the ACM Special Interest Group in
Computer-Human Interaction (SIGCHI), and the ACM Special Interest
Group on Computer Graphics and Interactive Techniques (SIGGRAPH).

CONFERENCE CO-CHAIRS
Dan Witzner Hansen, IT University, Denmark
Pernilla Qvarfordt, FX Palo Alto Laboratory, Inc.

PROGRAM CO-CHAIRS
Joe Goldberg, Oracle, Inc.
Jeffrey B. Mulligan, NASA, USA

DOCTORAL SYMPOSIUM CO-CHAIRS
Päivi Majaranta, University of Tampere, Finland
Jeff B. Pelz, University of Rochester, USA

DEMO/VIDEO CHAIR
Oleg Komogortsev, Texas State University, USA

Call for Contributions: Challenge on Automatic Object Identification (AOI) and Tracking

1. Call for Challenge on Automatic Object Identification (AOI) and Tracking

as part of the 

SAGA 2013:
1st INTERNATIONAL WORKSHOP ON SOLUTIONS FOR AUTOMATIC GAZE DATA ANALYSIS
 - uniting academics and industry.

24-26 October 2013 Bielefeld University, Germany
Cognitive Interaction Technology Center of Excellence 

Workshop Website:
http://saga.eyemovementresearch.com/challenge/

===========================================================================

Important Dates:

August 15th, 2013:   Deadline for 2-page abstract sketching your
                     approach.
September 2nd, 2013: Notification of acceptance for challenge.
October 2nd, 2013:   Submission of the final abstracts and final 
                     results.

October 24-26, 2013: Challenge results presentation takes place at the 
                     SAGA 2013 Workshop at Bielefeld University, 
                     Germany.

===========================================================================

We are very pleased to publish this call for challenge contributions
as part of the SAGA 2013 1st International Workshop on Solutions for
Automatic Gaze Data Analysis. The challenge will focus on software
solutions for automatic object recognition as a trailblazer for
vision-based object and person tracking algorithms. Automatic object
or person recognition and tracking in video sequences (in real time)
is a key requirement for many application fields, such as mobile
service robotics, Human-Robot Interaction (HRI), Computer Vision,
Digital Image Processing, autonomous assistance and surveillance
systems (e.g., driver assistance systems) and Eye Tracking.
Applications range from tracking objects (e.g., manipulating or
recognizing objects in dynamic scenes) and body parts (e.g., head or
hand tracking for facial expression and gesture classification) to
persons (e.g., person re-identification or visual following).

Although many efficient tracking methods have been introduced for
different tasks in recent years, they are mostly restricted to
particular environmental settings and therefore cannot be applied to
general application fields. This is due to a range of factors:
1.) Often, underlying assumptions about the environment cannot be met,
including a static background, constant lighting and homogeneous or
invariant appearances. These idealized conditions are usually not met
in highly dynamic environments, as they are common, for example, in
mobile scenarios. 2.) Object models cannot be applied because of the
high variance in the appearance of tracked persons or objects.
3.) Most algorithms are computationally quite expensive (large systems
often impose hard computational constraints on the algorithms used).

===========================================================================

Details on the SAGA 2013 
CHALLENGE on Automatic Object Identification (AOI) and Tracking:

In order to drive research on software solutions for the automatic
annotation of videos we offer a special challenge on this topic.
The purpose of the challenge is to encourage the community to work on a
set of specific software solutions and research questions and to
continuously improve on earlier results obtained for these problems over
the years. This will hopefully not only push the field as a whole and
increase the impact of work published in it, but also contribute open
source hardware, methods and data analysis software back to the
community. 

For the challenge we address this topic on the basis of eye-tracking
data. Therefore, we are providing a set of test videos (duration 2-3
minutes) and separate text files with the corresponding gaze data on
the workshop website for which solutions should be written. These gaze
videos, recorded by a scene camera attached to an eye-tracking system,
show people as they look at objects or interact with them in mobile
applications. The gaze data contains a time-stamped list of x- and y-
positions of the gaze points (in the coordinate system of the scene
video). For selected videos, frame counter information will also be
available to assist with synchronization of the video and the gaze
data.

For the challenge we are looking for semi- and fully-automatic
software solutions for the recognition and tracking of objects over
the whole video sequence. The software should provide the coordinates
of the tracked objects and use this information to automatically
calculate object-specific gaze data, such as the number of fixations
and cumulative fixation durations, by using the time-stamped list of
2D gaze coordinates in the eye-tracking file. There are no
restrictions on the way in which the relevant objects are marked or on
the techniques used to track the objects. The only constraint is that
your software solution must be able to read and process the provided
videos and report gaze-specific data for the selected objects, either
as a text file (which can serve as input for a statistical program
such as SPSS, Matlab, R or MS Excel) or as some kind of visualization.
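
As a rough illustration of what a semi-automatic solution could look like (this is not the prescribed approach and is not based on the official challenge file formats), the Python/OpenCV sketch below lets the analyst mark one object in the first frame, tracks it through the scene video with a CSRT tracker, and accumulates gaze samples that fall inside the tracked box into a dwell-time estimate. The whitespace-separated gaze-file layout, the assumption that gaze timestamps are relative to the start of the scene video, and the particular tracker are illustrative choices only; the challenge explicitly leaves the marking and tracking techniques open.

```python
"""Illustrative sketch only: semi-automatic single-object tracking with a gaze hit test.

Assumptions (not taken from the challenge specification):
  * the scene video is an ordinary file OpenCV can decode
  * the gaze file is whitespace-separated with columns <timestamp_ms> <x_px> <y_px>,
    with timestamps relative to the start of the scene video
Requires opencv-contrib-python for the CSRT tracker.
"""
import cv2


def load_gaze(path):
    samples = []
    with open(path) as fh:
        for line in fh:
            parts = line.split()
            if len(parts) >= 3:
                try:
                    samples.append((float(parts[0]), float(parts[1]), float(parts[2])))
                except ValueError:
                    continue  # skip header or malformed lines
    return samples


def make_tracker():
    # The factory name differs between OpenCV builds; try both common variants.
    if hasattr(cv2, "TrackerCSRT_create"):
        return cv2.TrackerCSRT_create()
    return cv2.legacy.TrackerCSRT_create()


def track_and_measure(video_path, gaze_path):
    gaze = load_gaze(gaze_path)
    if not gaze:
        raise RuntimeError("no gaze samples found")
    # Rough per-sample duration estimated from the recording itself.
    sample_ms = (gaze[-1][0] - gaze[0][0]) / max(1, len(gaze) - 1)

    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0
    frame_ms = 1000.0 / fps
    ok, frame = cap.read()
    if not ok:
        raise RuntimeError("could not read the scene video")

    # Semi-automatic step: the analyst marks the object of interest once.
    box = cv2.selectROI("mark object", frame, showCrosshair=True)
    cv2.destroyWindow("mark object")
    tracker = make_tracker()
    tracker.init(frame, box)

    samples_on_object, frame_idx = 0, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame_idx += 1
        t_ms = frame_idx * frame_ms
        tracked, (bx, by, bw, bh) = tracker.update(frame)
        if not tracked:
            continue  # tracking lost; a real solution would try to re-detect
        # Gaze samples falling into this frame's time slot (linear scan kept
        # deliberately simple for readability, not efficiency).
        for ts, gx, gy in gaze:
            if t_ms - frame_ms <= ts < t_ms and bx <= gx <= bx + bw and by <= gy <= by + bh:
                samples_on_object += 1
    cap.release()
    return samples_on_object, samples_on_object * sample_ms
```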

All submissions will be evaluated by an independent jury according to
the evaluation criteria (see below). Additionally, there is a live
session scheduled for the third day in which all selected solutions
can be demonstrated to the interested workshop participants. The three
best solutions will receive an award.

Prize money:

1st Prize: 1,000 €
2nd Prize:   500 €
3rd Prize:   250 €

We would like to thank our premium sponsor SensoMotoric Instruments
(SMI) for the contribution of the prize money.

The SAGA challenge features test videos recorded with different devices
from
- SensoMotoric Instruments (SMI) [SMI EyeTracking Glasses]
- Tobii Technologies [Tobii Glasses]
- Applied Science Laboratories (ASL)
  / Engineering Systems Technologies (EST) [ASL Mobile Eye-XG]

===========================================================================

Submissions:

In order to allow more time for the implementation process for the
challenge, a two-step submission procedure has been devised. The
decision on acceptance to the challenge will be based on a preliminary
abstract. The final evaluation and ranking of the software solutions
will be based on the final abstract and the final results for a test
set of videos, including videos similar to those on the website:

a) Preliminary submissions should consist of a 2-page abstract
describing the implementation details of your proposed software
solution, including the following:

- description of the underlying techniques and implementations
- description of object selection and tracking processes

b) Final submissions shall extend the preliminary submission to a
3-page paper by adding the following details:

- number of fixations and cumulative fixation duration details for the
  specified objects
- performance data (such as computation time, number of selected
  objects, parallel tracking of several objects in the scene)
- snapshot of the results

We will use results based on manual annotation to evaluate the
submitted results; a rough sketch of one possible region-based
comparison is given after the list of criteria below. The following
evaluation criteria will be applied:

- quality of the automated benchmark results (region and pixel based)
  compared to the results given by manual annotation
- conceptual innovation
- performance (such as computation time, number of selected objects,
  parallel tracking of several objects in the scene)
- robustness (such as tracking performance, general scope of
  the application)
- usability
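
For illustration only (the authoritative criteria are those published on the workshop website), a common way to quantify region-based agreement between an automatically produced bounding box and a manually annotated one is the intersection over union (IoU) of the two boxes, averaged over all annotated frames. The short sketch below assumes boxes given as (x, y, w, h) tuples keyed by frame number; a pixel-based variant would compare binary object masks in the same way.

```python
def iou(box_a, box_b):
    """Intersection over union of two (x, y, w, h) bounding boxes."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    iw = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))
    ih = max(0.0, min(ay + ah, by + bh) - max(ay, by))
    inter = iw * ih
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0


def mean_region_agreement(automatic, manual):
    """Average IoU over frames annotated in both result sets.

    `automatic` and `manual` are dicts mapping frame numbers to boxes.
    """
    shared = sorted(set(automatic) & set(manual))
    if not shared:
        return 0.0
    return sum(iou(automatic[f], manual[f]) for f in shared) / len(shared)
```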

The test videos and a corresponding description of them can be found on
the workshop website. Additionally, you can find a detailed description
of how we perform the manual annotation. The exact description of the
challenge, including the evaluation criteria and the required format for
the results, will appear on the workshop website within the next 3
weeks. Please check the website regularly for updates.

Abstracts will be peer-reviewed by at least two members of an
international program committee. We will provide templates on the
workshop website. We are currently pursuing possible options for
publication of a special issue in a journal or as an edited volume.

Please Note: All challenge participants must register separately for
access to the challenge material and the video download.

===========================================================================

We would like to thank our commercial sponsors:

Premium Sponsors
- SensoMotoric Instruments (SMI) [challenge]
  / SMI Eye Tracking Glasses (www.eyetracking-glasses.com)

Sponsors
- Tobii Technologies [live demo workshop session]
  / Tobii Glasses (http://www.tobii.com/en/eye-tracking-research/global/products/hardware/tobii-glasses-eye-tracker/)

===========================================================================

Challenge Organising Committee:

Workshop Organisers:
- Kai Essig
- Thies Pfeiffer
- Pia Knoeferle
- Helge Ritter
- Thomas Schack
- Werner Schneider

All from the
Cognitive Interaction Technology Center of Excellence
at Bielefeld University

Scientific Board:
- Thomas Schack
- Helge Ritter
- Werner Schneider

Jury of the Challenge:
- Kai Essig
- Thies Pfeiffer
- Pia Knoeferle
- Denis Williams (SensoMotoric Instruments, SMI)

Please visit the website periodically for updates:
http://saga.eyemovementresearch.com/about-saga/

For additional questions, please contact: saga@eyemovementresearch.com

We look forward to receiving your submissions and to welcoming you to
Bielefeld in October, 2013!

On behalf of the organisers

Thies Pfeiffer

Call for Papers: 1st International Workshop on Solutions for Automatic Gaze Data Analysis (CITEC/Bielefeld University)

1. Call for papers and challenge contributions:

SAGA 2013:
1st INTERNATIONAL WORKSHOP ON SOLUTIONS FOR AUTOMATIC GAZE DATA ANALYSIS
– uniting academics and industry.

24-26 October 2013 Bielefeld University, Germany
Cognitive Interaction Technology Center of Excellence
Workshop Website:
http://saga.eyemovementresearch.com/

The SAGA 2013 workshop is accepting abstracts for two calls: Challenge
Contributions, and Oral Presentations or Posters. We are currently
pursuing possible options for publication of a special issue in a
journal or as an edited volume.

=================================================================

Important Dates:

1. Oral presentation / poster call:

August 15, 2013: Deadline for abstract submissions.
September 2, 2013: Notification of acceptance for talks and posters.

2. Challenge:

August 15, 2013: Deadline for 2-page abstract sketching your approach.
September 2, 2013: Notification of acceptance for challenge.
October 2, 2013: Submission of the final abstracts and final results.

October 24-26, 2013: Workshop takes place at Bielefeld University,
Germany.

=================================================================

Invited keynote speakers from academia and industry:

– Marc Pomplun, UMASS Boston, United States of America
– Ben Tatler, University of Dundee, Scotland
– Michael Schiessl, Managing Director of EyeSquare in Berlin, Germany
– Andreas Enslin, Head of Miele Design Centre in Gütersloh, Germany
– Ellen Schack, v. Bodelschwinghian Foundations of Bethel in Bielefeld,
Germany

=================================================================

We are very pleased to publish this call for challenge contributions and abstracts for SAGA 2013, the 1st International Workshop on Automatic Annotation of Gaze Videos. SAGA 2013 will focus on solutions for the automatic annotation of gaze videos as a trailblazer for mobile eye-based interaction and eye-based context-awareness.

We are providing a forum for researchers from human-computer interaction, context-aware computing, robotics, computer vision and image processing, psychology, sport science, eye tracking and industry to discuss techniques and applications that go beyond classical eye tracking and stationary eye-based interaction.

We want to stimulate and explore the creativity of these communities with respect to the implications, key research challenges, new techniques and application areas. The long-term goal is to create a strong interdisciplinary research community linking these fields together and to establish the workshop as the premier forum for research on automatic annotation of gaze videos.

=================================================================

SAGA 2013 CHALLENGE:

In order to drive research on software solutions for the automatic annotation of gaze videos we offer a special challenge on this topic. The purpose of the challenge is to encourage the community to work on a set of specific software solutions and research questions and to continuously improve on earlier results obtained for these problems over the years. This will hopefully not only push the field as a whole and increase the impact of work published in it, but also contribute open source hardware, methods and gaze data analysis software back to the community. We are providing a set of test videos on the workshop website for which solutions should be written. All submissions will be evaluated by an independent jury according to the evaluation criteria (see below). Additionally, there is a live session scheduled for the third day in which all selected solutions can be demonstrated to the interested workshop participants. The three best solutions will receive an award.

Prize money:

1st Prize: 1,000 €
2nd Prize: 500 €
3rd Prize: 250 €

We would like to thank our premium sponsor SensoMotoric Instruments (SMI) for the contribution of the prize money.

The SAGA challenge features test videos recorded with different devices from

  • SensoMotoric Instruments (SMI) [SMI EyeTracking Glasses]
  • Tobii Technologies [Tobii Glasses]
  • Applied Science Laboratories (ASL) / Engineering Systems Technologies (EST) [ASL Mobile Eye-XG]

=================================================================

Submissions:

Abstracts will be peer-reviewed by at least two members of an international program committee. We will provide templates on the workshop website.

########################################
# 1. Oral presentation / poster call:
########################################

We are calling for 500-word abstracts on topics related to real-world eye tracking and eye movement analyses. Possible topics include, but are not limited to, eye tracking in human-machine interaction, visual search, language processing, eye-hand coordination, marketing, automatized tasks, and decision making.

Please note: All accepted contributions must register for the workshop.

################
# 2. Challenge:
################

We will provide test videos (duration 2-3 minutes) and separate text files with the corresponding gaze data on the workshop website. The gaze data consists of a time-stamped list of (x, y) gaze coordinates in the scene video. For selected videos, frame counter information will also be available to assist with synchronization of the video and the gaze data. For the challenge we are looking for semi- and fully-automatic software solutions for the recognition and tracking of objects over the whole video sequence. The software should provide the coordinates of the tracked objects and use this information to automatically calculate object-specific gaze data, such as the number of fixations and cumulative fixation durations. There are no restrictions on the way in which the relevant objects are marked or on the techniques used to track the objects. The only constraint is that your software solution must be able to read and process the provided videos and report gaze-specific data for the selected objects, either as a text file (which can serve as input for a statistical program such as SPSS, Matlab, R or MS Excel) or as some kind of visualization.

In order to allow more time for the implementation process for the challenge, a two-step submission procedure has been devised. The decision on acceptance to the challenge will be based on a preliminary abstract. The final evaluation and ranking of the software solutions will be based on the final abstract and the final results for a test set of videos, including videos similar to those on the website:

a) Preliminary submissions should consist of a 2-page abstract describing the implementation details of your proposed software solution, including the following:

  • description of the underlying techniques and implementations
  • description of object selection and tracking processes

b) Final submissions shall extend the preliminary submission to a 3-page paper by adding the following details:

  • number of fixations and cumulative fixation duration details for the specified objects
  • performance data (such as computation time, number of selected objects, parallel tracking of several objects in the scene)
  • snapshot of the results

We will use results based on manual annotation to evaluate the submitted results. The following evaluation criteria will be applied:

  • quality of the automated benchmark results (region and pixel based) compared to the results given by manual annotation
  • conceptual innovation
  • performance (such as computation time, number of selected objects, parallel tracking of several objects in the scene)
  • robustness (such as tracking performance, general scope of the application)
  • usability

The test videos and a corresponding description of them can be found on the workshop website. Additionally, you can find a detailed description of how we perform the manual annotation. The exact description of the challenge, including the evaluation criteria and the required format for the results, will appear on the workshop website within the next 3 weeks. Please check the website regularly for updates.

Please Note: All challenge participants must register separately for access to the challenge material and the video download.

=================================================================

Location:

The SAGA 2013 workshop will be held at the new CITEC Research Building Interactive Intelligent Systems, which is located close to the main building of Bielefeld University. Construction of the research building 'Interactive Intelligent Systems' started in January 2011 and is due for completion in summer 2013. By the end of the year, the research building will host 17 research groups from various disciplines such as informatics, engineering, linguistics, psychology, and sports science. It will be complemented by a conference center, built to accommodate up to 200 participants, which has been planned as an internationally visible hallmark of this highly profiled research site (taken from: https://www.cit-ec.de/FBIIS).

Bielefeld University was founded in 1969 with an explicit research assignment and a mission to provide high-quality research-oriented teaching. Today it encompasses 13 faculties covering a broad spectrum of disciplines in the humanities, natural sciences, social sciences, and technology. With about 18,500 students in 80 degree courses and 2,600 staff (including approx. 1,480 academic staff), it is one of Germany’s best known medium-sized universities.

Bielefeld, the centre for science, is the economic and cultural capital of the East Westphalia economic area. The city of Bielefeld is one of the twenty largest cities in Germany, with a population of 325,000. This lively university city on the edge of the Teutoburg forest is the region’s cultural and intellectual hub. East Westphalia-Lippe is
Germany’s fifth-largest economic area and the region is home to two million people (taken from: http://www.campus-bielefeld.de/en/city-of-bielefeld/)

=================================================================

We would like to thank our commercial sponsors:

Premium Sponsors
- SensoMotoric Instruments (SMI)

Sponsors
- Tobii Technologies

=================================================================

SAGA 2013 Workshop Organising Committee:

Workshop Organisers:

  • Kai Essig
  • Thies Pfeiffer
  • Pia Knoeferle
  • Helge Ritter
  • Thomas Schack
  • Werner Schneider

All from the Cognitive Interaction Technology Center of Excellence at Bielefeld University

Scientific Board:

  • Thomas Schack
  • Helge Ritter
  • Werner Schneider

Jury of the Challenge:

  • Kai Essig
  • Thies Pfeiffer
  • Pia Knoeferle
  • Denis Williams (SensoMotoric Instruments, SMI)

Please visit the website periodically for updates:
http://saga.eyemovementresearch.com/about-saga/

For additional questions, please contact: saga@eyemovementresearch.com

We look forward to receiving your submissions and to welcoming you to
Bielefeld in October, 2013!

On behalf of the workshop organisers
Thies Pfeiffer

Call for Papers: Eye Tracking South Africa 2013

Tanya Beelders posted the following call for papers:

Eye Tracking South Africa, an international conference aimed 
specifically at eye tracking research, will be launched this year 
in Cape Town, South Africa. The conference will take place 
from 29-31 August 2013. Interact 2013 will follow directly after 
ETSA and will also be held in Cape Town. This gives delegates the 
opportunity to attend two international conferences in a single trip.

Call for papers

Authors are invited to submit papers that relate to the theme 
“Eyes on the Mountain”, although this should not be seen as a 
restrictive requirement. Conference tracks will include, but are 
not limited to, the following topics:

    Usability
    Visualisation
    Gaze interaction
    Reading research
    Eye Control for people with disabilities
    Visual attention
    Systems, tools and methods
    Eye movements
    Technical aspects of eye-tracking e.g. pupil detection, 
        calibration, mapping, event detection, data quality, etc.

Deadline for full papers: 30 April 2013.

Deadline for short papers and posters (abstracts): 15 June 2013

Researchers or industry are also invited to present a symposium or 
workshop at the conference. A two page abstract must be submitted 
by 15 June 2013.

Industry sponsors will be given the opportunity to present their 
products. The deadline for industry workshops and demonstrations 
is 30 June 2013.

Exhibition space is available for the duration of the conference.

Please visit http://www.eyetrackingsa.com for more information.

ETSA Organising Committee

Tanya Beelders

Pieter Blignaut

University of the Free State

Department of Computer Science and Informatics

ECVP 2013: Call for Abstracts

Dr. Wegener just posted the call for abstracts for ECVP 2013:

European Conference on Visual Perception (ECVP) 2013: 1st Call for 
Abstracts

The 36th European Conference on Visual Perception (ECVP) will take place 
in Bremen, Germany, from August 25th to August 29th 2013.

Herewith we call for your contributions to ECVP 2013. We invite you to 
submit an abstract about your recent work on visual perception and 
related topics, to be presented at the conference either as a talk or 
as a poster.
All abstracts will be reviewed. Notification of acceptance will be sent 
by June.
The deadline for abstract submission is March 24, 2013.

This year's ECVP has a special focus on Computational Neuroscience.
We encourage submissions on work at the interface between Visual Perception 
and Computational Neuroscience, regarding techniques, methods, concepts, 
and models.

Please note that for submitting an abstract, you have to register for 
the main conference first (http://www.ecvp.uni-bremen.de/node/15).
After registering, you will receive a preliminary confirmation, and a 
link to the abstract submission system.
Registration opens on January 21, 2013 (next Monday).

Please note that there are two modifications this year regarding 
abstract submission:

 1. If you apply for an oral presentation (talk), you can optionally
    include a one-page (maximum) PDF or RTF extended summary with
    additional information about your contribution.
 2. You are required to choose at least one topic and one method
    keyword, so that all abstracts can be assigned to appropriate
    reviewers and program sessions. The list of available keywords can
    be found on the website (abstract guidelines).


On Sunday, August 25th, 2013 we offer two additional events you might 
want to attend:

 1. Bernstein-Tutorials: These will take place before the main
    conference and are intended to introduce students, postdocs and
    also experienced scientists to various important topics and
    state-of-the-art methods and techniques in Psychophysics, Data
    Analysis and Computational Neuroscience.
 2. Satellite Symposium at HWK: The satellite symposium "The Art of
    Perception - The Perception of Art" will also be held on the opening
    day of the ECVP 2013.


Registration for satellite events (Bernstein tutorials, Art symposium 
etc.) is subject to space limitations and will be done on a first-come, 
first-served basis.
You have to register for satellite events during the normal registration 
process.

You can find all important dates, fees, guidelines and additional 
information at http://www.ecvp.uni-bremen.de/

Best regards and awaiting many interesting contributions,

ECVP 2013 team,

Udo Ernst | Cathleen Grimsen | Detlef Wegener | Agnes Janssen

--
ECVP 2013 Organizing Committee
Udo Ernst | Cathleen Grimsen | Detlef Wegener | Agnes Janssen
Universitaet Bremen / University of Bremen
Zentrum fuer Kognitionswissenschaften / Center for Cognitive Sciences
Hochschulring 18, 28359 Bremen, Germany
Website: www.ecvp.uni-bremen.de
Facebook: www.facebook.com/EuropeanConferenceOnVisualPerception
Contact:
symp2013@ecvp.uni-bremen.de (for organization and submission of symposia)
exhibition2013@ecvp.uni-bremen.de (for any query regarding the exhibition)
contact2013@ecvp.uni-bremen.de (for any comments, questions or suggestions)

ECVP 2013: Call for Symposia

Dr. Wegener just posted the call for Symposia for ECVP:

ECVP 2013: Call for Symposia 

The 36th European Conference on Visual Perception (ECVP) will take 
place in Bremen, Germany, from August 25th to August 29th 2013.

ECVP features a number of user-organized symposia that provide a
broad but coherent overview of the state of the art on a given
topic. We highly encourage submissions of symposium proposals until 
December 31st 2012. We would particularly like to encourage young
investigators to take the chance of organizing a symposium at this 
particularly pleasant and important conference.

Symposia will have a total length of 2 hours. Individual talks should 
relate to each other and should be suited to provoke discussion, 
but the number and length of individual talks may be freely chosen. 
Symposia should be introduced by the organizer or another selected 
speaker to motivate both the general framework of the topic and the 
symposium's outline to the audience, and ideally should have a 
summary or prepared discussion at the end. Symposia are particularly 
beneficial if they aim at a diversity of ideas and people, and should 
not be restricted to single groups or 'schools'.

To keep registration fees at a reasonable and low level, there is
traditionally no extra financial support for symposia. Speakers are 
expected to register as normal participants to the conference. 

If you submit a proposal, please mail to symp2013@ecvp.uni-bremen.de
and include:

1. Organizer's address with affiliation, email and phone number.
2. The title of the proposed symposium and (not more than) one page
   that should clearly state the motivation, the aim and recurrent 
   theme of the symposium. 
3. List of speakers and the topic of each contribution, and an
   indication of whether they have been contacted, have accepted, etc.

All proposals will be reviewed by the program committee. Notification 
of symposium acceptance will be sent by January. You can find 
additional information at http://www.ecvp.uni-bremen.de/

In case of acceptance we will need:

- Summary of symposium (200 words) for use in printed material.
- List of agreed speakers, affiliations, email and mailing
  addresses.
- Temporal structure of the symposium with exact sequence of talks.
  Include time for discussion and questions.
- Abstracts of individual talks (each 200 words)
- Special requests (audio-video, etc.)

Best regards and awaiting many interesting proposals,

ECVP 2013 team,

Udo Ernst | Cathleen Grimsen | Detlef Wegener | Agnes Janssen

Dr. Detlef Wegener
Brain Research Institute
Center for Cognitive Science
University of Bremen
Phone: +49-421-218 63007
Fax: +49-421-218 63012

17th European Conference on Eye Movements (ECEM) 2013 – Call is out

The organizing committee of the ECEM 2013 just posted the following call for abstracts:

17th EUROPEAN CONFERENCE ON EYE MOVEMENTS

11-16 August 2013 Lund, Sweden

We are very pleased to publish this call for abstracts for ECEM 2013,
which promises a number of innovations alongside the continuing high
scientific standards of the world's largest conference on eye movement
research. ECEM 2013 is the 17th European Conference on Eye Movements,
with the original aims of the very first ECEM, 'to exchange information
on current research, equipment and software', at the forefront. In
2013, it is held in Lund, Sweden, organized by the eye-tracking group at
the Lund Humanities Laboratory (http://www.humlab.lu.se/en/) together
with COGAIN (http://www.cogain.org/), the non-profit association for
communication through gaze-based interaction. ECEM 2013 is the first
conference organized under the auspices of the Eye Movement Researchers
Association (EMRA), a recently formed not-for-profit organization
facilitating shared tools for conference organization and management,
shared publications such as the Journal of Eye Movement Research (JEMR),
and shared tools for eye movement research and analysis. We look forward
to welcoming you to Lund in 2013.

The ECEM 2013 Organising Committee

  • Conference: 11th to 16th August 2013

Organising committee

Conference Chairs: Kenneth Holmqvist and Arantxa Villanueva
Conference Organiser: Fiona Mulvey
Scientific Board: Halszka Jarodzka, Ignace Hooge, Rudolf Groner, Ulrich
Ansorge and Päivi Majaranta
Exhibition Chairs: John Paulin Hansen and Richard Andersson
Method Workshop Organisers: Marcus Nyström and Dan Witzner Hansen
Web Masters: Nils Holmberg and Detlev Droege
Proceedings Editors: Roger Johansson and Richard Dewhurst
Registration Managers: Kerstin Gidlöf and Linnéa Larsson
Student Volunteer Managers: Linnéa Larsson, Richard Dewhurst and
Kerstin Gidlöf
Social Program Organisers: Richard Andersson, Jana Holsanova and
Kerstin Gidlöf

Important dates

  • Jan 15th 2013: Deadline for proposals for symposia.
  • Feb 25th 2013: Notification of acceptance for symposia.
  • March 1st 2013: Deadline for 2-page abstracts for talks and 200-word abstracts for posters.
  • April 15th 2013: Notification of acceptance for talks and posters.

Invited speakers

  • Daniel Richardson, Alistair Gale, Susanna Martinez-Conde
  • Kari-Jouko Räihä, Alan Kingstone, Carlos Morimoto
  • Simon Liversedge, Douglas Munoz, Thomas Haslwanter

Location

ECEM 2013 will be held at Lund University, Sweden (for more information,
see http://www.lunduniversity.lu.se). This medieval town was founded
circa 990, and Lund University is the second-oldest university in Sweden.
The Times Higher Education World University Rankings for 2011-2012 place
Lund University in the top 100 in the world. It is placed 45th in the
world for Life Sciences and 80th in the world overall:

“Humour, innovation and a humanist perspective join critical thinking
and a concern for the environment as core values for some 46,000
students and 6,200 staff at Lund. It was founded in
1666 and is now one of the largest Nordic educational and research
institutions.”
– The Times Higher Education Report 2012

Lund is a quaint academic town with around 111,000 inhabitants, and
roughly a third of the population are students or employees at Lund
University. The close proximity to Copenhagen Airport makes this an
ideal venue for international conferences. All conference activities,
including the scientific, exhibition and social programme, will take
place in the historic student union castle, located in Lundagård, at
the heart of campus and the town. For details of accommodation and
special rates for delegates, see
http://ecem2013.eye-movements.org/registration/accommodation.

New to ECEM in 2013

We aim to have four panel discussions during the conference. When
online registration opens, you will be asked to vote for topics. If you
have a topic relevant for a broad spectrum of eye movement researchers
which you would like to see discussed by a panel of experts, please
forward your ideas to ECEM management at
management@ecem2013.eye-movements.org. We will also include
student competitions for best talk, best poster, and best student
software application using eye movements. Prizes for student
contributions include a commercial eye tracker, awarded for the best
new application of eye movement data.

 

Information for Symposia

Each symposium will be comprised of four to six talks of 20 min each
(15 min + 5 min discussion). We strongly recommend that the final slot
be used as a summary discussion. Symposia should relate to the study of
eye movements and eye tracking from a psychological, neurobiological,
clinical, computational or applied perspective. Each symposium should
provide a broad view of one relevant topic and should provoke
discussion, representing alternative theoretical views or alternative
approaches rather than a single school. A symposium submission consists
of a 200-word proposal stating how the individual talks are related and
clarifying the benefit of a joint presentation. Symposium chairs ensure
that each first author submits her/his proposal through the review
system for talks, with a 2-page extended abstract (see the information
on submitting talks below). In the case that one of the suggested
presentations of a symposium fails to pass the review process or is
offered as a poster presentation, the scientific panel may suggest an
alternative presentation from the pool of accepted talks. Submit your
symposium proposal at http://ecem2013.eye-movements.org/submissions
after November 1st, 2012. We will acknowledge receipt of your submission
within 48 hours. Notification of symposium acceptance will be sent by
the end of February.

Information for Talks

Each talk will be 20 min (15 min + 5 min discussion). Talks should
present high-quality, original empirical research, either completed or
work in progress. Any work related to the study of eye movements and eye
tracking from a psychological, neurobiological, clinical, computational
or applied perspective can be submitted as an oral presentation.
Extended abstracts should be 2 pages, including a 200-word abstract;
method, results/preliminary results and conclusions sections; and a
maximum of one figure and one table per extended 2-page abstract. You
can download templates in MS Word, LaTeX, and Open Office formats at
http://ecem2013.eye-movements.org/submissions. All submissions will be
peer reviewed. We cannot guarantee that we will respect the authors'
preferences for oral or poster presentation. Depending on the number of
submissions we receive, some talks may be offered as poster
presentations. Submit your extended abstract at
http://ecem2013.eye-movements.org/submissions after November 1st, 2012.
We will acknowledge receipt of your submission within 48 hours.
Notification of talk acceptance will be sent by April 15th.

Information for Poster Presentations

Posters may present research in progress or in preparation.
Submissions for posters consist of 200-word abstracts, which will be
peer reviewed; a decision on acceptance or rejection will be sent by
April 15. Submit your poster abstract at
http://ecem2013.eye-movements.org/submissions after Nov 1st, 2012.

Review Policy

Please note that we will allow each conference participant to be first
author on one talk only, but each participant can appear as co-author
on as many oral presentations as he/she wishes, and as first author on
as many posters as he/she wishes. First authors should be present during
the conference and present their own work. Co-authors are not required
to be present. The review panel is drawn from across a broad range of
relevant expertise. Author- and reviewer-supplied keywords will
facilitate the review-assignment process. The members of the scientific
panel will oversee all review procedures and make the final decision on
acceptance. All symposium papers will go through the same peer review
process as other talks. Authors of accepted talks are given the option
to submit full papers to the Journal of Eye Movement Research, if they
wish. For details on all these issues, see
http://ecem2013.eye-movements.org/.