Call for Papers: 1st International Workshop on Solutions for Automatic Gaze Data Analysis (CITEC/Bielefeld University)

1. Call for papers and challenge contributions:

SAGA 2013 – uniting academics and industry.

24–26 October 2013, Bielefeld University, Germany
Cognitive Interaction Technology Center of Excellence (CITEC)

Workshop website:

The SAGA 2013 workshop is accepting abstracts for two calls: challenge
contributions, and oral presentations or posters. We are currently
pursuing options for publishing the contributions as a special issue of
a journal or as an edited volume.


Important Dates:

1. Oral presentation / poster call:

August 15, 2013: Deadline for abstract submissions.
September 2, 2013: Notification of acceptance for talks and posters.

2. Challenge:

August 15, 2013: Deadline for the 2-page abstract sketching your approach.
September 2, 2013: Notification of acceptance for the challenge.
October 2, 2013: Submission of the final abstracts and final results.

October 24–26, 2013: Workshop takes place at Bielefeld University.


Invited keynote speakers from academia and industry:

– Marc Pomplun, UMASS Boston, United States of America
– Ben Tatler, University of Dundee, Scotland
– Michael Schiessl, Managing Director of EyeSquare in Berlin, Germany
– Andreas Enslin, Head of Miele Design Centre in Gütersloh, Germany
– Ellen Schack, v. Bodelschwinghian Foundations of Bethel in Bielefeld, Germany


We are very pleased to publish this call for challenge contributions and abstracts for SAGA 2013, the 1st International Workshop on Solutions for Automatic Gaze Data Analysis. SAGA 2013 will focus on automatic analysis solutions for gaze videos as a trailblazer for mobile eye-based interaction and eye-based context-awareness. We are providing a forum for researchers from human-computer interaction, context-aware computing, robotics, computer vision and image processing, psychology, sport science, eye tracking, and industry to discuss techniques and applications that go beyond classical eye tracking and stationary eye-based interaction. We want to stimulate and explore the creativity of these communities with respect to the implications, key research challenges, new techniques, and application areas. The long-term goal is to create a strong interdisciplinary research community linking these fields together and to establish the workshop as the premier forum for research on the automatic annotation of gaze videos.



In order to drive research on software solutions for the automatic annotation of gaze videos, we offer a special challenge on this topic. The purpose of the challenge is to encourage the community to work on a set of specific software solutions and research questions and to continuously improve on earlier results for these problems over the years. This will hopefully not only push the field as a whole and increase the impact of work published in it, but also contribute open-source hardware, methods, and gaze data analysis software back to the community. We are providing a set of test videos on the workshop website for which solutions should be written. All submissions will be evaluated by an independent jury according to the evaluation criteria (see below). Additionally, a live session is scheduled for the third day of the workshop, in which all selected solutions can be demonstrated to interested participants. The three best solutions will receive an award.

Prize money:

1st Prize: 1,000 €
2nd Prize: 500 €
3rd Prize: 250 €

We would like to thank our premium sponsor SensoMotoric Instruments (SMI) for the contribution of the prize money.

The SAGA challenge features test videos recorded with different devices from

  • SensoMotoric Instruments (SMI) [SMI EyeTracking Glasses]
  • Tobii Technologies [Tobii Glasses]
  • Applied Science Laboratories (ASL) / Engineering Systems Technologies (EST) [ASL Mobile Eye-XG]



Abstracts will be peer-reviewed by at least two members of an international program committee. We will provide templates on the workshop website.

1. Oral presentation / poster call:

We are calling for 500-word abstracts on topics related to real-world eye tracking and eye movement analyses. Possible topics include, but are not limited to, eye tracking in human-machine interaction, visual search, language processing, eye-hand coordination, marketing, automatized tasks, and decision making.

Please note: Authors of all accepted contributions must register for the workshop.

2. Challenge:

We will provide test videos (duration 2–3 minutes) and separate text files with the corresponding gaze data on the workshop website. The gaze data consists of a timestamped list of (x,y) gaze coordinates in the scene video. For selected videos, frame counter information will also be available to assist with synchronizing the video and the gaze data. For the challenge we are looking for semi- and fully automatic software solutions for the recognition and tracking of objects over the whole video sequence. The software should provide the coordinates of the tracked objects and use this information to automatically calculate object-specific gaze data, such as the number of fixations and cumulative fixation durations. There are no restrictions on the way in which the relevant objects are marked or on the kind of techniques used to track the objects. The only constraint is that your software solution can read and process the provided videos and report gaze-specific data for the selected objects, either as a text file (which can serve as input for a statistical program such as SPSS, Matlab, R, or MS Excel) or by providing some kind of visualization. In order to allow more time for the implementation process, a two-step submission procedure has been devised for the challenge. The decision on acceptance to the challenge will be based on a preliminary abstract. The final evaluation and ranking of the software solutions will be based on the final abstract and the final results for a test set of videos, including videos similar to those on the website:

a) Preliminary submissions should consist of a 2-page abstract describing the implementation details of your proposed software solution, including the following:

  • description of the underlying techniques and implementations
  • description of object selection and tracking processes

b) Final submissions shall extend the preliminary submission to a 3-page paper by adding the following details:

  • number of fixations and cumulative fixation duration details for the specified objects
  • performance data (such as computation time, number of selected objects, parallel tracking of several objects in the scene)
  • snapshot of the results
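
To make the required gaze statistics concrete, the object-specific measures named above (number of fixations, cumulative fixation duration) can be sketched in a few lines. This is a minimal illustration only, not part of the challenge specification: it assumes the gaze file holds one "timestamp_ms x y" sample per line, uses a simple dispersion-threshold (I-DT style) fixation detector, and represents a tracked object as a single axis-aligned bounding box; real submissions will track per-frame object coordinates and tune their own thresholds.

```python
def load_gaze(path):
    """Parse a gaze file into (timestamp_ms, x, y) tuples.

    Assumes whitespace- or comma-separated "timestamp x y" lines;
    malformed lines are skipped.
    """
    samples = []
    with open(path) as f:
        for line in f:
            parts = line.replace(",", " ").split()
            if len(parts) < 3:
                continue
            t, x, y = float(parts[0]), float(parts[1]), float(parts[2])
            samples.append((t, x, y))
    return samples

def detect_fixations(samples, max_dispersion=30.0, min_duration=100.0):
    """I-DT-style detection: group consecutive samples whose spatial
    dispersion (x-range + y-range, in pixels) stays below max_dispersion
    for at least min_duration milliseconds.

    Returns (start_ms, duration_ms, centroid_x, centroid_y) per fixation.
    """
    fixations = []
    i = 0
    while i < len(samples):
        j = i
        xs, ys = [], []
        while j < len(samples):
            xs.append(samples[j][1])
            ys.append(samples[j][2])
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                xs.pop()  # drop the sample that broke the window
                ys.pop()
                break
            j += 1
        duration = samples[j - 1][0] - samples[i][0] if j - 1 > i else 0.0
        if j - i >= 2 and duration >= min_duration:
            cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
            fixations.append((samples[i][0], duration, cx, cy))
            i = j  # continue after the fixation window
        else:
            i += 1  # no fixation here; slide the window start
    return fixations

def object_stats(fixations, box):
    """Number of fixations and cumulative fixation duration whose
    centroid falls inside box = (x_min, y_min, x_max, y_max)."""
    hits = [f for f in fixations
            if box[0] <= f[2] <= box[2] and box[1] <= f[3] <= box[3]]
    return len(hits), sum(f[1] for f in hits)
```

A solution built along these lines would emit `object_stats` results per object as a text file, which can then be loaded into SPSS, Matlab, R, or MS Excel as described above.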

We will use results based on manual annotation to evaluate the submitted results. The following evaluation criteria will be applied:

  • quality of the automated benchmark results (region and pixel based) compared to the results given by manual annotation
  • conceptual innovation
  • performance (such as computation time, number of selected objects, parallel tracking of several objects in the scene)
  • robustness (such as tracking performance, general scope of the application)
  • usability
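
As an illustration of the region-based comparison against manual annotation, a common agreement measure between an automatically tracked region and a manually annotated one is intersection-over-union. The exact metrics and result format will be specified on the workshop website; the axis-aligned box representation below is purely an assumption for the sketch.

```python
def box_iou(a, b):
    """Intersection-over-union of two boxes (x_min, y_min, x_max, y_max).

    Returns 1.0 for identical boxes, 0.0 for disjoint ones.
    """
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

Averaging such a score over all annotated frames gives one simple per-video quality number; a pixel-based variant would compare binary object masks instead of boxes.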

The test videos and a corresponding description of them can be found on the workshop website. Additionally, you can find a detailed description of how we perform the manual annotation. The exact description for the challenge, including the evaluation criteria and the required format for the results, will appear on the workshop website within the next 3 weeks. Please check the website regularly for updates.

Please Note: All challenge participants must register separately for access to the challenge material and the video download.



The SAGA 2013 workshop will be held at the new CITEC research building ‘Interactive Intelligent Systems’, which is located close to the main building of Bielefeld University. Construction of the research building started in January 2011, and completion is expected in summer 2013. By the end of the year, the building will host 17 research groups from various disciplines such as informatics, engineering, linguistics, psychology, and sports science. It will be complemented by a conference centre built to accommodate up to 200 participants, planned as an internationally visible hallmark of this highly profiled research site.

Bielefeld University was founded in 1969 with an explicit research assignment and a mission to provide high-quality research-oriented teaching. Today it encompasses 13 faculties covering a broad spectrum of disciplines in the humanities, natural sciences, social sciences, and technology. With about 18,500 students in 80 degree courses and 2,600 staff (including approx. 1,480 academic staff), it is one of Germany’s best known medium-sized universities.

Bielefeld, the centre for science, is the economic and cultural capital of the East Westphalia economic area. The city of Bielefeld is one of the twenty largest cities in Germany, with a population of 325,000. This lively university city on the edge of the Teutoburg Forest is the region’s cultural and intellectual hub. East Westphalia-Lippe is Germany’s fifth-largest economic area, and the region is home to two million people.


We would like to thank our commercial sponsors:




SAGA 2013 Workshop Organising Committee:

Workshop Organisers:

  • Kai Essig
  • Thies Pfeiffer
  • Pia Knoeferle
  • Helge Ritter
  • Thomas Schack
  • Werner Schneider

All from the Cognitive Interaction Technology Center of Excellence at Bielefeld University

Scientific Board:

  • Thomas Schack
  • Helge Ritter
  • Werner Schneider

Jury of the Challenge:

  • Kai Essig
  • Thies Pfeiffer
  • Pia Knoeferle
  • Denis Williams (SensoMotoric Instruments, SMI)

Please visit the website periodically for updates:

For additional questions, please contact:

We look forward to receiving your submissions and to welcoming you to
Bielefeld in October, 2013!

On behalf of the workshop organisers
Thies Pfeiffer
