Datasets

The materials of the eye tracking study (the information sheet, consent form, questionnaire and web pages with their visual elements) are provided in this online external repository, together with the individual scanpaths in terms of the visual elements of the web pages and all of our detailed statistical analysis.

Contributor: Sukru Eraslan

Timing data for participants' attention shifts between TV and tablet, as determined from video analysis.

Contributor: Andy Brown

For each participant, one text file indicates when attention shifted onto the tablet (column 1) and away from the tablet (column 2). Times are in seconds from the start of the video.
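Given the two-column layout above, the files can be processed with a few lines of Python. This is a minimal sketch, assuming whitespace-separated onset/offset pairs; the exact delimiter in the actual files may differ.

```python
# Sketch: sum the time a participant spent attending to the tablet.
# Assumes each line holds "onset offset" in seconds from the video start;
# the whitespace-separated layout is an assumption, not confirmed by the dataset.

def total_tablet_time(lines):
    """Return total seconds of attention on the tablet."""
    total = 0.0
    for line in lines:
        if not line.strip():
            continue  # skip blank lines
        onset, offset = map(float, line.split())
        total += offset - onset
    return total

# Example with made-up values:
sample = ["12.4 30.1", "45.0 52.6"]
print(round(total_tablet_time(sample), 3))  # -> 25.3
```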

Contributor: Andy Brown

Java source code for the capture tool server.

The project needs to be exported as a JAR file named "changedUsaProxy.jar". To start it, the following command is recommended:

xterm -e "cd /project/folder; echo DO NOT CLOSE THIS WINDOW; sh startChangedUsaProxy2.0.sh"

This command can be included in the system startup, so the tool restarts whenever the server is restarted.

Contributor: Aitor Apaolaza

This .zip file contains the eye tracking study documents (Information Sheet, Consent Form and Questionnaire), the pages with their visual elements, and the trending scanpaths identified with the first, second and third groups of participants for the browsing and searching tasks. The full detailed sample run is also provided.

Contributor: Sukru Eraslan

This .zip file contains the experiments’ data including the materials (the saved versions of the Apple, Babylon, AVG, Yahoo, Godaddy and BBC web pages, information sheet, consent form and questionnaire) and the individual scanpaths in terms of the visual elements of the web pages. It also includes the average, median, standard deviation, minimum and maximum similarities between the trending scanpaths of the entire group and the 100 random subsets of each group size on each web page for the browsing
...

Contributor: Sukru Eraslan

The materials, task descriptions and eye tracking data are available in this repository.

Contributor: Sukru Eraslan

This repository includes the documents of our eye-tracking study conducted with people with autism and neurotypical people on different web pages.

Contributors: Sukru Eraslan, Victoria Yaneva and Yeliz Yesilada

The Gaze Modelling for Time Series Medical Data (GazeMod) ECG project aims to present computational models based on visual cognition data obtained from medical experts viewing and interpreting electrocardiograms.

Contributor: Alan Davies

The implementation of our proposed approach in the C programming language is provided. All the materials and data used to evaluate the approach can also be found in this repository.

Contributor: Sukru Eraslan

Lists of each participant's interactions with the tablet during the study. Each touch was recorded with its time, its effect (1 -> successful; 2 -> correct action, but without the desired effect (e.g., touch not recognised by the iPad); 3 -> invalid action (e.g., an attempted swipe)), and a note on what the action was.
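The effect codes above can be tallied per participant with a short script. This is a sketch only: the comma-separated layout (time, effect code, note) is an assumption about the file format, not something the dataset description specifies.

```python
# Sketch: count how often each effect code (1-3) occurs in one participant's file.
# The "time,effect,note" CSV layout is a hypothetical assumption.
from collections import Counter

EFFECT_LABELS = {
    1: "successful",
    2: "correct action, no effect",  # e.g. touch not recognised by the iPad
    3: "invalid action",             # e.g. an attempted swipe
}

def tally_effects(lines):
    """Return a Counter mapping effect code -> number of touches."""
    counts = Counter()
    for line in lines:
        time, effect, note = line.split(",", 2)  # note may contain commas
        counts[int(effect)] += 1
    return counts

# Example with made-up records:
records = ["3.2,1,tap play", "5.9,2,tap not recognised", "8.1,1,tap pause"]
print(tally_effects(records))  # Counter({1: 2, 2: 1})
```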

Contributor: Andy Brown

Remotely captured interaction data for the whole experiment.

The interaction data are stored in a single JSON file
(experiment2-userEvents.json), and the data for individual
participants must be separated. The following timestamps give the time
the clip started for each participant:

sessions = {'P01': '2014-03-11,09:18:04:795',\
'P03': '2014-03-11,10:18:09:943',\
'P04': '2014-03-11,10:46:50:195',\
'P05': '2014-03-11,11:10:03:137',\
'P06': '2014-03-11,11:58:49:421',\
'P07': '2014-03-11,12:53:27:058',\
...
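Separating the single JSON file by participant can be done by assigning each event to the most recent session start. The sketch below uses only a subset of the session timestamps listed above; the per-event timestamp field name ("timestamp") and the flat-list JSON layout are assumptions, not documented properties of experiment2-userEvents.json.

```python
# Sketch: assign events to participants by session start time.
# Assumptions: events carry a "timestamp" field in the same
# 'YYYY-MM-DD,HH:MM:SS:mmm' format as the session table, and the JSON
# file holds a flat list of event objects.
from datetime import datetime

FMT = "%Y-%m-%d,%H:%M:%S:%f"  # e.g. '2014-03-11,09:18:04:795' (milliseconds)

# Subset of the session table above, for illustration:
sessions = {
    "P01": "2014-03-11,09:18:04:795",
    "P03": "2014-03-11,10:18:09:943",
}

def split_by_session(events, sessions):
    """Assign each event to the participant whose session started
    most recently before (or at) the event's timestamp."""
    starts = sorted(
        (datetime.strptime(ts, FMT), pid) for pid, ts in sessions.items()
    )
    per_participant = {pid: [] for pid in sessions}
    for event in events:
        when = datetime.strptime(event["timestamp"], FMT)
        current = None
        for start, pid in starts:
            if start <= when:
                current = pid  # keep the latest session that has begun
        if current is not None:
            per_participant[current].append(event)
    return per_participant

events = [{"timestamp": "2014-03-11,09:20:00:000", "type": "click"}]
out = split_by_session(events, sessions)
print(len(out["P01"]))  # -> 1
```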

Contributor: Andy Brown

Contributor: Markel Vigo

The JavaScript that must be added to all web pages in order to send information to the server.
There are two important parameters to set before adding the script to the page:
window.protectedIds: if necessary, the id attributes of sensitive elements on the page can be listed here to prevent the capture of any interaction with them.
window.webpageIndex: identifies the site on which the script is deployed. The tool registers the URL of the captured events, but this index is
...

Contributor: Aitor Apaolaza

Contributor: Markel Vigo

The eye tracking study materials (the information sheet, consent form, questionnaire and web pages with their visual elements) are provided here. This repository also includes the individual scanpaths in terms of the visual elements of the web pages, along with the scanpaths produced by the algorithms and their similarities to the individual scanpaths.

Contributor: Sukru Eraslan

This file contains the interview and comment data collected during the MethodBox study, comparing a 'Web Search' interface (MethodBox) with a 'Classical IR' interface for discovering variable data in the UK Data Archive.

It includes: participants' overall interface preference and the reasons for this; comments made when asked to rate the interface (see 'MethodBox_Quant_Data') and other comments made while completing the tasks.

Contributor: Caroline Jay

This file contains the following data:

Ratings individual participants gave to the 'Web Search' (MethodBox) and 'Classical IR' (ESDS) interfaces to the UK Data Archive (learnability, overall ease of use, ease of use for each task, satisfaction with the interface and confidence in their answers).

Task completion times.

Task correctness scores.

Contributor: Caroline Jay

This .zip archive contains two .xls files with the following data:

PWA-UX: Ratings individual participants gave in the post-interaction questionnaire about their interaction (including UX dimensions, emotional reactions and their perceived web accessibility) with each of the four websites.

AI-UX: Ratings individual participants gave in the post-interaction questionnaire about their interaction with each website, and the accessibility indexes (assessed by different accessibility evaluation methods)
...

Contributor: Amaia Aizpurua

This .zip archive contains two .xls files with the following data:

PWA-UX-RESULTS: Results from the statistical test (Kendall's correlation coefficient) applied to the perceived web accessibility (PWA) and UX ratings that individual participants gave in the post-interaction questionnaire about their interaction (including UX dimensions, emotional reactions and their perceived web accessibility) with each of the four websites.

AI-UX-RESULTS: Results from the statistical tests (Kendall's, Spearman's
...

Contributor: Amaia Aizpurua

The Python implementation of the STA algorithm is provided in this repository. The evaluation data including the materials (the web pages, information sheet, consent form and questionnaire) and the individual scanpaths in terms of the visual elements of the web pages can also be accessed from here. This repository also includes the full run of the algorithm with nine scanpaths on the Apple web page as an example. Furthermore, it provides the detailed data from the experiments conducted to investigate
...

Contributor: Sukru Eraslan

Eye-tracking data from the tablet. The tsv data start from approximately (but not exactly) the start of the TV clip.

Contributor: Andy Brown

The dataset used for the experiments is provided in this online repository. It consists of the saved versions of the web pages used, the information sheet, consent form and questionnaire, and the individual scanpaths in terms of the AOIs of the web pages. The repository also includes all the raw data from the experiments.

Contributor: Sukru Eraslan

A description of the audio tour covering a limited number of paintings in the Lowry.

Contributor: Simon Harper

Export from Tobii Studio of the eye-tracking data from the television eye-tracker. The tsv data start from approximately (but not exactly) the start of the video.

Contributor: Andy Brown

Copyright (c) 2008 - 2014 The University of Manchester and HITS gGmbH