Projects

A project to explore expertise as a function of visual behaviour using eye-tracking technology. It is applied to the medical domain, where we look specifically at how eye-tracking data can be used to better understand how practitioners perform visual search tasks, such as the interpretation of electrocardiograms.

ID: 12

Studies: No Studies

The aim of the WAUX project is to investigate the relationship between web accessibility and user experience.

ID: 9

Studies: No Studies

Getting insight into learnability is a challenging task. Naturalistic observation is required, with fine-grained information that allows for the extraction of all sorts of high-level constructs. Obtrusive set-ups, coarse interaction data, and predefined tasks need to be avoided in order to obtain naturalistic interaction data. We developed a low-level interaction capture tool that allowed long-term collection of data from any Web site. We then analysed the obtained data to look for tendencies that
...

ID: 7

Studies: Behaviour extraction from longitudinal low-level data, Capture solution design

Sighted people have access to large amounts of visual information simultaneously; blind people, however, experience an audio translation in a single, linear stream. Although these two experiences initially appear quite different, they in fact share the concept of the 'locus of attention': a single source or location of sensory input that a person attends to at a given time. If we can determine how sighted people move their attention through visual information, we may be able to predict a sequential
...

ID: 6

Studies: Describing Visual Art

The aim of the Divided Attention project is to build an understanding of how people interact with multiple devices. Our primary use case is people watching TV while viewing or interacting with content on a second device, such as a tablet computer. Understanding how attention is split, and the factors that influence where people allocate their attention, can inform the provision of second-screen content.

ID: 2

Studies: SAGE (Synchronised Attention Grant Enabler)

Copyright (c) 2008 - 2014 The University of Manchester and HITS gGmbH