CPSC 4120/6120 Eye Tracking Methodology and Applications
Fall 2023
- Team 1: Experiment: Inattentional Blindness
- Jordan Payne
- David Barnett
- Proposal [PDF]
- Report [PDF]
- Talk [PPT]
- Notes
- there is a large number of search terms: change blindness,
attentional blindness, inattentional blindness, etc.
- there are many applications as well, e.g., driving simulators,
games, graphics collisions, crowd simulations, advertising,
and so on.
- papers with still image stimuli may be most relevant, but there
are also some desktop / interface type applications worth
exploring
- Lit review
- Lee and Ahn, 2014 (one of many, this one on banner ads)
- Hobson et al., 2012 (flicker paradigm, alcohol craving/consumption)
- Pappas et al., 2005 (gorilla study, eye tracker, claims to be the first)
- Dill and Young, 2015 (flight deck)
- Simons and Levin, 1997 (early paper on change blindness)
- Jensen et al., 2011 (inattentional vs. change blindness)
- Simons et al., 2000 (early paper on change blindness, still images)
- Karacan et al., 2010 (change blindness, desktop virtual environments)
- Simons, 2010 (change blindness, early paper, review)
- Smith and Henderson, 2008 (edit blindness)
- Nikolaev et al., 2011 (change detection, eye fixation-related potentials [EFRP])
- Ramey et al., 2022 (unconscious change detection)
- Zelinsky, 2001 (change detection)
- Simons and Chabris, 1999 (Gorilla study)
- Marwecki et al., 2019 (hiding VR scene changes)
- Memmert, 2006 (age, expertise, inattentional blindness)
- Gelderblom and Menge, 2018 (Gorilla study again, interface design)
- Li et al., 2023 (task load-induced inattentional blindness)
- Hermens and Zdravkovic, 2022 (objects and shadows)
- Team 2: Experiment: Neurodivergent / Neurotypical Gaze
- Liz Chandler
- Victoria Hill
- Proposal [PDF]
- Report [PDF]
- Talk [PPT]
- Notes
- consider simple stimulus, e.g., circular motion, compute error
- programming this in PsychoPy may be doable
- looks like heading towards a perception of faces study
- transition matrices could be useful here
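The transition-matrix idea above can be sketched as follows: given a fixation sequence labeled by area of interest (AOI), count consecutive AOI-to-AOI transitions and row-normalize into probabilities. This is a minimal sketch; the AOI labels (eyes, mouth, background) are hypothetical examples, not part of the actual study design.

```python
import numpy as np

def transition_matrix(aoi_sequence, n_aois):
    """Count transitions between consecutive AOI fixations and
    row-normalize the counts into transition probabilities."""
    counts = np.zeros((n_aois, n_aois))
    for a, b in zip(aoi_sequence, aoi_sequence[1:]):
        counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    # Avoid division by zero for AOIs that were never fixated
    probs = np.divide(counts, row_sums, out=np.zeros_like(counts),
                      where=row_sums > 0)
    return counts, probs

# Hypothetical fixation sequence: 0 = eyes, 1 = mouth, 2 = background
seq = [0, 0, 1, 0, 2, 1, 1, 0]
counts, probs = transition_matrix(seq, 3)
```

Entry (i, j) of `probs` estimates the probability of moving from AOI i to AOI j, which can then be compared across groups.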
- Lit review
- Schrader et al., 2021 (see stimulus and error computation as well as references)
- Artiran et al., 2022 (autism, VR, avatars)
- Wong et al., 2023 (webcam eye tracking, classroom setting)
- Rosqvist et al., 2023 (reading text, autism)
- Parkington, 2021 (faces, autism)
- Team 3: Experiment: Perception of Cookie Consent Pop-Ups
- Yizhou Liu
- Samuel Crooks
- Proposal [PDF]
- Report [PDF]
- Talk [PPT]
- Notes
- do not use live web pages; use mock web pages or images thereof
- Lit review
- Team 4: Experiment: Visual Search while Driving
- Ethan Butler
- Proposal [PDF]
- Report [PDF]
- Talk [PPT]
- Notes
- consider hypothesis, indep. variables, experimental design
- static hazard detection test looks like a really good idea for
an eye-tracking study
- Lit review
- Team 5: Experiment: Image Saliency: Art vs. AI
- Alek Moses
- Ashley Clark
- Proposal [PDF]
- Report [PDF]
- Talk [PPT]
- Notes
- consider hypothesis, indep. variables, experimental design
- the nature of the stimulus (art) may be difficult to control
- maybe think about image saliency as a potential covariate
- studying modified art looks interesting as does specific
art genres (e.g., Mondrian, individual action, etc.)
- Lit review
- Team 6: Experiment: Eye Contact in Video Conferencing
- Christopher Rodriguez
- Thomas Personett
- Proposal [PDF]
- Report [PDF]
- Talk [PPT]
- Notes
- a really compelling study might be of faces that maintain eye
contact vs. those that do not...something along those lines
- Lit review
- Team 7: Experiment: Looking at Art and Titles
- Liam Leahy
- Proposal [PDF]
- Report [PDF]
- Talk [PPT]
- Notes
- artwork with title vs. no title
- can do this between- or within-subjects.
- if within-subjects, must counterbalance which version is seen
first: with title then without, and vice versa
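The counterbalancing described above can be sketched as a simple assignment rule: alternate the titled-first/untitled-first order across both images and participants so each image appears in each order equally often. The function name and condition labels here are hypothetical, just to show the idea.

```python
def counterbalanced_orders(image_ids, participant_idx):
    """Assign the title/no-title viewing order per image so that,
    across participants, each image is seen titled-first and
    untitled-first equally often (alternation by index)."""
    orders = []
    for i, img in enumerate(image_ids):
        # Rotate the order by both image and participant index
        titled_first = (i + participant_idx) % 2 == 0
        first, second = (("title", "no_title") if titled_first
                         else ("no_title", "title"))
        orders.append((img, first, second))
    return orders
```

For example, participant 0 sees image A titled first, while participant 1 sees image A untitled first; a Latin-square design would generalize this to more conditions.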
- Lit review
- Team 8: Experiment: Reading and Distractions
- Timothy Lau
- Tristan Jansen
- Proposal [PDF]
- Report [PDF]
- Talk [PPT]
- Notes
- make text font large and record gaze over words
- vary position as indep. factor, e.g., text on left, text on right
- vary distraction as distracting or not distracting (relative
to task)
- provide questionnaires to gauge comprehension
- using code as the reading task would be great
(see Eye Movements in Programming conference, e.g.,
Bonita Sharif)
- using the Stroop task is a good idea; that alone will likely
produce interesting results, with no need to complicate the
design with other factors
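A Stroop trial list, as suggested above, can be generated with a few lines of Python: balance congruent trials (word matches ink color) against incongruent ones, then shuffle. This is a minimal sketch with a hypothetical function name and color set; presentation itself would be handled in PsychoPy.

```python
import random

def make_stroop_trials(n_trials, seed=0):
    """Generate a balanced, shuffled list of Stroop trials as
    (word, ink_color, is_congruent) tuples."""
    colors = ["red", "green", "blue", "yellow"]
    rng = random.Random(seed)
    trials = []
    for i in range(n_trials):
        word = rng.choice(colors)
        if i % 2 == 0:                       # half congruent
            ink = word
        else:                                # half incongruent
            ink = rng.choice([c for c in colors if c != word])
        trials.append((word, ink, word == ink))
    rng.shuffle(trials)
    return trials
```

The fixed seed makes the trial order reproducible across participants if desired; drop it for a fresh randomization per session.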
- Lit review