089 Vgaze

Development of analysis algorithms and metrics to characterize gaze behavior when performing complex mobility tasks in virtual environments

Intervention area

Mobility

Start date

April 1, 2018

End date

May 1, 2019

We have created a virtual reality (VR)-based paradigm that allows testing the impact of phone messages on the performance of complex locomotor tasks, such as circumventing pedestrians moving in a community environment. Until recently, the technology we used did not allow eye movement recording within the virtual environment, which made the data difficult to interpret. Collecting this type of information during a visually guided task such as locomotion is essential to understanding how the sensory uptake of visual information about the environment (obstacles, end destination) is altered by phone messages (text and audio). In the context of this research mandate, we aim to characterize the 3D kinematic and gaze behavior of healthy young and older participants receiving messages in text versus audio format as they perform complex locomotor tasks in a community environment (e.g., avoiding virtual pedestrians and cars at a busy intersection).
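The abstract does not specify which gaze-analysis algorithms will be developed. As one illustration of the kind of metric involved, a common starting point for characterizing gaze behavior is dispersion-threshold fixation detection (I-DT), which segments a gaze trace into fixations and intervening movements. The sketch below is a minimal, hypothetical implementation, not the project's actual pipeline; the function name, thresholds, and coordinate units (degrees of visual angle) are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    start: int          # index of the first gaze sample in the fixation
    end: int            # index of the last sample (inclusive)
    duration_ms: float  # fixation duration in milliseconds

def detect_fixations(x, y, timestamps_ms,
                     max_dispersion=1.0, min_duration_ms=100.0):
    """Dispersion-threshold (I-DT) fixation detection.

    x, y           -- gaze coordinates (e.g., degrees of visual angle)
    timestamps_ms  -- sample times in milliseconds, monotonically increasing
    max_dispersion -- (max(x)-min(x)) + (max(y)-min(y)) threshold
    min_duration_ms-- minimum duration for a window to count as a fixation
    """
    def dispersion(a, b):
        xs, ys = x[a:b + 1], y[a:b + 1]
        return (max(xs) - min(xs)) + (max(ys) - min(ys))

    fixations = []
    i, n = 0, len(x)
    while i < n:
        # Grow a window until it spans at least the minimum duration.
        j = i
        while j + 1 < n and timestamps_ms[j] - timestamps_ms[i] < min_duration_ms:
            j += 1
        if timestamps_ms[j] - timestamps_ms[i] < min_duration_ms:
            break  # not enough samples left for a valid fixation
        if dispersion(i, j) <= max_dispersion:
            # Extend the window while dispersion stays under threshold.
            while j + 1 < n and dispersion(i, j + 1) <= max_dispersion:
                j += 1
            fixations.append(
                Fixation(i, j, timestamps_ms[j] - timestamps_ms[i]))
            i = j + 1
        else:
            i += 1
    return fixations
```

From the resulting fixations, summary metrics such as fixation count, mean fixation duration, or time fixating a virtual pedestrian versus the end destination could be derived; in a VR setting the same logic would be applied to gaze mapped into the 3D scene rather than to 2D screen coordinates.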

Lead applicant

Anouk Lamontagne

Université McGill

Physical and Occupational Therapy

5 INTER mandates

Team

Eva Kehayia

Université McGill

Physical and Occupational Therapy

3 INTER mandates

Joyce Fung

Université McGill

Physical and Occupational Therapy

4 INTER mandates