Project Details
Description
The Themes Research & Innovation Fund 2018 awarded this Challenge-Led project £15,000.
Layperson's description
Finding our way around is a crucial everyday skill: successful navigation depends on a range of mental abilities that deal with a constant flow of physical and social information. Many vulnerable groups struggle with navigation, which impairs their quality of life. Research has focused largely on how people use physical landmarks to navigate, and much less is known about how social events and stimuli affect how we find our way. Indeed, the fields of spatial and social cognition have so far been treated largely independently, yet important new findings suggest that common brain structures process both spatial and social distance. Does this mean that the spatial and social information we receive during navigation interact, and does this interaction affect our decisions? Exciting new findings indicate that physical distance influences our perception of facial expressions: faces we move towards are processed differently from faces we move away from. In addition, encountering other people appears to affect how we decide to move around. We therefore need to better understand how attention to the facial expressions of people we encounter during navigation affects our decisions. We predict that facial expressions generally increase social attention (which can be either distracting or helpful). Importantly, we predict that the valence of the facial expression (neutral, positive or negative) affects performance: a positive facial expression will facilitate navigation, whereas a negative facial expression will impair it.
For this purpose, we will pilot the use of an immersive virtual environment coupled with eye-tracking technology in a cross-faculty collaboration (Science and CCI). Immersive virtual reality offers a unique and ecologically valid environment in which to present controlled stimuli and record task performance. Eye-tracking technology allows us to record attention to objects or social partners present in the environment. In this project, we will test participants and record their attention to different landmarks, as well as their performance, while they navigate a complex maze. First, participants will be trained to find a target in the maze. During the testing phase, we will introduce objects (control trials) or realistic avatars displaying different facial expressions (neutral, positive or negative). Using eye tracking, we will record how long participants attend to the different stimuli. Their performance in the maze will be measured by the time taken to find the target and the number of errors made. This project will be important for understanding the impact of social cues on our spatial movements, with potential implications for how social cues shape navigation and, for instance, social coordination. We believe this research will launch a cross-disciplinary programme with potential applications for the ageing population, in whom navigation can be impaired. It might also help us better understand the navigation of populations with specific social needs (e.g. adults on the autistic spectrum, children). Lastly, it might benefit architecture and urban design by making wayfinding easier.
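The measures described above (dwell time on each stimulus, time to reach the target, and error count) can be sketched as a small analysis routine. This is a minimal illustrative example, not the project's actual pipeline: the names `GazeSample`, `dwell_times`, and `maze_performance`, and the sample data, are all hypothetical.

```python
# Hypothetical sketch: aggregate eye-tracking samples into dwell times
# per area of interest, and summarise maze performance.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class GazeSample:
    t: float       # timestamp in seconds
    target: str    # area of interest the gaze falls on, e.g. "avatar", "none"

def dwell_times(samples):
    """Sum time spent looking at each area of interest.

    Each sample is assumed to hold until the next sample's timestamp.
    """
    totals = defaultdict(float)
    for a, b in zip(samples, samples[1:]):
        totals[a.target] += b.t - a.t
    return dict(totals)

def maze_performance(start_t, target_found_t, wrong_turns):
    """Summarise a trial by time to target and number of errors."""
    return {"time_to_target": target_found_t - start_t, "errors": wrong_turns}

# Example: five gaze samples over two seconds.
samples = [GazeSample(0.0, "avatar"), GazeSample(0.5, "none"),
           GazeSample(1.0, "avatar"), GazeSample(1.5, "avatar"),
           GazeSample(2.0, "none")]
print(dwell_times(samples))  # {'avatar': 1.5, 'none': 0.5}
print(maze_performance(start_t=0.0, target_found_t=42.0, wrong_turns=3))
```

The dwell-time comparison across conditions (object vs. neutral/positive/negative avatar) would then test whether facial expressions draw attention, and the performance summary whether that attention helps or hinders navigation.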
| Status | Finished |
| --- | --- |
| Effective start/end date | 1/05/19 → 30/04/20 |