Author ORCID Identifier

0000-0002-9789-261X

Date of Graduation

12-2023

Document Type

Dissertation (PhD)

Program Affiliation

Neuroscience

Degree Name

Doctor of Philosophy (PhD)

Advisor/Committee Chair

Valentin Dragoi

Committee Member

Fabricio Do Monte

Committee Member

Xaq Pitkow

Committee Member

Harel Shouval

Committee Member

Danielle Garsin

Abstract

The motivation and capacity to be social are necessary for human survival. Successful learning of complex, prosocial behavior stems from the ability to perceive and respond to visual cues, such as body language and facial expressions, from others in our environment. This dependence on visual information to guide social interaction is especially pronounced in humans and non-human primates. Although recent studies in primate neurophysiology have discovered neurons that encode socially relevant variables, such as reward and social actions, the neural mechanisms underlying the learning of advanced social concepts, such as cooperation, are not well understood. Further, previous work has identified brain structures that are activated when restrained subjects passively view other agents in person or watch videos of socially interacting animals, but how the brain processes social signals from interacting conspecifics in real time to initiate goal-directed behavior has remained unexplored until now.

Progress has been limited by the lack of a suitable framework for studying how social cognition emerges in real time and by the lack of a population-level approach for recording from multiple brain regions simultaneously while animals perform naturalistic tasks. To this end, we developed a novel paradigm that combines behavioral monitoring with wireless eye tracking and neural recordings to study how pairs of freely moving, interacting macaques use visually guided signals to learn social cooperation for food reward. By recording simultaneously from visual (area V4) and prefrontal (dorsolateral prefrontal cortex; dlPFC) regions, I examined how visual representations relevant for social interactions are communicated from sensory areas to executive areas that encode reward and decision making.

During learning, animals improve the coordination of their actions, increase their likelihood of cooperating, and cooperate more quickly. Notably, animals become more likely to cooperate after viewing a social cue, such as the reward or the partner monkey. As social learning occurs, V4 and dlPFC refine their representations of viewing the reward or partner monkey by distributing socially relevant information among neurons within each area. Additionally, dlPFC improves its encoding of each animal's decision to cooperate, especially when social cues are viewed, highlighting the importance of visual monitoring for determining one's own actions and predicting, or even influencing, the actions of others to produce purposeful social behavior. Finally, learning of social events increases coordinated spiking between visual and prefrontal cortical neurons, with coordinated V4-dlPFC cells contributing more to the encoding of social variables within each area. These results are the first to demonstrate how the visual-frontal cortical network prioritizes relevant sensory information to facilitate the learning of social interactions while freely moving macaques interact in a naturalistic environment.

Keywords

social cognition, learning, prefrontal cortex, visual cortex, V4, dlPFC, eye tracking, non-human primate, cooperation, wireless neural recording
