I have also developed other experiments to understand social interactions that are more complex in both design and the questions asked [26-30]; these add elements of motor control and the mirror neuron system in the parietal lobe.
In a common social interaction, one person smiles and, without deliberate cognitive effort, a second person tends to smile back. How does this emotional contagion occur, and which neural circuits regulate this sensory-motor interaction between people? I aim to answer these questions by developing a controlled but naturalistic interaction in which one participant watches emotional videos and a second participant watches the first observe the video.
The interaction will be recorded using a series of programmable lights, HD cameras, eye-trackers, 3D scanners, and physiological monitors. Neural signals will be recorded from both partners using fNIRS and EEG. Partners rate the emotional valence of each interaction, and these ratings are regressed against the neural signals recorded during the interaction.
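The regression of valence ratings into neural signals could be sketched as follows. This is a minimal illustration, not the study's actual pipeline: the data are simulated, the single-channel trial-averaged response and the effect size are assumed for demonstration, and a real analysis would use a full GLM over many channels with hemodynamic modeling.

```python
import numpy as np

# Hypothetical data: per-trial mean fNIRS response for one channel, and the
# partner's rated emotional valence of each trial (all values simulated).
rng = np.random.default_rng(0)
n_trials = 40
valence = rng.uniform(-1, 1, n_trials)                  # ratings: -1 (negative) to +1 (positive)
neural = 0.8 * valence + rng.normal(0, 0.5, n_trials)   # simulated channel response

# Regress ratings into the neural signal: neural ~= beta * valence + intercept
X = np.column_stack([valence, np.ones(n_trials)])
beta, intercept = np.linalg.lstsq(X, neural, rcond=None)[0]
```

A nonzero `beta` for a channel would indicate that its activity covaries with the rated emotional valence of the interaction.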
Pilot behavioral results, obtained using the OpenFace facial coding system, show that when a participant observes a video that makes them smile (such as baby pandas rolling on the ground), the smile information is transmitted to their partner, who also smiles subconsciously. If the video content produces different emotions and facial expressions (e.g., disgust when viewing dirty toilets), this is also transmitted, and the facial coding system shows high correlation between the partners. Interestingly, neural coherence between partners in social brain regions also increases during these interactions compared to baseline or to scrambled trial types, suggesting that specific information is coded in the facial expression and the brain has “receptors” to interpret this socially cued information.
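The cross-partner facial correlation described above could be quantified roughly as follows. This is an illustrative sketch with simulated signals, assuming OpenFace's smile-related action unit (AU12) intensity is exported as a time series for each partner; the waveform, sampling rate, and mimicry delay are invented for the example.

```python
import numpy as np

# Simulated AU12 (smile intensity) time series for a 30 s trial at 10 Hz.
rng = np.random.default_rng(1)
t = np.linspace(0, 30, 300)
sender = np.clip(np.sin(0.4 * t), 0, None)        # video viewer's smiles
# Partner mimics with a ~0.5 s delay (circular shift, for simplicity) plus noise.
receiver = 0.7 * np.roll(sender, 5) + rng.normal(0, 0.1, t.size)

# Pearson correlation between the two facial time series.
r = np.corrcoef(sender, receiver)[0, 1]
```

A high `r` across trials would correspond to the facial mimicry observed in the pilot data; a lagged cross-correlation would additionally estimate the transmission delay from sender to receiver.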