
Investigating human interaction: Study combines AI with simultaneous dual-brain neuroimaging for first time

2025-02-26 21:19:05

by University of Trento

Investigating human interaction: When we are in sync
Credit: University of Trento

Bringing research from the lab to the home, from a controlled environment to real life, can be a way to understand human interaction. As technology evolves, its potential grows, driving both scientific exploration and real-world applications. In this sense, the authors of a new study have taken a step forward in understanding what happens at the brain level when two people come into contact and interact with each other, such as during a conversation, when giving each other a gift, or in other situations of cooperation.

The methodology and results are described in a paper titled "Emotional content and semantic structure of dialogues are associated with Interpersonal Neural Synchrony in the Prefrontal Cortex," appearing in the journal NeuroImage.

The paper was authored by Alessandro Carollo and Gianluca Esposito (corresponding authors), Massimo Stella, and Andrea Bizzego of the University of Trento (Department of Psychology and Cognitive Science) as part of an international collaboration with Mengyu Lim of the Nanyang Technological University of Singapore.

Their work has shed new light on the association between the way in which people communicate, in terms of emotions and language, and their brain activity.

"For the first time, we have combined AI techniques with neuroimaging measurements obtained on two people at the same time. We worked in a laboratory setting, but we tried to create less controlled situations than usual, so that each participating couple was free to invent a dialogue as well as to imagine giving each other a gift and being surprised to receive it," says Carollo, first author of the study.

The research, which was conducted in the laboratories of the Department of Psychology and Cognitive Science of the University of Trento in Rovereto, involved 42 pairs of participants (84 individuals) between 18 and 35 years old.

"We combined artificial intelligence techniques with the most advanced brain imaging technology to study how emotions and the structure of language influence brain activity in interactions. This study reveals that when two people interact, their brain activity is synchronized, especially in the prefrontal cortex. Emotional content and the structure of language are connected to this neural synchrony," explains Esposito.

Investigating human interaction: When we are in sync
Schematic summary of the experimental procedure. (A) Setup of devices and sitting arrangement of dyads during the experimental sessions. (B) Image from Azhari et al. (2019). Schematic diagram depicting the position of 20 functional near-infrared spectroscopy (fNIRS) channels and their corresponding positions to measure the activity of the superior frontal gyrus (SFG), middle frontal gyrus (MFG), inferior frontal gyrus (IFG), and anterior prefrontal cortex (aPFC). (C) Emotional content of sentences computed using EmoAtlas (Semeraro et al., 2025). (D) Syntactic parsing of sentences. (E) Representation of the syntactic/semantic structure of sentences using textual forma mentis networks (Stella et al., 2019). Credit: NeuroImage (2025). DOI: 10.1016/j.neuroimage.2025.121087

The dialogues were transcribed by hand, then artificial intelligence techniques were used to encode the transcriptions and obtain emotional and syntactic/semantic indexes of the conversations.
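
As a hedged sketch of how transcripts can be turned into emotional and syntactic/semantic indexes, the snippet below uses spaCy for the dependency parse, a toy emotion lexicon standing in for EmoAtlas, and networkx to hold a small forma-mentis-style word network. The model name, lexicon, and example sentence are illustrative assumptions, not the study's actual pipeline.

```python
import spacy
import networkx as nx

nlp = spacy.load("en_core_web_sm")  # assumes this English model is installed

# Toy lexicon for illustration only; the study used EmoAtlas instead.
EMOTION_LEXICON = {"gift": "joy", "surprise": "surprise",
                   "surprised": "surprise", "thank": "joy"}

def encode_sentence(sentence):
    """Return a dependency-based word network and a list of emotion labels."""
    doc = nlp(sentence)
    graph = nx.Graph()
    emotions = []
    for token in doc:
        # Link each word to its syntactic head (skip the root's self-link).
        if token.is_alpha and token.head is not token:
            graph.add_edge(token.lemma_.lower(), token.head.lemma_.lower())
        label = EMOTION_LEXICON.get(token.lemma_.lower())
        if label:
            emotions.append(label)
    return graph, emotions

network, emotions = encode_sentence("I was so surprised to receive this gift, thank you!")
print(sorted(network.edges()), emotions)
```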

For neuroimaging measurements, functional near-infrared spectroscopy (fNIRS) was used.

This technique is similar to electroencephalography, but it is less invasive than other methods and is capable of recording the dynamics of hemoglobin, the molecule that carries oxygen in the blood, in different brain areas. With a light source, which emits beams of photons, and a photodetector placed on a cap, the amount of light absorbed by hemoglobin is measured, and brain activity is thus evaluated.
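
For readers curious about the underlying arithmetic, the snippet below applies the standard modified Beer-Lambert law, which relates measured changes in optical density to changes in hemoglobin concentration. The coefficients and distances are illustrative values, not parameters from the study, and a real analysis combines at least two wavelengths to separate oxygenated from deoxygenated hemoglobin.

```python
# Illustrative modified Beer-Lambert calculation (assumed textbook form,
# single wavelength); all numbers below are made up for the example.
import numpy as np

def hemoglobin_change(delta_od, extinction, distance_cm, dpf):
    """Convert optical-density changes into hemoglobin concentration changes."""
    return delta_od / (extinction * distance_cm * dpf)

delta_od = np.array([0.010, 0.012, 0.009])  # measured attenuation changes (unitless)
extinction = 2.33                           # illustrative extinction coefficient
print(hemoglobin_change(delta_od, extinction, distance_cm=3.0, dpf=6.0))
```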

Carollo explains, "It is an easy-to-carry and lightweight technique; it only takes a small box with a pair of caps and their cables. Then you plug it into a laptop computer and that is all you need to study human interactions. The goal is to bring research from the lab to the home, from the controlled environment to real life, where people are free to talk to each other and interact."

The contribution of the research team is promising.

Esposito states, "The best approach seems to be the transdisciplinary one, which integrates emotional and semantic/syntactic information. The results obtained on neuronal synchronization have a number of interesting implications.

"The study shows that emotions and language structure influence our conversations and the neural processes that then guide how we interact with each other. This opens up new avenues for research into human interactions. We think of interactions between parent and child, between partners, friends, or simply two strangers who find themselves interacting by chance."

More information: Alessandro Carollo et al, Emotional content and semantic structure of dialogues are associated with Interpersonal Neural Synchrony in the Prefrontal Cortex, NeuroImage (2025). DOI: 10.1016/j.neuroimage.2025.121087

Citation: Investigating human interaction: Study combines AI with simultaneous dual-brain neuroimaging for first time (2025, February 26) retrieved 27 February 2025 from https://medicalxpress.com/news/2025-02-human-interaction-combines-ai-simultaneous.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.

Summary

A new study combines AI techniques with simultaneous dual-brain neuroimaging to investigate how emotions and language structure influence neural synchrony in the prefrontal cortex during human interaction. Conducted by researchers at the University of Trento and Nanyang Technological University, the research involved 42 pairs of participants and used functional near-infrared spectroscopy (fNIRS) to measure brain activity. The study reveals that emotional content and language structure are associated with neural synchrony in the prefrontal cortex during conversations and cooperative activities. This work opens new avenues for understanding human interactions across various social contexts.