New joint study agreement to enhance the classification of brain activity patterns.

Jun 3, 2025

Paris, June 3, 2025: IBM (NYSE: IBM) and Inclusive Brains, a leader in non-invasive neurotechnologies and multimodal artificial intelligence, have entered a joint study agreement to experiment with advanced AI and quantum machine learning techniques. The aim of the joint study is to boost the performance of multimodal brain-machine interfaces (BMIs).

Innovation with Positive Social Impact

BMIs have the potential to enable individuals with disabilities — particularly those who have lost the ability to use their hands or voice — to leverage connected devices and digital environments to regain control of their surroundings, without vocal commands or physical operation of a keyboard, screen or mouse. Inclusive Brains aims to improve access to education and employment opportunities using the insights generated in the joint study. Beyond better inclusion of people with paralysis, Inclusive Brains aims to deliver broader societal benefits, including improved prevention of both physical and mental health issues among the wider population, thanks to enhanced classification and therefore a better understanding of brain activity patterns.

The multimodal AI systems developed by Inclusive Brains will be enhanced using the insights generated in the joint study with IBM, enabling real-time, personalized adaptation of brain-machine interfaces to the unique abilities and needs of each user and therefore offering them more autonomy and agency in their personal and professional lives.

Applying Multimodal Adaptive AI for Accessibility and Inclusion

Inclusive Brains is leveraging IBM’s AI and quantum technologies and expertise across several key stages of work to assess the accuracy of brain activity classification compared with existing models, including:

  • The use of IBM Granite foundation models to generate and review code, and from that code create benchmarks testing hundreds of thousands of machine learning algorithm combinations, in order to help identify the most efficient algorithms for classifying and interpreting a person’s brain activity;
  • Automated selection of the most effective algorithms for each individual and their use in so-called “mental commands” (i.e. commands that do not require speech or other physical interactions) to operate workstations[1] (a minimal, illustrative sketch of this benchmark-and-select loop follows this list);
  •  "Mental commands," "mind-controlled," and "mind-written" are simplified terms that do not imply direct reading of commands or words from brainwaves. In the context of this joint study, these phrases describe a multimodal AI system trained with brainwaves, facial movements and expressions, eye movements, and other physiological signals to control connected devices or digital environments without requiring touch or speech. Essentially, the multimodal AI system interprets a combination of signals to infer the user's intent and translate it into action.
  • A series of open science research publications, to benefit the scientific community and the broader public;
  • Studying quantum machine learning techniques for brain activity classification.
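To make the benchmark-and-select steps above more concrete, the sketch below shows how such a loop could look in principle. It is illustrative only: the scikit-learn models, hypothetical feature dimensions, and random data stand in for whatever tooling and signals the joint study actually uses, and none of it represents IBM’s or Inclusive Brains’ implementation.

```python
# Illustrative sketch only: a minimal benchmark-and-select loop over classifier
# combinations for per-user brain-activity classification. Feature shapes, the
# model grid, and the data are hypothetical; scikit-learn stands in for the
# actual tooling used in the joint study.
from itertools import product

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def benchmark_user(X, y):
    """Score a grid of scaler/classifier combinations on one user's signals."""
    scalers = {"none": None, "standard": StandardScaler()}
    models = {
        "lda": LinearDiscriminantAnalysis(),
        "logreg": LogisticRegression(max_iter=1000),
        "svm": SVC(kernel="rbf"),
    }
    results = {}
    for (s_name, scaler), (m_name, model) in product(scalers.items(), models.items()):
        steps = [scaler, model] if scaler is not None else [model]
        pipe = make_pipeline(*steps)
        # 5-fold cross-validated accuracy serves as the benchmark metric here.
        results[(s_name, m_name)] = cross_val_score(pipe, X, y, cv=5).mean()
    return results


# Hypothetical per-user data: 200 trials x 64 features (e.g. band-power values)
# with binary "mental command" labels.
rng = np.random.default_rng(0)
X_user = rng.normal(size=(200, 64))
y_user = rng.integers(0, 2, size=200)

scores = benchmark_user(X_user, y_user)
best = max(scores, key=scores.get)  # automated per-user selection step
print(f"best combination for this user: {best}, accuracy={scores[best]:.2f}")
```

In practice, such per-user selection would draw on real neurophysiological features and a far larger space of algorithm combinations, as described in the list above.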

The joint study will be conducted so that the research is aligned with Inclusive Brains’ and IBM’s principles of responsible technology, including ethical considerations and recommendations for responsible neurotechnology and neural data use, such as those provided in prior publications[2],[3].

“We are particularly proud to engage with innovative startups such as Inclusive Brains and to contribute to a technology that supports advancing healthcare for the benefit of the general population, by providing access to IBM’s AI and quantum technologies in a responsible manner,” said Béatrice Kosowski, President of IBM France.

“Our joint study with IBM will help Inclusive Brains develop technology for deeply personalized interactions between machines and their users. We're transitioning from the era of generic interfaces to that of bespoke solutions, crafted to adapt to each individual's unique physicality, cognitive diversity, and needs,” adds Professor Olivier Oullier, CEO & Co-Founder of Inclusive Brains, Chief AI Scientist and Chairman of the AI Institute at Biotech Dental Group, and Visiting Professor in the Department of Human-Computer Interaction at the Mohamed bin Zayed University of Artificial Intelligence (MBZUAI).

About Inclusive Brains

Inclusive Brains’ mission is to enable machines to adapt in real time to what makes each of their users unique at the physical and cognitive levels. The goal is to provide personalized assistance that not only facilitates decision-making and optimizes performance but also preserves mental and physical health. To this end, Inclusive Brains is developing multimodal cognitive agents able to detect variations in stress, attention, cognitive load and fatigue levels in real time. Inclusive Brains’ interfaces also offer people with disabilities the ability to control connected objects and digital environments with “mental commands,” using neurophysiological signals without any physical or vocal interactions.

Over the past year, Inclusive Brains achieved three non-invasive world firsts in public settings with Prometheus BCI, its multimodal interface — a mind-controlled arm exoskeleton showcased at the Olympic Torch Relay; a mind-written tweet exchange with President Macron; and a mind-written parliamentary amendment. This interface was developed with Allianz Trade, the global leader in trade insurance, and further supported by GENCI[4]’s compute capabilities, run on its Jean Zay Supercomputer at IDRIS (CNRS)[5]. Inclusive Brains was named startup of the year in 2024, and its founders, Prof. Olivier Oullier and Paul Barbaste, were recognized among the AI Inventors of the Year by a committee of Nobel laureates and leaders in science and industry. They were also honored twice by the École Polytechnique Foundation (X-Tech Impact and X-Grant High Impact Awards) and at the Handitech Trophy (cutting-edge research and digital inclusion).

Visit www.inclusive-brains.com

About IBM 

IBM is a leading provider of global hybrid cloud and AI, and consulting expertise. We help clients in more than 175 countries capitalize on insights from their data, streamline business processes, reduce costs and gain the competitive edge in their industries. Thousands of governments and corporate entities in critical infrastructure areas such as financial services, telecommunications and healthcare rely on IBM's hybrid cloud platform and Red Hat OpenShift to effect their digital transformations quickly, efficiently and securely. IBM's breakthrough innovations in AI, quantum computing, industry-specific cloud solutions and consulting deliver open and flexible options to our clients. All of this is backed by IBM's long-standing commitment to trust, transparency, responsibility, inclusivity and service.

For more information, visit ibm.com.

Media Contacts:

IBM
Gaëlle Dussutour
Tel.: + 33 (0)6 74 98 26 92
dusga@fr.ibm.com

Inclusive Brains
Tel.: + 33 (0)6 03 02 50 24
media@inclusive-brains.com

Weber Shandwick for IBM
Louise Weber
Tel.: + 33 (0)6 89 59 12 54
ibmfrance@webershandwick.com


[1] "Mental commands," "mind-controlled," and "mind-written" are simplified terms that do not imply direct reading of commands or words from brainwaves. In the context of this joint study, these phrases describe a multimodal AI system trained with brainwaves, facial movements and expressions, eye movements, and other physiological signals to control connected devices or digital environments without requiring touch or speech. Essentially, the multimodal AI system interprets a combination of signals to infer the user's intent and translate it into action.

[2] AI and Neurotechnology (CACM)

[4] GENCI: Grand équipement national de calcul intensif

[5] IDRIS: Institut du développement et des ressources en informatique scientifique; CNRS: Centre national de la recherche scientifique
