Exploring the Effectiveness of a Conversational Brain-AI Interface in Aphasic Patients
Abstract
Advanced technologies such as the Conversational Brain-Artificial Intelligence Interface (cBAI) present a novel approach to assistive communication for individuals with aphasia. Aphasia is characterized by impaired receptive and expressive communication abilities, which mask inherent competence and affect social functioning and quality of life [1]. This study aims to explore the effectiveness of cBAI in enhancing communication for aphasic patients by leveraging neurophysiological data, specifically Visual Evoked Potentials (VEPs), to interpret their intentions, bypassing the need for imagined speech.
Aphasia typically results from focal brain lesions in the left language-dominant hemisphere, posing challenges for traditional Brain-Computer Interfaces (BCIs) that rely on intact cognitive functions. The cBAI technology, developed by the Research Group Neuroinformatics in Vienna, addresses this gap by using artificial intelligence to interpret high-level intentions from VEPs, enabling communication without verbal speech. The interface is designed to be minimalistic: it displays the interlocutor's spoken sentence in written form alongside six context-generated keywords. The user selects a keyword, and a second AI model expands it into a full sentence. The user can then correct the sentence, request additional keywords, indicate an inability to respond, or conclude the conversation [2]. The interface can be operated via an EEG cap, eye tracker, or mouse click, depending on the user's abilities and preferences.
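The interaction loop described above can be sketched in code. This is a minimal illustration, not the authors' implementation: the function names (`suggest_keywords`, `expand_to_sentence`), the meta-action labels, and the selection callback are all hypothetical stand-ins for the two AI models and the three input modalities mentioned in the text.

```python
from dataclasses import dataclass, field
from typing import Callable, List

KEYWORD_COUNT = 6  # the interface offers six context-generated keywords
# Hypothetical labels for the non-keyword options described in the text
META_ACTIONS = ["regenerate_keywords", "cannot_answer", "end_conversation"]

@dataclass
class CBAISession:
    """Sketch of one cBAI conversation (hypothetical API).

    suggest_keywords: proposes context-dependent keywords from the
        interlocutor's sentence (first AI model).
    expand_to_sentence: expands a selected keyword into a full
        sentence, given the same context (second AI model).
    """
    suggest_keywords: Callable[[str], List[str]]
    expand_to_sentence: Callable[[str, str], str]
    transcript: List[str] = field(default_factory=list)

    def turn(self, heard_sentence: str,
             select: Callable[[List[str]], str]) -> str:
        """One turn: show options, let the user select, expand to a sentence.

        `select` abstracts the input modality (EEG cap, eye tracker,
        or mouse click)."""
        options = self.suggest_keywords(heard_sentence)[:KEYWORD_COUNT]
        options += META_ACTIONS
        choice = select(options)
        if choice in META_ACTIONS:
            return choice  # e.g. end the conversation or refresh keywords
        sentence = self.expand_to_sentence(heard_sentence, choice)
        self.transcript.append(sentence)
        return sentence

# Toy usage with stubbed-in models
session = CBAISession(
    suggest_keywords=lambda ctx: ["yes", "no", "water", "pain", "tired", "help"],
    expand_to_sentence=lambda ctx, kw: f"I would like some {kw}, please.",
)
reply = session.turn("Do you need anything?", select=lambda opts: opts[2])
```

The stubs here return fixed outputs; in the actual system both callables would be backed by language models conditioned on the conversation context, and `select` would decode the user's choice from VEPs, gaze, or clicks.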
This study, conducted with a colleague from computer science, investigates both the human-computer interaction (HCI) aspects and the neurophysiological effectiveness of cBAI deployment. I focus on the HCI aspects, assessing usability and functional effectiveness through a self-designed questionnaire tailored to the participants' abilities. This qualitative method will gather feedback from participants who have partially recovered and can communicate effectively, providing valuable insights into the technology's usability.
The research includes participant recruitment in Vienna, testing the cBAI with individuals with aphasia, and iterative refinement of the interface based on feedback and observed interaction challenges. The HCI analysis developed during the project will offer recommendations for future user tests with individuals with aphasia.
Expected outcomes include critical insights into the practical application of cBAI technology for aphasia patients, potentially enhancing their quality of life by providing an accessible and effective communication tool. Additionally, this research could inform future developments in neuroadaptive technologies and AI-assisted communication tools, influencing the design and implementation of assistive devices to promote greater inclusion and accessibility for aphasic individuals in social and professional contexts.
References
[1] I. Papathanasiou, P. Coppens, and C. Potagas, Aphasia and Related Neurogenic Communication Disorders, 3rd ed., Jones & Bartlett Learning, 2022.
[2] A. Meunier, M. R. Žák, L. Munz, S. Garkot, M. Eder, J. Xu, and M. Grosse-Wentrup, "A Conversational Brain-Artificial Intelligence Interface," arXiv preprint arXiv:2402.15011, 2024.