Comparison of EEG Recordings During Guided and Self-paced Button-Press Tasks
Abstract
Brain-computer interfaces (BCIs) are systems that enable direct communication between the brain and external devices by translating neural activity into commands. These systems hold immense potential in neurorehabilitation, assistive technology, and communication [1]. To enhance BCI design and performance, it is crucial to understand the neural mechanisms that differentiate voluntary movements from externally triggered ones. Electroencephalography (EEG) is a commonly used non-invasive method for measuring brain signals in BCIs due to its high temporal resolution and relative ease of use.
The purpose of this study is to compare and predict EEG recordings obtained during two distinct button-press tasks: a guided task and a self-paced task. The study aims to contribute to the understanding of differences between stimulus-driven and spontaneous actions in the field of BCIs, and thereby to assess the validity of using externally triggered tasks in BCI system development. Participants complete two tasks while wearing an EEG cap. In the self-paced task, participants press a button at moments of their own choosing over a seven-minute period. In the guided task, participants observe a screen and press the button each time a visual stimulus (a screen flash) appears, at irregular intervals ranging from 5 to 7 seconds. Both tasks are repeated for each hand.
Moreover, using machine learning models, the study aims to predict button-press events from the recorded EEG data. The collected EEG signals will undergo preprocessing, including artifact removal, filtering, and segmentation, to isolate event-related potentials (ERPs) and extract relevant features for analysis. Using these data, we intend to train machine learning classifiers (e.g., support vector machines, random forests, or neural networks) to distinguish between guided and self-paced button presses and to predict button-press events.
The study will compare EEG patterns across the guided and self-paced tasks, identifying neural correlates that distinguish spontaneous from stimulus-driven motor actions. We expect guided actions to show more pronounced, time-locked ERPs, while self-paced actions may exhibit anticipatory motor potentials such as the readiness potential. This comparison will improve understanding of the neural mechanisms underlying voluntary and externally triggered movements and inform the development of BCIs. The predictive modeling will assess the feasibility of forecasting motor intentions from EEG, further supporting the use of predictive modeling in neurorehabilitation and assistive technology.
References
[1] U. Chaudhary, N. Birbaumer, and A. Ramos-Murguialday, “Brain–computer interfaces for communication and rehabilitation,” Nature Reviews Neurology, vol. 12, no. 9, pp. 513–525, Aug. 2016. doi:10.1038/nrneurol.2016.113