Investigating Feature Set Decisions for Mental State Decoding in Virtual Reality Based Learning Environments

Abstract

Brain-Computer Interfaces (BCIs) combined with Virtual Reality (VR) enable user-aware systems for individualized learning that monitor the learner’s current mental states and adapt content to their individual skills and needs. We investigate feature set decisions for features extracted from functional near-infrared spectroscopy (fNIRS) signals and their use in conventional machine learning (ML)-based decoding of working memory. Eleven volunteers participated in a VR study using a visuo-spatial n-back paradigm with simultaneous fNIRS measurements. Single-subject and overall decoding performance were compared across different feature sets, including an exploration of individual feature contributions and their localisation within the prefrontal cortex. Our results show that feature sets combining oxygenated (HbO) and deoxygenated hemoglobin (HbR) features via sequential forward feature selection achieve the highest performance. More specifically, HbR peak-to-peak features from premotor regions and the right and mid-dorsolateral prefrontal cortex contributed most to decoding performance. Our results emphasise the need for analysing ML features in mental state decoding and aim to provide empirically supported decision recommendations as a step towards future online decoding pipelines in real-world VR-based learning applications.
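As a hedged illustration only (not the study's actual pipeline or data), sequential forward feature selection over trial-wise hemodynamic features can be sketched with scikit-learn's `SequentialFeatureSelector`; the synthetic features and which indices carry signal are assumptions made for the example:

```python
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for trial-wise fNIRS features (e.g. HbO/HbR means,
# slopes, peak-to-peak amplitudes per channel); real inputs would come
# from the preprocessed recordings.
n_trials, n_features = 120, 12
X = rng.normal(size=(n_trials, n_features))
# By construction, only features 3 and 7 are informative for the label.
y = (X[:, 3] + X[:, 7] + 0.5 * rng.normal(size=n_trials) > 0).astype(int)

# Forward selection: greedily add, one at a time, the feature that most
# improves cross-validated accuracy of the classifier.
clf = SVC(kernel="linear")
sfs = SequentialFeatureSelector(
    clf, n_features_to_select=4, direction="forward", cv=5
)
sfs.fit(X, y)
selected = np.flatnonzero(sfs.get_support())
print("selected feature indices:", selected)
```

In a decoding setting like the one described above, the selected indices would then map back to channel locations and signal types (HbO vs. HbR), which is how per-feature contribution and localisation can be inspected.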

Publication
Proceedings of the AHFE 2023: Advances in Neuroergonomics and Cognitive Engineering