An Adaptive Bayesian Calibration Approach with Reduced Training Effort in P300 ERP-Based Brain-Computer Interfaces

Li, Chenyang (Spring 2025)

Permanent URL: https://etd.library.emory.edu/concern/etds/3r074w44q?locale=en
Published

Abstract

Brain-computer interfaces (BCIs) enable direct communication between the brain and external devices, with potential applications in assistive technology and neuro-rehabilitation. This study proposes a Bayesian sequential updating framework to improve event-related potential (ERP) estimation and classification accuracy in P300 ERP-based BCIs. Using a Gaussian mixture model (GMM) with Gaussian process (GP) priors, we employ Markov chain Monte Carlo (MCMC) sampling to update model parameters iteratively as new data become available. Simulations and real-data analysis demonstrate that the model effectively captures ERP structure, with credible bands narrowing over successive sequences, indicating increasing estimation confidence. In the real-data analysis, our model reaches 85%-90% accuracy after four sequences, a clear advantage over traditional stepwise linear discriminant analysis (SWLDA), which stabilizes at around 80%. Informative priors derived from early labeled data significantly improve early-stage ERP estimation and classification, especially when data are limited. The sequential updating approach therefore allows the model to start from minimal labeled data and progressively refine classification accuracy.
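
To illustrate the sequential updating idea summarized above, the sketch below shows a minimal conjugate Gaussian update in Python, in which the posterior after each sequence becomes the prior for the next and the posterior spread shrinks as sequences accumulate. This is an illustrative assumption only, not the thesis's GMM-GP model or its MCMC sampler; the function name sequential_update and all numerical settings are hypothetical.

    import numpy as np

    def sequential_update(prior_mean, prior_var, batch, noise_var):
        # Conjugate Gaussian update: the posterior after one sequence of
        # observations becomes the prior for the next sequence.
        n = len(batch)
        post_var = 1.0 / (1.0 / prior_var + n / noise_var)
        post_mean = post_var * (prior_mean / prior_var + batch.sum() / noise_var)
        return post_mean, post_var

    # Toy example: one ERP feature (e.g., target amplitude at a fixed time point),
    # updated sequence by sequence starting from a diffuse prior.
    rng = np.random.default_rng(0)
    true_amp, noise_var = 2.0, 4.0          # hypothetical ground truth and noise level
    mean, var = 0.0, 10.0                   # diffuse initial prior
    for seq in range(1, 5):
        batch = rng.normal(true_amp, np.sqrt(noise_var), size=12)
        mean, var = sequential_update(mean, var, batch, noise_var)
        print(f"sequence {seq}: posterior mean = {mean:.2f}, sd = {np.sqrt(var):.2f}")

In the thesis itself the update is applied to full ERP curves under GP priors and carried out with MCMC rather than in closed form, but the same posterior-to-prior recursion underlies the narrowing credible bands reported in the abstract.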

Table of Contents

1 Introduction
1.1 Background
1.2 Streaming Data and Current Work
1.3 Our Contributions
2 Methods
2.1 Notations
2.2 Gaussian Mixture Model (GMM)
2.3 Model Assumption
2.4 Parameter Estimation by Sequential Data
2.4.1 Sequential Bayesian Inference Framework
2.4.2 Adaptive Prior Updating Strategy
2.5 Prediction Accuracy
2.5.1 Prediction Score
2.5.2 Posterior Probability Update
3 Posterior Inferences
3.1 Gaussian Process and Prior Specifications
3.2 MCMC and Likelihood Computation
3.2.1 Likelihood Computation
3.2.2 Handling Label Switching and Data Imbalance in Gaussian Mixture Models
4 Simulations
4.1 Simulation Setup
4.2 Model Settings and Diagnostics
4.3 Inference Results
4.4 Prediction Results
5 Real Data
5.1 Experimental Design
5.2 Data Pre-processing
5.3 Model Setting
5.4 Results
6 Discussion
Appendix A Results from Other Participants
References

About this Master's Thesis

Rights statement
  • Permission granted by the author to include this thesis or dissertation in this repository. All rights reserved by the author. Please contact the author for information regarding the reproduction and use of this thesis or dissertation.
Language
  • English