Brain-computer interfaces (BCIs) are used to understand brain function and to develop therapies for neurological and neurodegenerative disorders; they are therefore crucial for rehabilitating motor dysfunction and advancing motor imagery applications. In motor imagery, electroencephalogram (EEG) signals are classified to infer a subject's intention to move a body part without actual movement. This paper presents a two-stage transformer-based architecture that combines handcrafted features with deep learning to improve classification performance on benchmark EEG datasets. Stage 1 is built on a parallel-convolution EEGNet, multi-head attention, and separable temporal convolution networks for spatiotemporal feature extraction. For enhanced classification, Stage 2 trains TabNet on additional handcrafted features together with the embeddings extracted in Stage 1. In addition, a novel channel-cluster-swapping data augmentation technique is developed to address the limited number of samples available for training deep learning architectures. The two-stage architecture achieved average classification accuracies of 88.5% and 88.3% on the BCI Competition IV-2a and IV-2b datasets, respectively, approximately 3.0% higher than recently reported comparable works.
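The abstract does not detail how the channel-cluster-swapping augmentation works, but a plausible reading is that spatially grouped EEG channels from one trial are exchanged with the same channel group from another trial of the same class, producing new training samples without altering class semantics. The sketch below is a hypothetical illustration under that assumption; the function name `channel_cluster_swap`, the cluster definitions, and the same-class donor selection are all assumptions, not the paper's implementation.

```python
import numpy as np

def channel_cluster_swap(trials, labels, clusters, rng=None):
    """Hypothetical channel-cluster-swapping augmentation sketch.

    trials:   (n_trials, n_channels, n_samples) EEG array
    labels:   (n_trials,) class labels
    clusters: list of channel-index arrays (e.g. spatial electrode groups)

    For each trial, one randomly chosen cluster of channels is replaced
    with the same cluster taken from another trial of the same class,
    yielding an augmented copy of the dataset.
    """
    rng = rng or np.random.default_rng(0)
    augmented = trials.copy()
    n = len(labels)
    for i, y in enumerate(labels):
        # candidate donor trials: same class, but not the trial itself
        same = np.flatnonzero((labels == y) & (np.arange(n) != i))
        if same.size == 0:
            continue
        donor = rng.choice(same)
        cluster = clusters[rng.integers(len(clusters))]
        # swap in the donor's channels for the chosen cluster
        augmented[i, cluster, :] = trials[donor, cluster, :]
    return augmented
```

Concatenating the augmented trials with the originals would roughly double the training set while keeping every channel's signal physiologically valid, since each channel still carries a real recording from the same class.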
DOI: http://dx.doi.org/10.1016/j.medengphy.2024.104154