Background and Objective: The electrocardiogram (ECG) is one of the most important diagnostic tools in clinical practice. Although deep learning models have been widely applied to ECG classification, their accuracy remains limited, especially for the complex signal patterns encountered in real-world clinical settings. This study explores the potential of Transformer models to improve ECG classification accuracy.
Methods: We present the Masked Transformer for ECG classification (MTECG), a simple yet effective method that adapts image-based masked autoencoders to self-supervised representation learning from ECG time series. The model is evaluated on the Fuwai dataset, comprising 220,251 ECG recordings annotated by medical experts, and compared with six recent state-of-the-art methods. Ablation studies are conducted to identify the key components contributing to the model's performance. Additionally, the method is evaluated on two public datasets to assess its broader applicability.
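The core idea of masked pre-training on a time series can be summarized in three steps: split the signal into non-overlapping patches, hide a large fraction of them from the encoder, and train the model to reconstruct the hidden patches, with the loss computed only on what was masked. The sketch below illustrates this mechanic in pure Python; the patch length (50 samples), masking ratio (0.75), and the zero-filled stand-in for the decoder output are illustrative assumptions, not the paper's actual hyperparameters or code.

```python
import random

def patchify(sig, patch_len):
    """Split a 1-D signal into non-overlapping patches (the model's tokens)."""
    n = len(sig) // patch_len
    return [sig[i * patch_len:(i + 1) * patch_len] for i in range(n)]

def random_mask(num_patches, mask_ratio, rng):
    """Randomly hide a fraction of patches from the encoder.

    Returns a boolean list: True means the patch is masked (to be reconstructed).
    """
    num_masked = int(num_patches * mask_ratio)
    order = list(range(num_patches))
    rng.shuffle(order)
    hidden = set(order[:num_masked])
    return [i in hidden for i in range(num_patches)]

def masked_mse(pred, target, mask):
    """Reconstruction loss computed only on the masked patches."""
    sq = [(p - t) ** 2
          for m, pp, tt in zip(mask, pred, target) if m
          for p, t in zip(pp, tt)]
    return sum(sq) / len(sq)

rng = random.Random(0)
ecg = [rng.gauss(0.0, 1.0) for _ in range(5000)]   # e.g. 10 s at 500 Hz
patches = patchify(ecg, patch_len=50)              # 100 patches of 50 samples
mask = random_mask(len(patches), mask_ratio=0.75, rng=rng)
# A real decoder would predict the hidden patches; zeros stand in here
# so the loss computation can be demonstrated end to end.
recon = [[0.0] * 50 for _ in patches]
loss = masked_mse(recon, patches, mask)
```

After pre-training, the decoder is discarded and the encoder is fine-tuned (or linearly probed) on the labeled classification task, which is how the evaluations below are run.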
Results: Experiments show that the proposed method increases macro F1 scores by 2.8%-28.6% on the Fuwai dataset, 10.4%-46.2% on the multicenter dataset, and 19.1%-46.9% on the PTB-XL dataset for common ECG diagnosis recognition tasks, compared with the six alternative methods. The proposed method also consistently achieves state-of-the-art performance on the PTB-XL superclass task under both linear probing and fine-tuning. The masked pre-training strategy substantially enhances classification performance; key contributing factors include the masking ratio, training schedule length, fluctuating reconstruction targets, layer-wise learning rate decay, and DropPath rate.
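Macro F1, the metric reported above, is the unweighted mean of per-class F1 scores, so rare diagnoses count as much as common ones. A minimal pure-Python version clarifies the computation; the toy labels are hypothetical and not taken from any of the datasets, and this is not the authors' evaluation code.

```python
def macro_f1(y_true, y_pred, labels):
    """Unweighted mean of per-class F1 scores."""
    f1s = []
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(f1s) / len(f1s)

# Toy 3-class example (hypothetical labels)
score = macro_f1([0, 0, 1, 1, 2, 2], [0, 1, 1, 1, 2, 0], labels=[0, 1, 2])
```

Because every class contributes equally to the average, a model that ignores minority diagnoses is penalized, which is why macro F1 is a common choice for imbalanced clinical label sets.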
Conclusion: The Masked Transformer model exhibits superior performance in ECG classification, highlighting its potential to advance ECG-based diagnostic systems.
DOI: http://dx.doi.org/10.1016/j.compbiomed.2025.110674