This study developed an effective approach for discriminating the geographical origins of Duhuo samples using non-targeted UPLC chromatograms and UV-Vis spectrogram images combined with a two-dimensional convolutional neural network (2D-CNN). For comparison, four machine learning methods, extreme gradient boosting (XGBoost), random forest (RF), partial least squares discriminant analysis (PLS-DA), and support vector machine (SVM), were applied to the UPLC and UV-Vis data matrices and the concentrations of seven target compounds. Enhanced by data augmentation, the 2D-CNN demonstrated superior performance, with 98.28% accuracy for UV-Vis images and 100% for UPLC images, while the traditional machine learning models showed considerable variation across datasets. These results demonstrate that integrating a 2D-CNN with UPLC and UV-Vis images enables robust, accurate, and non-destructive analysis for the efficient discrimination of TCM samples. In particular, UV-Vis spectroscopy provides a convenient method for rapid detection. Overall, the proposed approach offers a powerful tool for the precise and reliable analysis of herbal medicines.
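The paper does not publish its implementation, so the following is a minimal sketch in Python (PyTorch/torchvision) of how a small 2D-CNN image classifier with light data augmentation could be set up for chromatogram or spectrogram images. The class count, input resolution, layer sizes, and augmentation transforms are all assumptions for illustration, not the authors' architecture.

```python
# Illustrative 2D-CNN for classifying chromatogram / UV-Vis spectrogram images
# by geographical origin. Architecture, image size, class count, and augmentation
# choices are assumptions; the study does not release its code.
import torch
import torch.nn as nn
from torchvision import transforms

NUM_ORIGINS = 4   # assumed number of geographical origin classes
IMAGE_SIZE = 224  # assumed input resolution for the rendered images

# Data augmentation applied to training images (assumed transforms:
# small horizontal shifts and mild intensity jitter).
train_transform = transforms.Compose([
    transforms.Resize((IMAGE_SIZE, IMAGE_SIZE)),
    transforms.RandomAffine(degrees=0, translate=(0.05, 0.0)),
    transforms.ColorJitter(brightness=0.1, contrast=0.1),
    transforms.ToTensor(),
])

class Origin2DCNN(nn.Module):
    """Small 2D-CNN: three conv blocks followed by a linear classifier."""
    def __init__(self, num_classes: int = NUM_ORIGINS):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(64, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = Origin2DCNN()
    dummy = torch.randn(8, 3, IMAGE_SIZE, IMAGE_SIZE)  # batch of 8 RGB images
    logits = model(dummy)
    print(logits.shape)  # torch.Size([8, 4])
```

In this kind of setup the augmented images would be rendered from the raw UPLC or UV-Vis signals, and the softmax over the final logits would give the predicted origin class; the exact rendering and training schedule used in the study are not specified in the abstract.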
DOI: http://dx.doi.org/10.1016/j.chroma.2025.466014