Importance of specimen pretreatment for the low-level detection of mycobacterial lipoarabinomannan in human serum.

Analyst

The Nano Institute of Utah, University of Utah, Salt Lake City, UT 84112, USA; Department of Chemistry, University of Utah, Salt Lake City, UT 84112, USA; Department of Bioengineering, University of Utah, Salt Lake City, UT 84112, USA; and Department of Pathology, University of Utah, Salt Lake City, UT 84112, USA.

Published: December 2016



Article Abstract

Patient care and prevention of disease outbreaks rely heavily on the performance of diagnostic tests. These tests are typically carried out in serum, urine, and other complex sample matrices, but are often plagued by a number of matrix effects such as nonspecific adsorption and complexation with circulating proteins. This paper demonstrates the importance of sample pretreatment to overcome matrix effects, enabling the low-level detection of a disease marker for tuberculosis (TB). The impact of pretreatment is illustrated by detecting a cell wall component unique to mycobacteria, lipoarabinomannan (LAM). LAM is a major virulence factor in the infectious pathology of Mycobacterium tuberculosis (Mtb) and has been successfully detected in the body fluids of TB-infected individuals; however, its clinical sensitivity - identifying patients with active infection - remains problematic. This and the companion paper show that the detection of LAM in an immunoassay is plagued by its complexation with proteins and other components in serum. Herein, we present the procedures and results from an investigation of several different pretreatment schemes designed to disrupt complexation and thereby improve detection. These sample pretreatment studies, aimed at determining the optimal conditions for complex disruption, were carried out by using a LAM simulant derived from the nonpathogenic M. smegmatis, a mycobacterium often used as a model for Mtb. We have found that a perchloric acid-based pretreatment step improves the ability to detect this simulant by ∼1500× with respect to that in untreated serum. This paper describes the approach to pretreatment, how pretreatment improves the detection of the LAM simulant in human serum, and the results from a preliminary investigation to identify possible contributors to complexation by fractionating serum according to molecular weight. The companion paper applies this pretreatment approach to assays of TB patient samples.
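The ∼1500× figure quoted in the abstract is simply the ratio of detection limits with and without pretreatment. As a minimal sketch of that arithmetic (the LOD values below are invented for illustration; the paper's actual limits of detection are not given here):

```python
# Hypothetical illustration: fold improvement in limit of detection (LOD)
# after sample pretreatment. The numeric values are invented for
# demonstration and do not come from the paper.

def fold_improvement(lod_untreated, lod_pretreated):
    """Ratio of LODs; a value > 1 means pretreatment lowered the
    concentration at which the analyte can still be detected."""
    return lod_untreated / lod_pretreated

# e.g. an LOD of 15 ng/mL in untreated serum versus 0.01 ng/mL after a
# perchloric acid-based pretreatment would correspond to a 1500x gain.
print(fold_improvement(15.0, 0.01))  # → 1500.0
```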


Source
http://dx.doi.org/10.1039/c6an02109c

Publication Analysis

Top Keywords

low-level detection (8)
human serum (8)
matrix effects (8)
pretreatment (8)
sample pretreatment (8)
companion paper (8)
detection lam (8)
lam simulant (8)
serum (6)
detection (5)

Similar Publications

Introduction: Spatial hearing enables both voluntary localization of sound sources and automatic monitoring of the surroundings. The auditory looming bias (ALB), characterized by the prioritized processing of approaching (looming) sounds over receding ones, is thought to serve as an early hazard detection mechanism. The bias could theoretically reflect an adaptation to the low-level acoustic properties of approaching sounds, or alternatively necessitate the sound to be localizable in space.


Background: Polymerase chain reaction (PCR)-based minimal residual disease (MRD) detection is commonly used for core-binding factor acute myeloid leukemia (CBF-AML), but its interpretation in the context of allogeneic hematopoietic stem cell transplantation (allo-HSCT) remains under discussion.

Method: Using Kyoto Stem Cell Transplantation Group registry data, we included 96 patients who underwent allo-HSCT between 2000 and 2019 for CBF-AML.

Results: To assess MRD, quantitative PCR with a GAPDH control was the method most commonly used.


Tetrabromobisphenol A (TBBPA), a widely used flame retardant in textiles and electronics, poses toxicological risks through both environmental and indoor exposures. Biomonitoring studies have detected significant TBBPA levels in prenatal environments, including cord blood, raising concerns about developmental impacts. Using zebrafish as a model, this study addresses critical gaps in understanding how developmental TBBPA exposures perturb regulatory pathways that govern dorsoventral patterning.


Introduction: Low-level viremia (LLV) in HIV infection, defined as detectable but low plasma viral load, is associated with an increased risk of virological failure (VF); however, the mechanisms underlying LLV remain unclear. Monocytes, as potential viral reservoirs, can migrate into tissues and differentiate into tissue-resident macrophage reservoirs, playing a critical role in viral dissemination and potentially driving persistent viremia.

Methods: This study aimed to analyze and compare the molecular characteristics of near-full-length HIV-1 proviral DNA quasispecies from monocytes in three distinct virological response groups: VF, LLV, and virological suppression (VS).


Over 60% of kidney transplant candidates are non-sensitised, while the remaining 40% are sensitised because of previous exposure to human alloantigens through prior transplants, blood transfusions, or pregnancy. Pre-transplant compatibility testing is mandatory prior to renal transplantation to detect the presence of donor-specific antibodies (DSAs), which are associated with early hyperacute/acute and later chronic rejections. Initially, the complement-dependent cytotoxicity crossmatch (CDCXM) was used as the traditional method for detecting preformed DSAs.
