An optimal experimental design is a structured data collection plan aimed at maximizing the amount of information gathered. Determining such a design, however, relies on the assumption that the model structure relating the response to the covariates is known a priori. In practical scenarios, such as dose-response modeling, the form of the model representing the "true" relationship is frequently unknown, although a finite pool of candidate models is typically available.
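In the standard approximate-design notation (assumed here; the excerpt does not spell it out), a design ξ places weights w_i on support points x_i, and, for instance, a D-optimal design maximizes the log-determinant of the information matrix:

\[
M(\xi) = \sum_{i=1}^{n} w_i\, f(x_i)\, f(x_i)^{\top}, \qquad
\xi^{*} = \arg\max_{\xi}\; \log\det M(\xi), \qquad w_i \ge 0,\; \sum_{i} w_i = 1,
\]

where f(x) is the regressor vector of the assumed model; model uncertainty enters because f itself is only known to lie in the finite pool of candidates.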
Background: Cleaning activities are critical in pharmaceutical manufacturing to prevent cross-contamination of Active Pharmaceutical Ingredients (APIs). Traditionally, cleaning validation protocols have focused on production lines. However, there is a growing trend toward extending these protocols to Quality Control (QC) laboratories, encompassing both glassware and stainless-steel equipment.
Phase I clinical trials are the first-in-human studies that primarily focus on the safety profile of drugs. Traditionally, the primary aim of a phase I clinical trial is to establish the maximum tolerated dose and characterize the toxicity profile of the tested agents. As a secondary aim, some phase I studies also seek preliminary efficacy information about the experimental agents.
We study optimal designs for clinical trials in which both the mean response and its variance depend on the treatment, and covariates are included in the response model. Such designs generalize Neyman allocation, commonly used in personalized medicine when external factors may have differing effects on the response depending on the patient subgroup. We develop theoretical results for D-, A- and E-optimal designs and construct semidefinite programming (SDP) formulations that support their numerical computation.
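As a minimal sketch of the SDP-style computation (the linear model f(x) = (1, x), the candidate points, and the treatment-dependent variances below are assumptions for illustration, not values from the paper), a D-optimal allocation can be computed with cvxpy:

import cvxpy as cp
import numpy as np

X = np.array([[1.0, -1.0], [1.0, 0.0], [1.0, 1.0]])  # rows are f(x_i) (assumed)
sigma2 = np.array([1.0, 2.0, 1.5])                   # assumed treatment-dependent variances
w = cp.Variable(3, nonneg=True)                      # allocation weights
# Information matrix: variance-weighted sum of rank-one terms
M = sum((w[i] / sigma2[i]) * np.outer(X[i], X[i]) for i in range(3))
prob = cp.Problem(cp.Maximize(cp.log_det(M)), [cp.sum(w) == 1])
prob.solve()
print(w.value)

The log-det objective makes D-optimality directly expressible as a convex, SDP-representable problem; A- and E-optimality admit analogous formulations.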
The design of continuous thickeners and clarifiers is commonly based on solid flux theory. Batch sedimentation experiments conducted at different solids concentrations still provide useful information for its application. Models for the settling velocity allow the solids flux to be estimated over time, which can, in turn, be used to find the area the units require to achieve a given solids concentration in the clarified stream.
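As a hedged illustration of the flux calculation (the Vesilind settling law v(C) = v0*exp(-k*C) and every parameter value below are assumptions, not values from the paper; the batch flux maximum is used as a crude stand-in for a full limiting-flux analysis):

import numpy as np

v0, k = 7.0, 0.35                  # m/h and m^3/kg, assumed Vesilind parameters
C = np.linspace(0.1, 12.0, 500)    # solids concentration, kg/m^3
G = C * v0 * np.exp(-k * C)        # batch (gravity) solids flux, kg/(m^2 h)

Q_in, C_in = 100.0, 3.0            # feed flow (m^3/h) and feed solids (kg/m^3), assumed
G_limit = G.max()                  # crude stand-in for the true limiting flux
A_required = Q_in * C_in / G_limit # area needed to pass the feed solids load
print(f"Required thickener area ~ {A_required:.1f} m^2")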
The paper studies randomization rules for a sequential two-treatment, two-site clinical trial in Parkinson's disease. An important feature is that we have values of responses and five potential prognostic factors from a sample of 144 patients similar to those to be enrolled in the trial. Analysis of this sample provides a model for trial analysis.
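As one classical example of the kind of rule compared in such studies (Efron's biased coin with p = 2/3; this is a generic illustration, not the paper's specific covariate-adaptive rule):

import random

def efron_biased_coin(n_patients, p=2/3, seed=0):
    # Assign each patient to arm A or B, favoring the under-represented arm.
    rng = random.Random(seed)
    counts = {"A": 0, "B": 0}
    assignments = []
    for _ in range(n_patients):
        if counts["A"] == counts["B"]:
            arm = rng.choice(["A", "B"])          # tie: fair coin
        else:
            under = min(counts, key=counts.get)   # under-represented arm
            over = "B" if under == "A" else "A"
            arm = under if rng.random() < p else over
        counts[arm] += 1
        assignments.append(arm)
    return assignments

print(efron_biased_coin(10))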
Companies regularly face market pressure to develop products faster, but they also need to simultaneously incorporate technological constraints, sustainability trends, and customer requirements into their designs, which calls for systematic procedures. Firms that exploit natural resources and convert them into high-value products are among them. However, the literature on applying such systematic approaches to products of this type remains scarce, as they often require extensive experimental plans involving the testing and optimization of multiple formulations.
Math Biosci Eng, January 2023
The modeling of polymeric reactions is a topic of considerable interest. Gelation reactions, which may result from self-crosslinking or hybrid (agent-based) crosslinking, are examples of particular interest in biomaterials applications. The composition of the polymer entities during the reaction is hard to follow, and their concentration is not a good measure of the system dynamics.
Lot Quality Assurance Sampling (LQAS) plans are widely used for health monitoring purposes. We propose a systematic approach to designing multiple-objective LQAS plans that meet user-specified type I and type II error rates and targets for selected diagnostic accuracy metrics. These metrics may include sensitivity, specificity, positive predictive value, and negative predictive value in populations with high or low anticipated prevalence rates.
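For intuition, a single-stage acceptance-sampling search of this flavor can be written in a few lines (the prevalence thresholds and error targets below are illustrative assumptions, and this simple plan ignores the paper's multiple-objective accuracy metrics):

from scipy.stats import binom

def smallest_plan(p_ok, p_bad, alpha, beta, n_max=500):
    # Accept if at most c "positives" are observed among n sampled units.
    for n in range(1, n_max + 1):
        for c in range(n + 1):
            type1 = 1.0 - binom.cdf(c, n, p_ok)   # reject although quality is acceptable
            type2 = binom.cdf(c, n, p_bad)        # accept although quality is unacceptable
            if type1 <= alpha and type2 <= beta:
                return n, c
    return None

print(smallest_plan(0.05, 0.20, 0.05, 0.10))   # assumed thresholds and error rates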
Chemometr Intell Lab Syst, February 2016
We use mathematical programming tools, such as Semidefinite Programming (SDP) and Nonlinear Programming (NLP)-based formulations to find optimal designs for models used in chemistry and chemical engineering. In particular, we employ local design-based setups in linear models and a Bayesian setup in nonlinear models to find optimal designs. In the latter case, Gaussian Quadrature Formulas (GQFs) are used to evaluate the optimality criterion averaged over the prior distribution for the model parameters.
This paper uses semidefinite programming (SDP) to construct Bayesian optimal designs for nonlinear regression models. The setup extends the SDP formulation of the optimal design problem from linear to nonlinear models. Gaussian quadrature formulas (GQF) are used to compute the expectation in the Bayesian design criterion, such as D-, A- or E-optimality.
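In standard notation (assumed here), the Bayesian D-optimal design problem and its quadrature approximation read:

\[
\xi^{*} = \arg\max_{\xi} \int_{\Theta} \log\det M(\xi, \theta)\, \pi(\theta)\, d\theta
\;\approx\; \arg\max_{\xi} \sum_{k=1}^{K} \omega_k\, \log\det M(\xi, \theta_k),
\]

where the nodes \(\theta_k\) and weights \(\omega_k\) come from a Gaussian quadrature formula matched to the prior \(\pi\).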
T-optimum designs for model discrimination are notoriously difficult to find because of the computational difficulty of solving a nested, two-level optimization problem. Only a handful of analytical T-optimal designs are available for the simplest problems; the rest in the literature are found using specialized numerical procedures tailored to a specific problem. We propose a potentially more systematic and general way of finding T-optimal designs using a Semi-Infinite Programming (SIP) approach.
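In the usual notation (assumed here), with a "true" response \(\eta_t\) and a rival model \(\eta(x, \theta)\), the T-criterion is the inner minimization

\[
T(\xi) = \min_{\theta \in \Theta} \sum_{i} w_i \bigl(\eta_t(x_i) - \eta(x_i, \theta)\bigr)^2,
\]

and maximizing it over \(\xi\) can be recast as the semi-infinite program

\[
\max_{\xi,\,\delta}\; \delta \quad \text{s.t.} \quad \sum_{i} w_i \bigl(\eta_t(x_i) - \eta(x_i, \theta)\bigr)^2 \ge \delta \;\; \forall\, \theta \in \Theta,
\]

which replaces the nested optimization with a single level subject to infinitely many constraints indexed by \(\theta\).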