Deductive semiparametric estimation in Double-Sampling Designs with application to PEPFAR
Non-ignorable dropout is common in studies with long follow-up time, and it
can bias study results unless handled carefully. A double-sampling design
allocates additional resources to pursue a subsample of the dropouts and find
out their outcomes, which can address potential biases due to non-ignorable
dropout. It is desirable to construct semiparametric estimators for the
double-sampling design because of their robustness properties. However,
obtaining such semiparametric estimators remains a challenge due to the
requirement of the analytic form of the efficient influence function (EIF), the
derivation of which can be ad hoc and difficult for the double-sampling design.
Recent work has shown how the derivation of EIF can be made deductive and
computerizable using the functional derivative representation of the EIF in
nonparametric models. This approach, however, requires deriving the mixture of
a continuous distribution and a point mass, which can itself be challenging for
complicated problems such as the double-sampling design. We propose
semiparametric estimators for the survival probability in double-sampling
designs by generalizing the deductive and computerizable estimation approach.
In particular, we propose to build the semiparametric estimators based on a
discretized support structure, which approximates the possibly continuous
observed data distribution and circumvents the derivation of the mixture
distribution. Our approach is deductive in the sense that it is expected to
produce semiparametric locally efficient estimators in a finite number of steps,
without prior knowledge of the EIF. We apply the proposed estimators to estimate the
mortality rate in a double-sampling design component of the President's
Emergency Plan for AIDS Relief (PEPFAR) program. We evaluate the impact of
double-sampling selection criteria on the mortality rate estimates.
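The weighting idea behind the adjustment can be sketched with hypothetical counts (the numbers below are illustrative, not from the PEPFAR data, and the sketch ignores the timing and censoring structure that the proposed semiparametric estimators account for):

```python
# Hypothetical cohort; all counts are illustrative.
n_total = 1000          # enrolled subjects
n_dropout = 300         # dropouts, whose outcomes are initially unknown
deaths_observed = 70    # deaths among the 700 subjects followed to the end

# Double sampling: intensively pursue a random subset of the dropouts.
n_pursued = 100         # sampling fraction 100/300 among dropouts
deaths_pursued = 40     # deaths ascertained among the pursued dropouts

# Naive estimate uses only the subjects who never dropped out.
naive = deaths_observed / (n_total - n_dropout)

# Each pursued dropout stands in for 1/sampling_fraction dropouts
# (a Horvitz-Thompson-style weight).
weight = n_dropout / n_pursued
adjusted = (deaths_observed + weight * deaths_pursued) / n_total
```

Here the naive mortality rate is 0.10 while the adjusted rate is 0.19: when dropouts die at a higher rate than those retained, ignoring them biases the estimate downward.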
Polydesigns and Causal Inference
In an increasingly common class of studies, the goal is to evaluate causal effects of treatments that are only partially controlled by the investigator. In such studies there are two conflicting features: (1) a model on the full cohort design and data can identify the causal effects of interest, but can be sensitive to extreme regions of that design's data, where model specification can have more impact; and (2) models on a reduced design (i.e., a subset of the full data), e.g., a conditional likelihood on matched subsets of data, can avoid such sensitivity, but do not generally identify the causal effects. We propose a framework to assess how inference is sensitive to designs by exploring combinations of both the full and reduced designs. We show that using such a polydesign framework generates a rich class of methods that can identify causal effects and that can also be more robust to model specification than methods using only the full design. We discuss the implementation of polydesign methods, and provide an illustration in the evaluation of a Needle Exchange Program.
Tuning the electrical conductivity of Pt-containing granular metals by postgrowth electron irradiation
We have fabricated Pt-containing granular metals by focused electron beam
induced deposition from a precursor gas. The granular
metals are made of platinum nanocrystallites embedded in a carbonaceous matrix.
We have exposed the as-grown nanocomposites to low-energy electron beam
irradiation and measured the electrical conductivity as a function of
the irradiation dose. Postgrowth electron beam irradiation transforms the
matrix microstructure and thus the strength of the tunneling coupling between
Pt nanocrystallites. For as-grown samples (weak tunnel coupling regime) we find
that the temperature dependence of the electrical conductivity follows the
stretched exponential behavior characteristic of the correlated variable-range
hopping transport regime. For briefly irradiated samples (strong tunnel
coupling regime) the electrical conductivity is tuned across the
metal-insulator transition. For long-time irradiated samples the electrical
conductivity behaves like that of a metal. To further analyze the changes in
microstructure as a function of the electron irradiation dose, we have carried
out transmission electron microscopy (TEM), micro-Raman spectroscopy, and
atomic force microscopy (AFM) investigations. TEM images reveal that the
crystallites in long-time irradiated samples are larger than those in as-grown
samples, whereas we find no evidence of microstructural changes in briefly
irradiated samples. Micro-Raman spectra show that, with increasing irradiation
dose, the matrix follows a graphitization trajectory between amorphous carbon
and nanocrystalline graphite. Finally, AFM measurements reveal a reduction of
the sample volume with increasing irradiation time, which we attribute to the
removal of carbon molecules.
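The correlated variable-range hopping law referenced above has the stretched exponential form sigma(T) = sigma_0 exp[-(T_0/T)^(1/2)]. A minimal sketch (with made-up parameter values, not the paper's measurements) shows that plotting ln sigma against T^(-1/2) linearizes the law, so the hopping parameters can be read off a straight-line fit:

```python
import numpy as np

# Illustrative hopping parameters; not fitted to the paper's data.
sigma0, T0 = 500.0, 40.0
T = np.linspace(10.0, 300.0, 50)            # temperature grid in K
sigma = sigma0 * np.exp(-np.sqrt(T0 / T))   # correlated VRH conductivity

# ln(sigma) = ln(sigma0) - sqrt(T0) * T**-0.5 is linear in T**-0.5,
# so a first-degree polynomial fit recovers both parameters.
slope, intercept = np.polyfit(T ** -0.5, np.log(sigma), 1)
T0_fit = slope ** 2            # slope = -sqrt(T0)
sigma0_fit = np.exp(intercept)
```

In practice one fits measured conductivity data this way; departure from a straight line in T^(-1/2) is what signals the crossover toward metallic behavior.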
Designs in Partially Controlled Studies: Messages from a Review
The ability to evaluate effects of factors on outcomes is increasingly important for a class of studies that control some but not all of the factors. Although important advances have been made in methods of analysis for such partially controlled studies, work on designs for such studies has been relatively limited. To help understand why, we review the main designs that have been used for such partially controlled studies. Based on the review, we give two complementary reasons that explain the limited work on such designs, and suggest a new direction in this area.
3D ultrastructural organization of whole Chlamydomonas reinhardtii cells studied by nanoscale soft x-ray tomography
The complex architecture of their structural elements and compartments is a hallmark of eukaryotic cells. The creation of high resolution models of whole cells has been limited by the relatively low resolution of conventional light microscopes and the requirement for ultrathin sections in transmission electron microscopy. We used soft x-ray tomography to study the 3D ultrastructural organization of whole cells of the unicellular green alga Chlamydomonas reinhardtii at unprecedented spatial resolution. Intact frozen hydrated cells were imaged using the natural x-ray absorption contrast of the sample, without any staining. We applied different fiducial-based and fiducial-less alignment procedures for the 3D reconstructions. The reconstructed 3D volumes of the cells show features down to 30 nm in size. The whole-cell tomograms reveal ultrastructural details such as nuclear envelope membranes, thylakoids, the basal apparatus, and flagellar microtubule doublets. In addition, the x-ray tomograms provide quantitative data on the cell architecture. Nanoscale soft x-ray tomography is therefore a valuable new tool for numerous qualitative and quantitative applications in plant cell biology.
Deductive Derivation and Computerization of Compatible Semiparametric Efficient Estimation
Researchers often seek robust inference for a parameter through semiparametric estimation. Efficient semiparametric estimation currently requires theoretical derivation of the efficient influence function (EIF), which can be a challenging and time-consuming task. If this task can be computerized, it can save substantial human effort, which can be transferred, for example, to the design of new studies. Although the EIF is, in principle, a derivative, simple numerical differentiation to calculate the EIF by a computer masks the EIF's functional dependence on the parameter of interest. For this reason, the standard approach to obtaining the EIF has been the theoretical construction of the space of scores under all possible parametric submodels. This process currently depends on the correctness of conjectures about these spaces, and on the correct verification of such conjectures. The guessing of such conjectures, though successful in some problems, is a nondeductive process, i.e., it is not guaranteed to succeed (e.g., it is not computerizable), and the verification of conjectures is generally susceptible to mistakes. We propose a method that can deductively produce semiparametric locally efficient estimators. The proposed method is computerizable, meaning that it does not need either conjecturing for, or otherwise theoretically deriving, the functional form of the EIF, and it is guaranteed to produce the result. The method is demonstrated through an example.
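The numerical-derivative idea can be illustrated on a toy functional. For psi(P) = (E[Y])^2 on a discretized support, the EIF is known to be 2 E[Y] (Y - E[Y]); a central-difference Gateaux derivative of the plug-in functional, perturbing the empirical distribution toward a point mass at each observation, reproduces it (the functional and the data below are illustrative, not the paper's example):

```python
import numpy as np

def psi(p, support):
    """Plug-in functional (E[Y])^2 for a discrete distribution p on `support`."""
    return (p @ support) ** 2

y = np.array([1.0, 2.0, 4.0, 5.0])   # observed data; also the discretized support
n = len(y)
p_hat = np.full(n, 1.0 / n)          # empirical distribution

eps = 1e-4
eif = np.empty(n)
for i in range(n):
    delta = np.zeros(n)
    delta[i] = 1.0                   # point mass at y[i] on the support
    # Central-difference Gateaux derivative along the path toward delta.
    up = psi((1 - eps) * p_hat + eps * delta, y)
    down = psi((1 + eps) * p_hat - eps * delta, y)
    eif[i] = (up - down) / (2 * eps)

analytic = 2 * y.mean() * (y - y.mean())   # known EIF of (E[Y])^2
one_step = psi(p_hat, y) + eif.mean()      # one-step (locally efficient) update
```

The numerical and analytic influence functions agree, and the mean-zero property of the EIF makes the one-step correction vanish at the empirical distribution itself; in a genuine estimation problem the plug-in uses estimated nuisances, and the correction term is what restores efficiency.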
Choosing profile double-sampling designs for survival estimation with application to PEPFAR evaluation
Most studies that follow subjects over time are challenged by having some subjects who drop out. Double sampling is a design that selects and devotes resources to intensively pursue and find a subset of these dropouts, then uses the data obtained from this subset to adjust naïve estimates, which are potentially biased by the dropout. Existing methods to estimate survival from double sampling assume a random sample of dropouts. In limited-resource settings, however, generating accurate estimates using a minimum of resources is important. We propose using double-sampling designs that oversample certain profiles of dropouts as more efficient alternatives to random designs. First, we develop a framework to estimate the survival function under these profile double-sampling designs. We then derive the precision of these designs as a function of the rule for selecting different profiles, in order to identify more efficient designs. We illustrate using data from the United States President's Emergency Plan for AIDS Relief-funded HIV care and treatment program in western Kenya. Our results show why and how more efficient designs should oversample patients with shorter dropout times. Further, our work suggests generalizable practice for more efficient double-sampling designs, which can help maximize efficiency in resource-limited settings.
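The efficiency argument can be sketched with a stratified version of the double sample. Under simple random sampling within each dropout profile, the approximate variance of the estimated number of deaths among dropouts is a sum of per-stratum terms, and shifting a fixed pursuit budget toward a larger, higher-variance profile (here, early dropouts) lowers it. All strata, rates, and allocations below are hypothetical, and this textbook survey-sampling formula stands in for the paper's semiparametric precision calculation:

```python
def strat_var(strata, alloc):
    """Approximate variance of the Horvitz-Thompson estimate of the death
    count, under stratified simple random sampling without replacement."""
    var = 0.0
    for (n_h, p_h), m_h in zip(strata, alloc):
        var += n_h ** 2 * (1 - m_h / n_h) * p_h * (1 - p_h) / m_h
    return var

# (stratum size, death rate): early dropouts are more numerous and more
# uncertain than late dropouts in this hypothetical cohort.
strata = [(200, 0.50), (100, 0.05)]

v_proportional = strat_var(strata, [80, 40])   # allocate 120 pursuits pro rata
v_oversampled = strat_var(strata, [100, 20])   # oversample early dropouts
```

With these numbers the oversampled allocation has variance 69.0 versus 82.125 for the proportional one; which profile deserves the extra budget depends on stratum sizes and death rates, which is exactly what a design calculation must weigh.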
Principal Stratification Designs to Estimate Input Data Missing Due to Death
We consider studies of cohorts of individuals after a critical event, such as an injury, with the following characteristics. First, the studies are designed to measure “input” variables, which describe the period before the critical event, and to characterize the distribution of the input variables in the cohort. Second, the studies are designed to measure “output” variables, primarily mortality after the critical event, and to characterize the predictive (conditional) distribution of mortality given the input variables in the cohort. Such studies often possess the complication that the input data are missing for those who die shortly after the critical event, because the data collection takes place after the event. Standard methods of dealing with the missing inputs, such as imputation or weighting methods based on an assumption of ignorable missingness, are known to be generally invalid when the missingness of inputs is nonignorable, that is, when the distribution of the inputs differs between those who die and those who live. To address this issue, we propose a novel design that obtains and uses information on an additional key variable: a treatment or externally controlled variable which, if set at its “effective” level, could have prevented the death of those who died. We show that the new design can be used to draw valid inferences for the marginal distribution of inputs in the entire cohort, and for the conditional distribution of mortality given the inputs, also in the entire cohort, even under nonignorable missingness. The crucial framework that we use is principal stratification based on the potential outcomes, here mortality under both levels of treatment. We also show, using illustrative preliminary injury data, that our approach can reveal results that are more reasonable than those of standard methods, in relatively dramatic ways.
Thus, our approach suggests that the routine collection of data on variables that could be used as possible treatments in such studies of inputs and mortality should become common practice.
Estimating effects by combining instrumental variables with case-control designs: the role of principal stratification
The instrumental variable framework is commonly used in the estimation of causal effects from cohort samples. For more efficient designs such as the case-control study, however, combining instrumental variables with complex sampling designs requires new methodological consideration. As Mendelian randomization studies become increasingly prevalent and the cost of genotyping and expression data can be high, the analysis of data gathered from more cost-effective sampling designs is of prime interest. We show that the standard instrumental variable analysis is not applicable to the case-control design and can lead to erroneous estimation and inference. We also propose a method based on principal stratification for the analysis of data arising from the combination of case-control sampling and an instrumental variable design, and illustrate it with a study in oncology.
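A deterministic toy example shows the failure mode. With a binary instrument Z, exposure X, and outcome Y, the Wald estimator is the risk difference in Y divided by the risk difference in X across instrument arms; computing it on a full cohort and then on a case-control subset (all cases kept, controls downsampled by half) gives different answers, because conditioning selection on Y distorts both the numerator and the denominator. All cell counts below are hypothetical:

```python
# Hypothetical cohort cell counts keyed by (z, x, y).
counts = {
    (1, 1, 1): 240, (1, 1, 0): 360, (1, 0, 1): 60,  (1, 0, 0): 340,
    (0, 1, 1): 70,  (0, 1, 0): 130, (0, 0, 1): 130, (0, 0, 0): 670,
}

def wald(cells):
    """Wald IV estimator from (z, x, y) cell counts."""
    def mean_given_z(idx, z):
        arm = {k: v for k, v in cells.items() if k[0] == z}
        return sum(v for k, v in arm.items() if k[idx] == 1) / sum(arm.values())
    num = mean_given_z(2, 1) - mean_given_z(2, 0)   # effect of Z on Y
    den = mean_given_z(1, 1) - mean_given_z(1, 0)   # effect of Z on X
    return num / den

wald_cohort = wald(counts)

# Case-control sampling: keep every case (y = 1), half of the controls (y = 0).
case_control = {k: (v if k[2] == 1 else v / 2) for k, v in counts.items()}
wald_cc = wald(case_control)
```

Here the cohort Wald estimate is 0.25 while the case-control one is about 0.304; the size and direction of the distortion depend on the joint distribution, which is why a principled analysis of the combined design is needed rather than a simple correction.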
