Swabs to genomes: A comprehensive workflow
© 2015 Dunitz et al. The sequencing, assembly, and basic analysis of microbial genomes, once a painstaking and expensive undertaking, has become much easier for research labs with access to standard molecular biology and computational tools. However, there is a confusing variety of options available for DNA library preparation and sequencing, and inexperience with bioinformatics can pose a significant barrier to entry for many who may be interested in microbial genomics. The objective of the present study was to design, test, troubleshoot, and publish a simple, comprehensive workflow from the collection of an environmental sample (a swab) to a published microbial genome, empowering even a lab or classroom with limited resources and bioinformatics experience to perform it
Draft genome sequence of Kocuria sp. strain UCD-OTCP (phylum Actinobacteria)
© 2013 Coil et al. Here, we present the draft genome of Kocuria sp. strain UCD-OTCP, a member of the phylum Actinobacteria, isolated from a restaurant chair cushion. The assembly comprises 3,791,485 bp (G+C content of 73%) in 68 scaffolds
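The G+C content reported for such a draft assembly is simply the fraction of G and C bases over all unambiguous bases. A minimal sketch of that statistic (the sequence below is a toy scaffold, not the UCD-OTCP assembly):

```python
def gc_content(seq: str) -> float:
    """Fraction of G+C bases in a DNA sequence; ambiguous bases (N, etc.) are ignored."""
    seq = seq.upper()
    gc = sum(seq.count(b) for b in "GC")
    acgt = sum(seq.count(b) for b in "ACGT")
    return gc / acgt if acgt else 0.0

# Toy example: 8 of 10 bases are G or C
print(gc_content("GGCCGCATGC"))  # → 0.8
```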
Quantifying sources of methane using light alkanes in the Los Angeles basin, California
Methane (CH4), carbon dioxide (CO2), carbon monoxide (CO), and C2-C5 alkanes were measured throughout the Los Angeles (L.A.) basin in May and June 2010. We use these data to show that the emission ratios of CH4/CO and CH4/CO2 in the L.A. basin are larger than expected from population-apportioned bottom-up state inventories, consistent with previously published work. We use experimentally determined CH4/CO and CH4/CO2 emission ratios in combination with annual State of California CO and CO2 inventories to derive a yearly emission rate of CH4 to the L.A. basin. We further use the airborne measurements to directly derive CH4 emission rates from dairy operations in Chino, and from the two largest landfills in the L.A. basin, and show these sources are accurately represented in the California Air Resources Board greenhouse gas inventory for CH4. We then use measurements of C2-C5 alkanes to quantify the relative contribution of other CH4 sources in the L.A. basin, with results differing from those of previous studies. The atmospheric data are consistent with the majority of CH4 emissions in the region coming from fugitive losses from natural gas in pipelines and urban distribution systems and/or geologic seeps, as well as landfills and dairies. The local oil and gas industry also provides a significant source of CH4 in the area. The addition of CH4 emissions from natural gas pipelines and urban distribution systems and/or geologic seeps and from the local oil and gas industry is sufficient to account for the differences between the top-down and bottom-up CH4 inventories identified in previously published work.

Key Points:
- Top-down estimates of CH4 emissions in L.A. are greater than inventory estimates
- Estimates of CH4 emissions from landfills in L.A. agree with the CARB inventory
- Pipeline natural gas and/or seeps, and landfills are main sources of CH4 in L.A.

©2013. American Geophysical Union. All Rights Reserved
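The top-down scaling the abstract describes amounts to multiplying an observed atmospheric enhancement ratio by a bottom-up inventory for the reference species, with a molar-mass conversion. A minimal sketch of that arithmetic; the ratio and CO inventory below are invented for illustration and are not values from the study:

```python
# Illustrative top-down CH4 estimate: observed CH4/CO molar enhancement
# ratio times a bottom-up CO inventory, converted from moles to mass.
# All numeric inputs here are hypothetical, not the paper's results.
M_CH4, M_CO = 16.04, 28.01        # molar masses, g/mol
ratio_molar = 0.66                 # hypothetical CH4/CO enhancement (mol/mol)
co_inventory_gg = 600.0            # hypothetical annual CO emissions, Gg/yr

ch4_emissions_gg = ratio_molar * co_inventory_gg * (M_CH4 / M_CO)
print(round(ch4_emissions_gg, 1))  # Gg CH4 per year
```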
Revisiting protein aggregation as pathogenic in sporadic Parkinson and Alzheimer diseases.
The gold standard for a definitive diagnosis of Parkinson disease (PD) is the pathologic finding of aggregated α-synuclein into Lewy bodies and for Alzheimer disease (AD) aggregated amyloid into plaques and hyperphosphorylated tau into tangles. Implicit in this clinicopathologic-based nosology is the assumption that pathologic protein aggregation at autopsy reflects pathogenesis at disease onset. While these aggregates may in exceptional cases be on a causal pathway in humans (e.g., aggregated α-synuclein in SNCA gene multiplication or aggregated β-amyloid in APP mutations), their near universality at postmortem in sporadic PD and AD suggests they may alternatively represent common outcomes from upstream mechanisms or compensatory responses to cellular stress in order to delay cell death. These 3 conceptual frameworks of protein aggregation (pathogenic, epiphenomenon, protective) are difficult to resolve because of the inability to probe brain tissue in real time. Whereas animal models, in which neither PD nor AD occur in natural states, consistently support a pathogenic role of protein aggregation, indirect evidence from human studies does not. We hypothesize that (1) current biomarkers of protein aggregates may be relevant to common pathology but not to subgroup pathogenesis and (2) disease-modifying treatments targeting oligomers or fibrils might be futile or deleterious because these proteins are epiphenomena or protective in the human brain under molecular stress. Future precision medicine efforts for molecular targeting of neurodegenerative diseases may require analyses not anchored on current clinicopathologic criteria but instead on biological signals generated from large deeply phenotyped aging populations or from smaller but well-defined genetic-molecular cohorts
Sequential Deliberation for Social Choice
In large scale collective decision making, social choice is a normative study
of how one ought to design a protocol for reaching consensus. However, in
instances where the underlying decision space is too large or complex for
ordinal voting, standard voting methods of social choice may be impractical.
How then can we design a mechanism - preferably decentralized, simple,
scalable, and not requiring any special knowledge of the decision space - to
reach consensus? We propose sequential deliberation as a natural solution to
this problem. In this iterative method, successive pairs of agents bargain over
the decision space using the previous decision as a disagreement alternative.
We describe the general method and analyze the quality of its outcome when the
space of preferences defines a median graph. We show that sequential
deliberation finds a 1.208-approximation to the optimal social cost on such
graphs, coming very close to this value with only a small constant number of
agents sampled from the population. We also show lower bounds on simpler
classes of mechanisms to justify our design choices. We further show that
sequential deliberation is ex-post Pareto efficient and has truthful reporting
as an equilibrium of the induced extensive form game. We finally show that for
general metric spaces, the second moment of the distribution of social cost
of the outcomes produced by sequential deliberation is also bounded
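On a line, one of the simplest median graphs, the pairwise bargaining step can be modeled as taking the median of the two agents' ideal points and the disagreement alternative. The sketch below is a toy illustration of the iterative mechanism under that assumption, not the paper's formal construction; the ideal points are hypothetical:

```python
import random

def median3(a, b, o):
    """Median of three points on a line: a natural bargaining outcome
    when the 1-D decision space is viewed as a median graph."""
    return sorted((a, b, o))[1]

def sequential_deliberation(ideals, rounds=50, seed=0):
    """Toy sketch: each round, two sampled agents bargain over the line,
    using the previous outcome o as the disagreement alternative."""
    rng = random.Random(seed)
    o = rng.choice(ideals)          # arbitrary starting alternative
    for _ in range(rounds):
        a, b = rng.sample(ideals, 2)
        o = median3(a, b, o)        # modeled bargaining outcome
    return o

ideals = [0, 1, 2, 8, 9]            # hypothetical agents' ideal points
print(sequential_deliberation(ideals))
```

With enough rounds the outcome concentrates near the population median, the social-cost minimizer on a line, which is the intuition behind the approximation guarantee.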
A Revised Design for Microarray Experiments to Account for Experimental Noise and Uncertainty of Probe Response
Background
Although microarrays are widely used analysis tools in biomedical research, they are known to yield noisy output that usually requires experimental confirmation. To tackle this problem, many studies have developed rules for optimizing probe design and devised complex statistical tools to analyze the output. However, less emphasis has been placed on systematically identifying the noise component as part of the experimental procedure. One source of noise is the variance in probe binding, which can be assessed by replicating array probes. The second source is poor probe performance, which can be assessed by calibrating the array based on a dilution series of target molecules. Using model experiments for copy number variation and gene expression measurements, we investigate here a revised design for microarray experiments that addresses both of these sources of variance.
Results
Two custom arrays were used to evaluate the revised design: one based on 25 mer probes from an Affymetrix design and the other based on 60 mer probes from an Agilent design. To assess experimental variance in probe binding, all probes were replicated ten times. To assess probe performance, the probes were calibrated using a dilution series of target molecules and the signal response was fitted to an adsorption model. We found that significant variance of the signal could be controlled by averaging across probes and removing probes that are nonresponsive or poorly responsive in the calibration experiment. Taking this into account, one can obtain a more reliable signal with the added option of obtaining absolute rather than relative measurements.
Conclusion
The assessment of technical variance within the experiments, combined with the calibration of probes, allows poorly responding probes to be removed and yields more reliable signals for the remaining ones. Once an array is properly calibrated, absolute quantification of signals becomes straightforward, alleviating the need for normalization and reference hybridizations
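The calibration step the abstract describes fits the dilution-series signal to an adsorption model; a common choice is the Langmuir isotherm S = S_max·c/(K + c), which linearizes as 1/S = 1/S_max + (K/S_max)·(1/c), so ordinary least squares on the reciprocals recovers both parameters. A minimal sketch under that assumption (the dilution series below is synthetic, and the paper's exact model may differ):

```python
# Hedged sketch: calibrate one probe from a dilution series by fitting the
# Langmuir adsorption model S = S_max * c / (K + c) via its linearization
# 1/S = 1/S_max + (K/S_max) * (1/c). Data below are synthetic.
def fit_langmuir(concs, signals):
    xs = [1.0 / c for c in concs]
    ys = [1.0 / s for s in signals]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    s_max = 1.0 / intercept
    k = slope * s_max
    return s_max, k

# Synthetic dilution series for a responsive probe: S_max = 1000, K = 2.0
true_smax, true_k = 1000.0, 2.0
concs = [0.5, 1, 2, 4, 8, 16]
signals = [true_smax * c / (true_k + c) for c in concs]

s_max, k = fit_langmuir(concs, signals)
print(round(s_max), round(k, 2))
```

A probe whose dilution series fits this model poorly (or whose fitted S_max is near zero) would be flagged as nonresponsive and dropped, as in the revised design.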
Predicting disease progression in progressive supranuclear palsy in multicenter clinical trials
INTRODUCTION: Clinical and MRI measurements can track disease progression in PSP, but many have not been extensively evaluated in multicenter clinical trials. We identified optimal measures to capture clinical decline and predict disease progression in multicenter PSP trials. METHODS: Longitudinal clinical rating scales, neuropsychological test scores, and volumetric MRI data from an international, phase 2/3 clinical trial of davunetide for PSP (intent to treat population, n = 303) were used to identify measurements with the largest effect size, the strongest correlation with clinical change, and the best ability to predict dropout or clinical decline over one year as measured by the PSP Rating Scale (PSPRS). RESULTS: Baseline cognition as measured by the Repeatable Battery for Assessing Neuropsychological Status (RBANS) was associated with attrition, but had only a small effect. PSPRS and Clinical Global Impression (CGI) had the largest effect size for measuring change. Annual changes in CGI, RBANS, color trails, and MRI midbrain and ventricular volumes were most strongly correlated with annual PSPRS and had the largest effect sizes for detecting annual change. At baseline, shorter disease duration, more severe depression, and lower performance on RBANS and executive function tests were associated with faster worsening of the PSPRS in completers. With dropouts included, SEADL, RBANS, and executive function tests had a significant effect on the PSPRS trajectory of change. CONCLUSION: Baseline cognitive status and mood influence the rate of disease progression in PSP. Multiple clinical, neuropsychological, and volumetric MRI measurements are sensitive to change over one year in PSP and appropriate for use in multicenter clinical trials
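One common "effect size for detecting annual change" in longitudinal trials is the standardized response mean: the mean one-year change divided by the standard deviation of that change. A minimal sketch of that statistic; the PSPRS-like change scores below are invented for illustration and are not data from the davunetide trial:

```python
import statistics

def standardized_response_mean(deltas):
    """SRM: mean within-subject change divided by the SD of the change.
    Larger |SRM| means the measure detects change more sensitively."""
    return statistics.mean(deltas) / statistics.stdev(deltas)

# Hypothetical 1-year PSPRS change scores (points/year), one per patient
annual_psprs_change = [12, 9, 14, 10, 8, 13, 11, 10]
print(round(standardized_response_mean(annual_psprs_change), 2))
```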
The role of TcdB and TccC subunits in secretion of the photorhabdus Tcd toxin complex
The Toxin Complex (TC) is a large multi-subunit toxin encoded by a range of bacterial pathogens. The best-characterized examples are from the insect pathogens Photorhabdus, Xenorhabdus and Yersinia. They consist of three large protein subunits, designated A, B and C, that assemble in a 5:1:1 stoichiometry. Oral toxicity to a range of insects means that some have the potential to be developed as pest control technology. The three subunit proteins do not encode any recognisable export sequences, and as such little progress has been made in understanding their secretion. We have developed heterologous TC production and secretion models in E. coli and used them to ascribe functions to different domains of the crucial B+C sub-complex. We have determined that the B and C subunits use a secretion mechanism that is either encoded by the proteins themselves or employs an as-yet-undefined system common to laboratory strains of E. coli. We demonstrate that both the N-terminal domains of the B and C subunits are required for secretion of the whole complex. We propose a model whereby the N-terminus of the C-subunit toxin exports the B+C sub-complex across the inner membrane while that of the B-subunit allows passage across the outer membrane. We also demonstrate that, even in the absence of the B-subunit, the C-subunit can facilitate secretion of the larger A-subunit. The recognition of this novel export system is likely to be of importance to future protein secretion studies. Finally, the identification of homologues of B and C subunits in diverse bacterial pathogens, including Burkholderia and Pseudomonas, suggests that these toxins are likely to be important in a range of different hosts, including man
Dystonia and Parkinson's disease: What is the relationship?
Dystonia and Parkinson's disease are closely linked disorders with many pathophysiological overlaps. Dystonia is seen in 30% or more of patients with PD and can sometimes precede overt parkinsonism. The response of early dystonia to the introduction of dopamine replacement therapy (levodopa, dopamine agonists) is variable, and dystonia commonly occurs in PD patients following levodopa initiation. Similarly, parkinsonism is commonly seen in patients with mutations in various DYT genes, including those involved in the dopamine synthesis pathway. Pharmacological blockade of dopamine receptors can cause both tardive dystonia and parkinsonism, and these movement disorder syndromes can occur in many other neurodegenerative, genetic, toxic and metabolic diseases. Pallidotomy in the past and, currently, deep brain stimulation (DBS), largely targeting the GPi, are effective treatment options for both dystonia and parkinsonism. However, the physiological mechanisms underlying the response of these two different movement disorder syndromes are poorly understood. Interestingly, DBS for PD can cause dystonia such as blepharospasm, and bilateral pallidal DBS for dystonia can result in features of parkinsonism. Advances in our understanding of these responses may provide better explanations for the relationship between dystonia and Parkinson's disease
Phylogeny of Bacterial and Archaeal Genomes Using Conserved Genes: Supertrees and Supermatrices
Over 3000 microbial (bacterial and archaeal) genomes have been made publicly available to date, providing an unprecedented opportunity to examine evolutionary genomic trends and offering valuable reference data for a variety of other studies such as metagenomics. The utility of these genome sequences is greatly enhanced when we have an understanding of how they are phylogenetically related to each other. Therefore, we here describe our efforts to reconstruct the phylogeny of all available bacterial and archaeal genomes. We identified 24 single-copy, ubiquitous genes suitable for this phylogenetic analysis. We used two approaches to combine the data for the 24 genes. First, we concatenated alignments of all genes into a single alignment from which a Maximum Likelihood (ML) tree was inferred using RAxML. Second, we used a relatively new approach to combining gene data, Bayesian Concordance Analysis (BCA), as implemented in the BUCKy software, in which the results of 24 single-gene phylogenetic analyses are used to generate a "primary concordance" tree. A comparison of the concatenated ML tree and the primary concordance (BUCKy) tree reveals that the two approaches give similar results, relative to a phylogenetic tree inferred from the 16S rRNA gene. After comparing the results and the methods used, we conclude that the current best approach for generating a single phylogenetic tree, suitable for use as a reference phylogeny for comparative analyses, is to perform a maximum likelihood analysis of a concatenated alignment of conserved, single-copy genes. © 2013
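The concatenation step in the first approach builds a "supermatrix": per-gene alignments are joined end to end, and taxa missing a gene are padded with gaps so every row keeps the same length. A minimal sketch of that operation; the gene alignments and taxon names below are toy examples, not the study's 24 marker genes:

```python
# Hedged sketch of supermatrix construction: join per-gene alignments
# (taxon -> aligned sequence) end to end, gap-padding missing taxa.
def concatenate(gene_alignments):
    taxa = sorted({t for aln in gene_alignments for t in aln})
    supermatrix = {t: "" for t in taxa}
    for aln in gene_alignments:
        length = len(next(iter(aln.values())))  # all rows in one gene are equal length
        for t in taxa:
            supermatrix[t] += aln.get(t, "-" * length)
    return supermatrix

genes = [
    {"taxonA": "ATGC", "taxonB": "ATGG"},   # gene 1 (taxonC missing)
    {"taxonA": "CCTA", "taxonC": "CGTA"},   # gene 2 (taxonB missing)
]
sm = concatenate(genes)
print(sm["taxonA"], sm["taxonB"], sm["taxonC"])
# → ATGCCCTA ATGG---- ----CGTA
```

The resulting single alignment is what an ML program such as RAxML would then analyze, typically with a partition per gene.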
