Fludarabine as a cost-effective adjuvant to enhance engraftment of human normal and malignant hematopoiesis in immunodeficient mice
There is still an unmet need for xenotransplantation models that efficiently recapitulate normal and malignant human hematopoiesis. A number of strategies exist to generate humanized mice, and specific protocols, including techniques to optimize the cytokine environment of recipient mice and drugs alternative or complementary to standard conditioning regimens, can be modulated substantially. Unfortunately, the high costs associated with sophisticated mouse models may limit their application to studies that require an extensive experimental design. Here, using an affordable and convenient method, we demonstrate that the administration of fludarabine (Fludara™) promotes extensive and rapid engraftment of normal human hematopoiesis in immunodeficient mice. Quantification of human CD45+ cells in bone marrow revealed an approximately 102-fold increase in mice conditioned with irradiation plus fludarabine. Engrafted cells in the bone marrow included hematopoietic stem cells as well as myeloid and lymphoid cells. Moreover, this model proved sufficient for robust reconstitution of malignant myeloid hematopoiesis, permitting primary acute myeloid leukemia cells to engraft as early as 8 weeks after transplant. Overall, these results present a novel and affordable model for engraftment of normal and malignant human hematopoiesis in immunodeficient mice.
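As a back-of-the-envelope illustration of how such an engraftment fold change is obtained from flow-cytometry readouts, the sketch below compares the percentage of human CD45+ cells in bone marrow between two conditioning groups. The numbers are hypothetical placeholders, not the study's data.

```python
# Minimal sketch (hypothetical values): fold change in human CD45+ engraftment
# between conditioning regimens, as quantified by flow cytometry of bone marrow.

def fold_change(treated_pct, control_pct):
    """Ratio of %hCD45+ cells in treated vs. control bone marrow."""
    return treated_pct / control_pct

# Illustrative numbers only; the study reports an ~102-fold higher engraftment
# with irradiation plus fludarabine versus the comparison conditioning.
control = 0.05   # % human CD45+ cells, comparison group (hypothetical)
treated = 5.1    # % human CD45+ cells, irradiation + fludarabine (hypothetical)
print(f"Engraftment fold change: {fold_change(treated, control):.0f}x")
```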
Belowground DNA-based techniques: untangling the network of plant root interactions
Design of a five-axis ultra-precision micro-milling machine—UltraMill. Part 1: Holistic design approach, design considerations and specifications
High-accuracy three-dimensional miniature components and microstructures are increasingly in demand in the electro-optics, automotive, biotechnology, aerospace and information-technology industries. A rational approach to mechanical micro-machining is to develop ultra-precision machines with small footprints. In Part 1 of this two-part paper, the state of the art of ultra-precision machines with micro-machining capability is critically reviewed. The design considerations and specifications of a five-axis ultra-precision micro-milling machine, UltraMill, are discussed. Three prioritised design issues, namely motion accuracy, dynamic stiffness and thermal stability, form the basis of the holistic design approach for UltraMill. This approach has been applied to the development of key machine components and their integration so as to achieve high accuracy and nanometre-level surface finish.
Evolutionary connectionism: algorithmic principles underlying the evolution of biological organisation in evo-devo, evo-eco and evolutionary transitions
The mechanisms of variation, selection and inheritance, on which evolution by natural selection depends, are not fixed over evolutionary time. Current evolutionary biology is increasingly focussed on understanding how the evolution of developmental organisations modifies the distribution of phenotypic variation, the evolution of ecological relationships modifies the selective environment, and the evolution of reproductive relationships modifies the heritability of the evolutionary unit. The major transitions in evolution, in particular, involve radical changes in developmental, ecological and reproductive organisations that instantiate variation, selection and inheritance at a higher level of biological organisation. However, current evolutionary theory is poorly equipped to describe how these organisations change over evolutionary time and especially how that results in adaptive complexes at successive scales of organisation (the key problem is that evolution is self-referential, i.e. the products of evolution change the parameters of the evolutionary process). Here we first reinterpret the central open questions in these domains from a perspective that emphasises the common underlying themes. We then synthesise the findings from a developing body of work that is building a new theoretical approach to these questions by converting well-understood theory and results from models of cognitive learning. Specifically, connectionist models of memory and learning demonstrate how simple incremental mechanisms, adjusting the relationships between individually simple components, can produce organisations that exhibit complex system-level behaviours and improve the adaptive capabilities of the system. We use the term "evolutionary connectionism" to recognise that, by functionally equivalent processes, natural selection acting on the relationships within and between evolutionary entities can result in organisations that produce complex system-level behaviours in evolutionary systems and modify the adaptive capabilities of natural selection over time. We review the evidence supporting the functional equivalences between the domains of learning and of evolution, and discuss the potential for this to resolve conceptual problems in our understanding of the evolution of developmental, ecological and reproductive organisations and, in particular, the major evolutionary transitions.
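The connectionist analogy invoked above can be made concrete with a toy associative-memory sketch: simple incremental (Hebbian-style) adjustments of the pairwise relationships between individually simple components produce system-level recall behaviour. The model, parameters and data below are illustrative assumptions, not taken from the reviewed work.

```python
import numpy as np

# Toy sketch (illustrative, not the paper's model): Hebbian-style incremental
# adjustment of pairwise connections between simple components. Repeated exposure
# to a few "selected" patterns strengthens the relationships that recreate those
# patterns, so the system behaves as an associative memory for them.

rng = np.random.default_rng(0)
n = 8                                          # number of components
patterns = rng.choice([-1, 1], size=(3, n))    # states favoured by "selection"

W = np.zeros((n, n))                           # relationship (connection) strengths
eta = 0.1                                      # small incremental learning rate
for _ in range(100):
    p = patterns[rng.integers(len(patterns))]
    W += eta * np.outer(p, p)                  # Hebbian update: correlated components link up
    np.fill_diagonal(W, 0.0)

# Recall: a corrupted version of a stored pattern is pulled back towards it.
probe = patterns[0].copy()
probe[:2] *= -1                                # corrupt two components
recalled = np.sign(W @ probe)
print("match to stored pattern:", int(np.sum(recalled == patterns[0])), "of", n)
```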
Impact Factor: outdated artefact or stepping-stone to journal certification?
A review of Garfield's journal impact factor and its specific implementation as the Thomson Reuters Impact Factor reveals several weaknesses in this commonly used indicator of journal standing. Key limitations include the mismatch between citing and cited documents, the deceptive display of three decimals that belies the real precision, and the absence of confidence intervals. These are minor issues that are easily amended and should be corrected, but more substantive improvements are needed. There are indications that the scientific community seeks and needs better certification of journal procedures to improve the quality of published science. Comprehensive certification of editorial and review procedures could help ensure adequate procedures to detect duplicate and fraudulent submissions.
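To illustrate the precision point, the following sketch computes an impact-factor-style citation rate together with a bootstrap confidence interval, the kind of uncertainty estimate the abstract notes is missing from the reported three-decimal figures. The citation counts and the resampling scheme are assumptions for illustration.

```python
import numpy as np

# Minimal sketch (hypothetical data): an impact-factor-style ratio with a
# bootstrap confidence interval, instead of a bare three-decimal point estimate.
# citations[i] = citations received in year Y by the i-th item published in Y-1 or Y-2.
rng = np.random.default_rng(42)
citations = rng.poisson(lam=2.1, size=180)     # made-up citation counts

point_estimate = citations.mean()              # citations per citable item

# Bootstrap over items to get a 95% interval on the mean citation rate.
boot = [rng.choice(citations, size=citations.size, replace=True).mean()
        for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"IF-style estimate: {point_estimate:.2f}  (95% CI {lo:.2f} to {hi:.2f})")
```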
TRY plant trait database - enhanced coverage and open access
Plant traits, the morphological, anatomical, physiological, biochemical and phenological characteristics of plants, determine how plants respond to environmental factors, affect other trophic levels, and influence ecosystem properties and their benefits and detriments to people. Plant trait data thus represent the basis for a vast area of research spanning from evolutionary biology, community and functional ecology, to biodiversity conservation, ecosystem and landscape management, restoration, biogeography and earth system modelling. Since its foundation in 2007, the TRY database of plant traits has grown continuously. It now provides unprecedented data coverage under an open access data policy and is the main plant trait database used by the research community worldwide. Increasingly, the TRY database also supports new frontiers of trait-based plant research, including the identification of data gaps and the subsequent mobilization or measurement of new data. To support this development, in this article we evaluate the extent of the trait data compiled in TRY and analyse emerging patterns of data coverage and representativeness. Best species coverage is achieved for categorical traits, with almost complete coverage for 'plant growth form'. However, most traits relevant for ecology and vegetation modelling are characterized by continuous intraspecific variation and trait-environmental relationships. These traits have to be measured on individual plants in their respective environment. Despite unprecedented data coverage, we observe a humbling lack of completeness and representativeness of these continuous traits in many aspects. We therefore conclude that reducing data gaps and biases in the TRY database remains a key challenge and requires a coordinated approach to data mobilization and trait measurements. This can only be achieved in collaboration with other initiatives.
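As a concrete illustration of the coverage analysis described above, the sketch below computes per-trait species coverage from a long-format table of trait records. The column names and example records are assumptions for illustration and do not reflect the TRY schema.

```python
import pandas as pd

# Minimal sketch (hypothetical data and column names, not the TRY schema):
# per-trait species coverage from a long-format table of trait records.
records = pd.DataFrame({
    "species": ["Quercus robur", "Quercus robur", "Fagus sylvatica",
                "Fagus sylvatica", "Pinus sylvestris"],
    "trait":   ["growth_form", "leaf_N", "growth_form", "SLA", "growth_form"],
    "value":   ["tree", 21.3, "tree", 15.2, "tree"],
})

n_species = records["species"].nunique()
coverage = (records.groupby("trait")["species"].nunique()
            .div(n_species)
            .sort_values(ascending=False))
print(coverage)   # fraction of species with at least one record per trait
```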
Predicting progression of mild cognitive impairment to dementia using neuropsychological data: a supervised learning approach using time windows
Background: Predicting progression from a stage of Mild Cognitive Impairment (MCI) to dementia is a major pursuit in current research. It is broadly accepted that cognition declines along a continuum between MCI and dementia. As such, cohorts of MCI patients are usually heterogeneous, containing patients at different stages of the neurodegenerative process. This hampers the prognostic task. Nevertheless, when learning prognostic models, most studies use the entire cohort of MCI patients regardless of their disease stage. In this paper, we propose a Time Windows approach to predict conversion to dementia, learning with patients stratified using time windows, thus fine-tuning the prognosis with respect to the time to conversion. Methods: In the proposed Time Windows approach, we grouped patients based on the clinical information of whether they converted (converter MCI) or remained MCI (stable MCI) within a specific time window. We tested time windows of 2, 3, 4 and 5 years. We developed a prognostic model for each time window using clinical and neuropsychological data and compared this approach with the one commonly used in the literature, in which all patients are used to learn the models, here named the First Last approach. This makes it possible to move from the traditional question "Will an MCI patient convert to dementia somewhere in the future?" to the question "Will an MCI patient convert to dementia within a specific time window?". Results: The proposed Time Windows approach outperformed the First Last approach. The results showed that we can predict conversion to dementia as early as 5 years before the event, with an AUC of 0.88 in the cross-validation set and 0.76 in an independent validation set. Conclusions: Prognostic models using time windows achieve higher performance when predicting progression from MCI to dementia than the prognostic approach commonly used in the literature. Furthermore, the proposed Time Windows approach is more relevant from a clinical point of view, predicting conversion within a temporal interval rather than at some time in the future, and allowing clinicians to adjust treatments and clinical appointments in a timely manner.
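A minimal sketch of the per-window setup described above: for each time window, patients are labelled converter or stable according to whether they convert within that horizon, one classifier is trained per window, and performance is summarised by cross-validated AUC. The synthetic data, feature construction and choice of logistic regression are illustrative assumptions, not the study's pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Minimal sketch (synthetic data, illustrative model choice): one prognostic
# model per time window, with labels defined by conversion within that window.
rng = np.random.default_rng(0)
n = 400
risk = rng.normal(size=n)                                        # latent progression risk
time_to_conversion = rng.exponential(scale=4.0 * np.exp(-risk))  # years until dementia
time_to_conversion[rng.random(n) < 0.35] = np.inf                # stable patients never convert
X = np.column_stack([risk + rng.normal(scale=1.0, size=n),       # noisy cognitive score
                     rng.normal(size=(n, 5))])                   # unrelated covariates

for window in (2, 3, 4, 5):                                      # time windows in years
    y = (time_to_conversion <= window).astype(int)               # converter vs. stable MCI
    auc = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                          cv=5, scoring="roc_auc").mean()
    print(f"{window}-year window: cross-validated AUC = {auc:.2f}")
```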
Study of decays to the final state and evidence for the decay
A study of the decays is performed for the first time using data corresponding to an integrated luminosity of 3.0 fb^-1, collected by the LHCb experiment in proton-proton collisions at centre-of-mass energies of 7 and 8 TeV. Evidence for the decay is reported with a significance of 4.0 standard deviations, yielding a measurement of the corresponding ratio of branching fractions and production cross-sections. An indication of weak annihilation is found in a specific kinematic region, with a significance of 2.4 standard deviations.
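Measurements of this kind typically report a ratio of products of production cross-section and branching fraction, estimated from efficiency-corrected signal yields in the two channels. The sketch below shows that generic construction with made-up yields, efficiencies and statistical uncertainties; it is not the analysis code of this paper.

```python
import math

# Generic sketch (made-up numbers, not this paper's analysis): a ratio of
# sigma x branching fraction estimated from efficiency-corrected signal yields,
#   R = (N2 / eff2) / (N1 / eff1),
# with simple propagation of the independent statistical uncertainties.
N1, dN1, eff1 = 1450.0, 60.0, 0.012   # normalisation channel: yield, error, efficiency
N2, dN2, eff2 = 38.0, 11.0, 0.009     # signal channel: yield, error, efficiency

R = (N2 / eff2) / (N1 / eff1)
dR = R * math.sqrt((dN1 / N1) ** 2 + (dN2 / N2) ** 2)
print(f"R = {R:.4f} +/- {dR:.4f} (statistical only)")
```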
Measurement of the inclusive and dijet cross-sections of b-jets in pp collisions at sqrt(s) = 7 TeV with the ATLAS detector
The inclusive and dijet production cross-sections have been measured for jets containing b-hadrons (b-jets) in proton-proton collisions at a centre-of-mass energy of sqrt(s) = 7 TeV, using the ATLAS detector at the LHC. The measurements use data corresponding to an integrated luminosity of 34 pb^-1. The b-jets are identified using either a lifetime-based method, where secondary decay vertices of b-hadrons in jets are reconstructed using information from the tracking detectors, or a muon-based method, where the presence of a muon is used to identify semileptonic decays of b-hadrons inside jets. The inclusive b-jet cross-section is measured as a function of transverse momentum in the range 20 < pT < 400 GeV and rapidity in the range |y| < 2.1. The bbbar-dijet cross-section is measured as a function of the dijet invariant mass in the range 110 < m_jj < 760 GeV, the azimuthal angle difference between the two jets, and the angular variable chi in two dijet mass regions. The results are compared with next-to-leading-order QCD predictions. Good agreement is observed between the measured cross-sections and the predictions obtained using POWHEG + Pythia. MC@NLO + Herwig shows good agreement with the measured bbbar-dijet cross-section. However, it does not reproduce the measured inclusive cross-section well, particularly for central b-jets with large transverse momenta.
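In a binned measurement of this type, the differential cross-section in each bin is conventionally obtained as the corrected yield divided by the integrated luminosity and the bin width, dsigma/dpT = N_corrected / (L * delta_pT). The sketch below illustrates that bookkeeping with made-up yields, efficiencies and purities; it is not the ATLAS analysis.

```python
# Generic sketch (made-up numbers, not the ATLAS analysis): a binned differential
# cross-section d(sigma)/d(pT) = N_corrected / (L * bin_width), where the raw yield
# is corrected for tagging efficiency and signal purity.
lumi_pb = 34.0                                    # integrated luminosity in pb^-1

bins = [                                          # (pT low, pT high, raw yield, efficiency, purity)
    (20.0, 40.0, 52000.0, 0.45, 0.30),
    (40.0, 80.0, 21000.0, 0.55, 0.40),
    (80.0, 160.0, 3100.0, 0.60, 0.50),
]

for lo, hi, n_raw, eff, purity in bins:
    n_corrected = n_raw * purity / eff            # efficiency- and purity-corrected yield
    dsigma_dpt = n_corrected / (lumi_pb * (hi - lo))   # pb per GeV
    print(f"{lo:5.0f}-{hi:5.0f} GeV: dsigma/dpT = {dsigma_dpt:8.1f} pb/GeV")
```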
