CRISPR/Cas9‐mediated genome editing: from basic research to translational medicine
The recent development of the CRISPR/Cas9 system as an efficient and accessible programmable genome-editing tool has revolutionized basic science research. CRISPR/Cas9 system-based technologies have armed researchers with new powerful tools to unveil the impact of genetics on disease development by enabling the creation of precise cellular and animal models of human diseases. The therapeutic potential of these technologies is tremendous, particularly in gene therapy, in which a patient-specific mutation is genetically corrected in order to treat human diseases that are untreatable with conventional therapies. However, the translation of CRISPR/Cas9 into the clinic will be challenging, since we still need to improve the efficiency, specificity and delivery of this technology. In this review, we focus on several in vitro, in vivo and ex vivo applications of the CRISPR/Cas9 system in human disease-focused research, explore the potential of this technology in translational medicine and discuss some of the major challenges for its future use in patients.

Funding: Portuguese Foundation for Science and Technology (UID/BIM/04773/2013, 1334, SFRH/BPD/100434/2014); Spanish Ministry of Science, Innovation and Universities (RTI2018-094629-B-I00); European Union (EU) (748585); LPCC-NRS/Terry Fox grants.
Parameter inference and model comparison using theoretical predictions from noisy simulations
When inferring unknown parameters or comparing different models, data must be
compared to underlying theory. Even if a model has no closed-form solution to
derive summary statistics, it is often still possible to simulate mock data in
order to generate theoretical predictions. For realistic simulations of noisy
data, this is identical to drawing realizations of the data from a likelihood
distribution. Though the estimated summary statistic from simulated data
vectors may be unbiased, the estimator has variance which should be accounted
for. We show how to correct the likelihood in the presence of an estimated
summary statistic by marginalizing over the true summary statistic in the
framework of a Bayesian hierarchical model. For Gaussian likelihoods where the
covariance must also be estimated from simulations, we present an alteration to
the Sellentin-Heavens corrected likelihood. We show that excluding the proposed
correction leads to an incorrect estimate of the Bayesian evidence with JLA
data. The correction is highly relevant for cosmological inference that relies
on simulated data for theory (e.g. weak lensing peak statistics and simulated
power spectra) and can reduce the number of simulations required.

Comment: 9 pages, 6 figures, published by MNRAS. Changes: matches published version, added Bayesian hierarchical interpretation and probabilistic graphical model.
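The core correction above can be sketched in a simplified form. The sketch below is illustrative, not the paper's full hierarchical treatment: it assumes the covariance is known well enough that marginalizing over the true summary statistic (with a flat prior) simply inflates the covariance by a factor (1 + 1/n), where n is the number of simulations used to estimate the mean. All function and variable names are assumptions for illustration.

```python
import numpy as np

def corrected_loglike(data, sims):
    """Gaussian log-likelihood when the theory prediction is a mean
    summary statistic estimated from n noisy simulations.

    Marginalizing over the unknown true summary statistic (flat prior)
    adds the estimator variance cov/n to the data covariance, giving
    an overall inflation factor (1 + 1/n) -- a simplified stand-in for
    the full Bayesian hierarchical correction described in the text.
    """
    n, p = sims.shape
    mu_hat = sims.mean(axis=0)            # estimated summary statistic
    cov = np.cov(sims, rowvar=False)      # estimated covariance of the data
    cov_marg = cov * (1.0 + 1.0 / n)      # account for estimator variance
    diff = data - mu_hat
    sign, logdet = np.linalg.slogdet(2.0 * np.pi * cov_marg)
    return -0.5 * (diff @ np.linalg.solve(cov_marg, diff) + logdet)
```

Omitting the (1 + 1/n) factor over-weights the simulated prediction, which is the kind of effect the text warns can bias the Bayesian evidence.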
Radio Galaxy Detection in the Visibility Domain
We explore a new Bayesian method of detecting galaxies from radio
interferometric data of the faint sky. Working in the Fourier domain, we fit a
single, parameterised galaxy model to simulated visibility data of star-forming
galaxies. The resulting multimodal posterior distribution is then sampled using
a multimodal nested sampling algorithm such as MultiNest. For each galaxy, we
construct parameter estimates for the position, flux, scale-length and
ellipticities from the posterior samples. We first test our approach on
simulated SKA1-MID visibility data of up to 100 galaxies in the field of view,
considering a typical weak lensing survey regime (SNR ) where 98% of
the input galaxies are detected with no spurious source detections. We then
explore the low SNR regime, finding our approach reliable in galaxy detection
and providing in particular high accuracy in positional estimates down to SNR
. The presented method does not require transformation of visibilities
to the image domain, and requires no prior knowledge of the number of galaxies
in the field of view, thus could become a useful tool for constructing accurate
radio galaxy catalogs in the future.

Comment: 11 pages, 11 figures. Accepted for publication in MNRAS.
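The visibility-domain fitting idea can be illustrated with a minimal sketch. This is not the paper's actual galaxy model (which is parameterised by position, flux, scale-length and ellipticities); here a circular Gaussian source stands in, and the names `model_vis` and `log_likelihood` are assumptions for illustration.

```python
import numpy as np

def model_vis(u, v, flux, x0, y0, scale):
    """Fourier-domain model of a circular Gaussian source.

    The analytic transform gives a Gaussian amplitude envelope in
    (u, v) set by the flux and scale-length, and a phase ramp set by
    the source position -- no imaging step is needed.
    """
    amp = flux * np.exp(-2.0 * np.pi**2 * scale**2 * (u**2 + v**2))
    return amp * np.exp(-2j * np.pi * (u * x0 + v * y0))

def log_likelihood(theta, u, v, vis_obs, sigma):
    """Gaussian likelihood of the observed complex visibilities,
    evaluated directly in the Fourier domain."""
    resid = vis_obs - model_vis(u, v, *theta)
    return -0.5 * np.sum(np.abs(resid)**2) / sigma**2
```

In practice this likelihood would be handed to a multimodal nested sampler (the text names MultiNest), which returns posterior samples for the source parameters.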
Emulsion formation and stabilization by biomolecules: the leading role of cellulose
Emulsion stabilization by native cellulose has been mainly hampered because of its insolubility in water. Chemical modification is normally needed to obtain water-soluble cellulose derivatives. These modified celluloses have been widely used for a range of applications by the food, cosmetic, pharmaceutic, paint and construction industries. In most cases, the modified celluloses are used as rheology modifiers (thickeners) or as emulsifying agents. In the last decade, the structural features of cellulose have been revisited, with particular focus on its structural anisotropy (amphiphilicity) and the molecular interactions leading to its resistance to dissolution. The amphiphilic behavior of native cellulose is evidenced by its capacity to adsorb at the interface between oil and aqueous solvent solutions, thus being capable of stabilizing emulsions. In this overview, the fundamentals of emulsion formation and stabilization by biomolecules are briefly revisited before different aspects around the emerging role of cellulose as emulsion stabilizer are addressed in detail. Particular focus is given to systems stabilized by native cellulose, either molecularly-dissolved or not (Pickering-like effect).

Financially supported by the Portuguese Foundation for Science and Technology, FCT, via the projects PTDC/AGR-TEC/4814/2014, PTDC/ASP-SIL/30619/2017 and researcher grant IF/01005/2014. RISE Research Institutes of Sweden AB and PERFORM, a competence platform in Formulation Science at RISE, are acknowledged for additional financing. This research has been supported by Treesearch.se.
Excess Clustering on Large Scales in the MegaZ DR7 Photometric Redshift Survey
We observe a large excess of power in the statistical clustering of luminous red galaxies in the photometric SDSS galaxy sample called MegaZ DR7. This is seen over the lowest multipoles in the angular power spectra C_ℓ in four equally spaced redshift bins between 0.45 ≤ z ≤ 0.65. However, it is most prominent in the highest redshift band at ~4σ and it emerges at an effective scale k ≲ 0.01 h Mpc⁻¹. Given that MegaZ DR7 is the largest cosmic volume galaxy survey to date (3.3 (Gpc h⁻¹)³), this implies an anomaly on the largest physical scales probed by galaxies. Alternatively, this signature could be a consequence of it appearing at the most systematically susceptible redshift. There are several explanations for this excess power that range from systematics to new physics. We test the survey, data, and excess power, as well as possible origins.
Halo detection via large-scale Bayesian inference
We present a proof-of-concept of a novel and fully Bayesian methodology
designed to detect halos of different masses in cosmological observations
subject to noise and systematic uncertainties. Our methodology combines the
previously published Bayesian large-scale structure inference algorithm, HADES,
and a Bayesian chain rule (the Blackwell-Rao Estimator), which we use to
connect the inferred density field to the properties of dark matter halos. To
demonstrate the capability of our approach we construct a realistic galaxy mock
catalogue emulating the wide-area 6-degree Field Galaxy Survey, which has a
median redshift of approximately 0.05. Application of HADES to the catalogue
provides us with accurately inferred three-dimensional density fields and
corresponding quantification of uncertainties inherent to any cosmological
observation. We then use a cosmological simulation to relate the amplitude of
the density field to the probability of detecting a halo with mass above a
specified threshold. With this information we can sum over the HADES density
field realisations to construct maps of detection probabilities and demonstrate
the validity of this approach within our mock scenario. We find that the
probability of successful detection of halos in the mock catalogue increases
as a function of the signal-to-noise of the local galaxy observations. Our
proposed methodology can easily be extended to account for more complex
scientific questions and is a promising novel tool to analyse the cosmic
large-scale structure in observations.

Comment: 17 pages, 13 figures. Accepted for publication in MNRAS following moderate corrections.
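The final summation step — averaging a conditional detection probability over the inferred density-field realisations — can be sketched as follows. The conditional probability P(halo | δ) would in practice be calibrated from a cosmological simulation as described; the sigmoid used in the test below, and the names here, are illustrative assumptions only.

```python
import numpy as np

def detection_probability_map(density_samples, p_halo_given_delta):
    """Blackwell-Rao style estimator for a halo detection map.

    Averages the conditional probability of hosting a halo above a
    mass threshold, P(halo | delta), over posterior realisations of
    the density field, so that the map inherits the inference's
    uncertainty quantification.
    """
    return np.mean([p_halo_given_delta(d) for d in density_samples], axis=0)
```

Because each realisation is weighted equally, regions where the posterior density field is uncertain are automatically assigned intermediate detection probabilities.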
