Edge-functionalized and substitutional doped graphene nanoribbons: electronic and spin properties
Graphene nanoribbons are the counterpart of carbon nanotubes in
graphene-based nanoelectronics. We investigate the electronic properties of
chemically modified ribbons by means of density functional theory. We observe
that chemical modifications of zigzag ribbons can break the spin degeneracy.
This promotes the onset of a semiconducting-metal transition; of a
half-semiconducting state, in which the two spin channels have different
bandgaps; or of a spin-polarized half-semiconducting state, in which the spins
in the valence and conduction bands are oppositely polarized. Edge
functionalization of armchair ribbons introduces electronic states a few eV away
from the Fermi level and does not significantly affect their bandgap. N and B
substitutions produce different effects depending on the position of the
substitutional site. In particular, edge substitutions at low density do not
significantly alter the bandgap, while bulk substitution promotes the onset of
semiconducting-metal transitions. Pyridine-like defects also induce a
semiconducting-metal transition.
Optimal static pricing for a tree network
We study the static pricing problem for a network service provider in a loss system with a tree structure. In the network, multiple classes share a common inbound link and then have dedicated outbound links. The motivation comes from a company that sells phone cards and needs to price calls to different destinations. We characterize the optimal static prices that maximize the steady-state revenue. We report new structural findings as well as alternative proofs for some known results. We compare the optimal static prices against asymptotically optimal prices, and through a set of illustrative numerical examples we show that in certain cases the loss in revenue can be significant. Finally, we show that static prices obtained using the reduced load approximation of the blocking probabilities are easy to compute and have near-optimal performance, which makes them attractive for applications.
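In such loss systems, the blocking probability of a single link offered Poisson traffic is given by the Erlang B formula, the basic building block of reduced load approximations. A minimal sketch of the standard stable recursion (illustrative only; the paper's tree model and pricing optimization are not reproduced here):

```python
def erlang_b(servers: int, offered_load: float) -> float:
    """Erlang B blocking probability via the stable recursion
    B(0, a) = 1;  B(c, a) = a*B(c-1, a) / (c + a*B(c-1, a))."""
    b = 1.0
    for c in range(1, servers + 1):
        b = offered_load * b / (c + offered_load * b)
    return b

# Blocking on a link with 10 circuits offered 5 Erlangs of traffic.
print(round(erlang_b(10, 5.0), 4))  # ≈ 0.0184
```

A reduced load approximation applies this formula link by link, thinning each class's offered traffic by the blocking it sees on the other links and iterating to a fixed point.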
The current landscape of software tools for the climate-sensitive infectious disease modelling community
Climate-sensitive infectious disease modelling is crucial for public health planning and is underpinned by a complex network of software tools. We identified only 37 tools that incorporated both climate inputs and epidemiological information to produce an output of disease risk in one package, were transparently described and validated, were named (for future searching and versioning), and were accessible (ie, the code was published during the past 10 years or was available on a repository, web platform, or other user interface). We noted disproportionate representation of developers based at North American and European institutions. Most tools (n=30 [81%]) focused on vector-borne diseases, and more than half (n=16 [53%]) of these tools focused on malaria. Few tools (n=4 [11%]) focused on food-borne, respiratory, or water-borne diseases. The under-representation of tools for estimating outbreaks of directly transmitted diseases represents a major knowledge gap. Just over half (n=20 [54%]) of the tools assessed were described as operationalised, with many freely available online. Article signed by 12 authors: Sadie J Ryan, Catherine A Lippi, Talia Caplan, Avriel Diaz, Willy Dunbar, Shruti Grover, Simon Johnson, Rebecca Knowles, Rachel Lowe, Bilal A Mateen, Madeleine C Thomson, Anna M Stewart-Ibarra.
Circulating cell-free DNA levels measured by a novel simple fluorescent assay are predictive for outcome of severe sepsis
Pinedale Glacial History of the Upper Arkansas River Valley: New Moraine Chronologies, Modeling Results and Geologic Mapping
This field trip guidebook chapter outlines the glacial history of the upper Arkansas River valley, Colorado, and builds on a previous GSA field trip to the same area in 2010. The following will be presented: (1) new cosmogenic 10Be exposure ages of moraine boulders from the Pinedale and Bull Lake glaciations (Marine Isotope Stages 2 and 6, respectively) located adjacent to the Twin Lakes Reservoir, (2) numerical modeling of glaciers during the Pinedale glaciation in major tributaries draining into the upper Arkansas River, (3) discharge estimates for glacial-lake outburst floods in the upper Arkansas River valley, and (4) 10Be ages of flood boulders deposited downvalley from the moraine sequences. This research was stimulated by a new geologic map of the Granite 7.5’ quadrangle, in which the mapping of surficial deposits was revised based in part on the interpretation of newly acquired LiDAR data and field investigations. The new 10Be ages of the Pinedale terminal moraine at Twin Lakes average 21.8 ± 0.7 ka (n=14), which adds to nearby Pinedale terminal moraine ages of 23.6 ± 1.4 ka (n=5), 20.5 ± 0.2 ka (n=3) and 16.6 ± 1.0 ka, and to downvalley outburst flood terraces that date to 20.9 ± 0.9 ka (n=4) and 19.0 ± 0.6 ka (n=4). This growing chronology leads to an improved understanding of the controls and timing of glaciation in the western U.S., the modeling of glacial-lake outburst flooding, and the reconstruction of paleo-temperature through glacier modeling.
Nanohertz Frequency Determination for the Gravity Probe B HF SQUID Signal
In this paper, we present a method to measure the frequency and the frequency
change rate of a digital signal. This method consists of three consecutive
algorithms: frequency interpolation, phase differencing, and a third algorithm
specifically designed and tested by the authors. The succession of these three
algorithms allowed a 5 parts in 10^10 resolution in frequency determination.
The algorithm developed by the authors can be applied to a sampled scalar
signal such that a model linking the harmonics of its main frequency to the
underlying physical phenomenon is available. This method was developed in the
framework of the Gravity Probe B (GP-B) mission. It was applied to the High
Frequency (HF) component of GP-B's Superconducting QUantum Interference Device
(SQUID) signal, whose main frequency fz is close to the spin frequency of the
gyroscopes used in the experiment. A 30 nHz resolution in signal frequency and
a 0.1 pHz/s resolution in its decay rate were achieved from a succession of
1.86-second-long stretches of signal sampled at 2200 Hz. This paper describes
the underlying theory of the frequency measurement method as well as its
application to GP-B's HF science signal.
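The first two algorithmic steps named above, frequency interpolation and phase differencing, can be illustrated in miniature. The sketch below (with an invented test tone; this is not the GP-B processing chain) refines a coarse FFT peak estimate using the phase advance between two windows offset by a known lag:

```python
import numpy as np

def refine_frequency(x, fs, lag):
    """Two-window phase differencing: the coarse FFT peak fixes the
    integer number of cycles elapsed over `lag` samples, and the
    measured phase difference supplies the fractional part."""
    n = len(x) - lag
    w1, w2 = x[:n], x[lag:lag + n]
    s1, s2 = np.fft.rfft(w1), np.fft.rfft(w2)
    k = int(np.argmax(np.abs(s1[1:])) + 1)       # coarse peak bin (skip DC)
    f_coarse = k * fs / n
    dphi = np.angle(s2[k]) - np.angle(s1[k])     # phase advance mod 2*pi
    # Unwrap: total advance is 2*pi*f*lag/fs; recover the whole cycles.
    cycles = np.round(f_coarse * lag / fs - dphi / (2 * np.pi))
    return (dphi / (2 * np.pi) + cycles) * fs / lag

fs = 2200.0                                      # sample rate, as in the paper
x = np.sin(2 * np.pi * 81.2345 * np.arange(8192) / fs)
print(refine_frequency(x, fs, lag=512))          # close to 81.2345 Hz
```

The coarse estimate only has to be accurate to within fs/(2*lag) for the cycle count to round correctly; the phase difference then pushes the resolution far below the FFT bin width, which is the spirit of the sub-bin techniques the paper combines.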
Cost-based domain filtering for stochastic constraint programming
Cost-based filtering is a novel approach that combines techniques from Operations Research and Constraint Programming to filter from decision variable domains those values that do not lead to better solutions [7]. Stochastic Constraint Programming is a framework for modeling combinatorial optimization problems that involve uncertainty [9]. In this work, we show how to perform cost-based filtering for certain classes of stochastic constraint programs. Our approach is based on a set of known inequalities borrowed from Stochastic Programming, a branch of OR concerned with modeling and solving problems involving uncertainty. We discuss bound generation and cost-based domain filtering procedures for a well-known problem in the Stochastic Programming literature, the static stochastic knapsack problem. We also apply our technique to a stochastic sequencing problem. Our results clearly show the value of the proposed approach over a pure scenario-based Stochastic Constraint Programming formulation, both in terms of explored nodes and run time.
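The flavour of such filtering can be shown on a toy static stochastic knapsack. The sketch below (invented data; not the paper's actual bounds or propagation algorithm) uses the expected-value relaxation: capacity overflow is convex in the weights, so by Jensen's inequality replacing random weights by their means gives an admissible optimistic bound, and a decision whose bound cannot beat the incumbent is pruned:

```python
from itertools import combinations

# Toy static stochastic knapsack (all numbers invented): maximise
# expected profit when item weights are uncertain (equally likely
# scenarios) and exceeding capacity incurs a per-unit penalty.
profits   = [10, 7, 6, 4]
scenarios = [[4, 3, 5, 2], [6, 5, 7, 3], [5, 4, 6, 2]]
capacity, penalty = 10, 3
n = len(profits)

def value(chosen):
    """Exact expected objective of a chosen item set."""
    gain = sum(profits[i] for i in chosen)
    over = sum(max(0, sum(s[i] for i in chosen) - capacity)
               for s in scenarios) / len(scenarios)
    return gain - penalty * over

def jensen_bound(forced_in):
    """Optimistic bound: best completion under mean weights.
    Overflow is convex, so this can only overestimate the objective."""
    mean_w = [sum(s[i] for s in scenarios) / len(scenarios)
              for i in range(n)]
    free = [i for i in range(n) if i not in forced_in]
    best = float("-inf")
    for r in range(len(free) + 1):
        for extra in combinations(free, r):
            chosen = set(forced_in) | set(extra)
            w = sum(mean_w[i] for i in chosen)
            best = max(best, sum(profits[i] for i in chosen)
                       - penalty * max(0, w - capacity))
    return best

# Incumbent: best known solution (here found by enumeration).
incumbent = max(value(set(c)) for r in range(n + 1)
                for c in combinations(range(n), r))

# Cost-based filtering: value 1 leaves x_i's domain whenever the
# optimistic bound under "item i is packed" cannot beat the incumbent.
pruned = [i for i in range(n) if jensen_bound({i}) < incumbent]
print(pruned)   # with this data, item 2 can never be packed
```

In a real solver the bound would be computed by a dedicated relaxation rather than enumeration; the point is only that an admissible bound lets whole branches of the scenario tree be discarded without exploration.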
Co-learning during the co-creation of a dengue early warning system for the health sector in Barbados
Over the past decade, the Caribbean region has been challenged by compound climate and health hazards, including tropical storms, extreme heat and droughts, and overlapping epidemics of mosquito-borne diseases, including dengue, chikungunya and Zika. Early warning systems (EWS) are a key climate change adaptation strategy for the health sector. An EWS can integrate climate information in forecasting models to predict the risk of disease outbreaks several weeks or months in advance. In this article, we share our experiences of co-learning during the process of co-creating a dengue EWS for the health sector in Barbados, and we discuss barriers to implementation as well as key opportunities. This process has involved bringing together health and climate practitioners with transdisciplinary researchers to jointly identify needs and priorities, assess available data, co-create an early warning tool, gather feedback via national and regional consultations and conduct training. Implementation is ongoing and our team continues to be committed to a long-term process of collaboration. Developing strong partnerships, particularly between the climate and health sectors in Barbados, has been a critical part of the research and development. In many countries, the national climate and health sectors have not worked together in a sustained or formal manner. This collaborative process has purposefully pushed us out of our comfort zone, challenging us to venture beyond our institutional and disciplinary silos. Through the co-creation of the EWS, we anticipate that the Barbados health system will be better able to mainstream climate information into decision-making processes using tailored tools, such as epidemic forecast reports, risk maps and climate-health bulletins, ultimately increasing the resilience of the health system.
An integrated approach to a combinatorial optimisation problem
We take inspiration from a problem in the healthcare domain, where patients with several chronic conditions follow different guidelines designed for the individual conditions, and where the aim is to find the best treatment plan for a patient that avoids adverse drug reactions, respects the patient's preferences and prioritises drug efficacy. Each chronic condition guideline can be abstractly described by a directed graph, where each node indicates a treatment step (e.g., a choice in medications or resources) and has a certain duration. The search for the best treatment path is seen as a combinatorial optimisation problem, and we show how to select a path across the graphs constrained by a notion of resource compatibility. This notion takes into account interactions between any finite number of resources and makes it possible to express non-monotonic interactions. Our formalisation also introduces a discrete temporal metric, so as to consider only simultaneous nodes in the optimisation process. We express the formal problem as an SMT problem and provide a correctness proof of the SMT code by exploiting the interplay between SMT solvers and the proof assistant Isabelle/HOL. The problem we consider combines aspects of optimal graph execution and resource allocation, showing how an SMT solver can be an alternative to other approaches that are well researched in the corresponding domains. Funding: MRC grant MR/S003819/1 and Health Data Research UK, an initiative funded by UK Research and Innovation, the Department of Health and Social Care (England) and the devolved administrations, and leading medical research charities.
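The underlying combinatorial problem can be illustrated with a small brute-force model (invented guidelines, drugs and interactions; the paper instead encodes the problem for an SMT solver and verifies the encoding in Isabelle/HOL):

```python
from itertools import product

# Each guideline offers alternative treatment paths; a path is a
# sequence of (drug, duration_in_days) steps. All data is made up.
guideline_a = [[("aspirin", 3), ("statin", 4)],
               [("ibuprofen", 3), ("statin", 4)]]
guideline_b = [[("warfarin", 5)],
               [("heparin", 2), ("warfarin", 3)]]
efficacy = {"aspirin": 5, "ibuprofen": 4, "statin": 6,
            "warfarin": 7, "heparin": 3}
# Pairs that must never be taken simultaneously (adverse reaction).
bad_pairs = {frozenset({"aspirin", "warfarin"})}

def timeline(path):
    """Expand a path into the drug taken on each discrete day,
    mirroring the discrete temporal metric in the abstract."""
    return [drug for drug, days in path for _ in range(days)]

def compatible(p1, p2):
    t1, t2 = timeline(p1), timeline(p2)
    # zip truncates to the overlap: only simultaneous days matter.
    return all(frozenset({a, b}) not in bad_pairs
               for a, b in zip(t1, t2))

# Pick one path per guideline, maximising total efficacy subject to
# resource compatibility (exhaustive search stands in for the solver).
best_score, best_choice = float("-inf"), None
for p1, p2 in product(guideline_a, guideline_b):
    if compatible(p1, p2):
        score = sum(efficacy[d] for d, _ in p1 + p2)
        if score > best_score:
            best_score, best_choice = score, (p1, p2)
print(best_score)   # 20: ibuprofen+statin alongside heparin+warfarin
```

Here the interaction constraint bites: the highest-efficacy combination on paper (aspirin+statin with heparin+warfarin) overlaps aspirin and warfarin on day 3 and is rejected, so the optimum switches to the ibuprofen path. An SMT encoding replaces the enumeration with symbolic path and timing variables.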