Nitric Oxide Bioavailability and Its Potential Relevance to the Variation in Susceptibility to the Renal and Vascular Complications in Patients With Type 2 Diabetes
OBJECTIVE—We compared the renal and systemic vascular (renovascular) response to a reduction of bioavailable nitric oxide (NO) in type 2 diabetic patients without nephropathy and of African and Caucasian heritage. RESEARCH DESIGN AND METHODS—Under euglycemic conditions, renal blood flow was determined by a constant infusion of para-aminohippurate, and changes in blood pressure and renal vascular resistance were estimated before and after an infusion of L-NG-monomethyl-L-arginine. RESULTS—In the African-heritage group, there was a significant fall in renal blood flow (Δ−46.0 ml/min per 1.73 m²; P < 0.05) and rise in systolic blood pressure (Δ10.0 mmHg [95% CI 2.3–17.9]; P = 0.017), which correlated with an increase in renal vascular resistance (r² = 0.77; P = 0.004). CONCLUSIONS—The renal vasoconstrictive response associated with NO synthase inhibition in this study may be of relevance to the observed vulnerability to renal injury in patients of African heritage.
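The renal vascular resistance referenced above is conventionally estimated from mean arterial pressure and renal blood flow. The following minimal Python sketch illustrates that standard calculation; the numeric values are hypothetical, not data from the study.

```python
# Estimate renal vascular resistance (RVR) from blood pressure and renal
# blood flow, as conventionally done in renal physiology. The numbers here
# are hypothetical illustrations, not values from the study.

def mean_arterial_pressure(sbp_mmhg: float, dbp_mmhg: float) -> float:
    """Standard approximation: MAP = DBP + (SBP - DBP) / 3."""
    return dbp_mmhg + (sbp_mmhg - dbp_mmhg) / 3.0

def renal_vascular_resistance(map_mmhg: float, rbf_ml_min: float) -> float:
    """RVR = MAP / RBF, in mmHg per (ml/min)."""
    return map_mmhg / rbf_ml_min

# Before and after NO synthase inhibition (hypothetical example values).
map_before = mean_arterial_pressure(120, 80)   # ~93.3 mmHg
map_after = mean_arterial_pressure(130, 82)    # ~98.0 mmHg

rvr_before = renal_vascular_resistance(map_before, 600.0)
rvr_after = renal_vascular_resistance(map_after, 554.0)  # RBF fell by ~46 ml/min

print(f"RVR before: {rvr_before:.3f} mmHg/(ml/min)")
print(f"RVR after:  {rvr_after:.3f} mmHg/(ml/min)")
```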
An NGO-Implemented Community-Clinic Health Worker Approach to Providing Long-Term Care for Hypertension in a Remote Region of Southern India.
Poor blood pressure control results in tremendous morbidity and mortality in India, where the leading cause of death among adults is coronary heart disease. Despite having little formal education, community health workers (CHWs) are integral to successful public health interventions in India and other low- and middle-income countries that have a shortage of trained health professionals. Training CHWs to screen for and manage chronic hypertension, with support from trained clinicians, offers an excellent opportunity for effecting system-wide change in the hypertension-related burden of disease. In this article, we describe the development of a program that trained CHWs between 2014 and 2015 in the tribal region of the Sittilingi Valley in southern India to identify hypertensive patients in the community, refer them for diagnosis and initial management in a physician-staffed clinic, and provide them with sustained lifestyle interventions and medications over multiple visits. We found that after 2 years, the CHWs had screened 7,176 people over age 18 for hypertension, 1,184 (16.5%) of whom screened as hypertensive. Of those 1,184 patients, 898 (75.8%) had achieved blood pressure control, defined as a systolic blood pressure less than 140 mmHg and a diastolic blood pressure less than 90 mmHg sustained over 3 consecutive visits. While all of the 24 trained CHWs reported confidence in checking blood pressure with a manual blood pressure cuff, 4 of the 24 reported occasional difficulty documenting blood pressure values because they were unable to write numbers properly. They compensated by asking other CHWs or members of their community to help with documentation. Our experience and findings suggest that a CHW blood pressure screening system linked to a central clinic can be a promising avenue for improving hypertension control rates in low- and middle-income countries.
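The control criterion stated above (SBP < 140 mmHg and DBP < 90 mmHg sustained over 3 consecutive visits) is simple to express programmatically. A minimal sketch follows, with hypothetical visit data for illustration.

```python
# Classify a patient as "controlled" under the program's definition:
# SBP < 140 mmHg and DBP < 90 mmHg on 3 consecutive visits.
# The visit data below is hypothetical, for illustration only.

from typing import List, Tuple

def is_controlled(visits: List[Tuple[int, int]], window: int = 3) -> bool:
    """visits: chronological (systolic, diastolic) readings in mmHg."""
    run = 0
    for sbp, dbp in visits:
        run = run + 1 if (sbp < 140 and dbp < 90) else 0
        if run >= window:
            return True
    return False

patient_a = [(152, 96), (138, 88), (134, 86), (130, 84)]  # controlled on last 3
patient_b = [(150, 94), (136, 88), (144, 90), (138, 86)]  # never 3 in a row

print(is_controlled(patient_a))  # True
print(is_controlled(patient_b))  # False
```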
On Convergence of Development Costs and Cost Models for Complex Spaceflight Instrument Electronics
Development costs of a few recent spaceflight instrument electrical and electronics subsystems have diverged from their respective heritage cost model predictions. The cost models used are Grass Roots, Price-H, and Parametric Model. These cost models originated in the military and industry around 1970 and were successfully adopted and patched by NASA on a mission-by-mission basis for years. However, the complexity of new instruments has recently changed rapidly, by orders of magnitude. This is most obvious in the complexity of a representative spaceflight instrument's electronics data system. It is now required to perform intermediate processing of digitized data apart from conventional processing of science phenomenon signals from multiple detectors. This involves on-board instrument formatting of computational operands from raw data (for example, images), multi-million operations per second on large volumes of data in reconfigurable hardware (in addition to processing on a general-purpose embedded or standalone instrument flight computer), as well as making decisions for on-board system adaptation and resource reconfiguration. The instrument data system is now tasked to perform more functions, such as forming packets and instrument-level data compression of more than one data stream, which are traditionally performed by the spacecraft command and data handling system. It is furthermore required that the electronics box for new complex instruments be developed for single-digit-watt power consumption, small size, and light weight, while delivering super-computing capabilities. The conflict between the actual development costs of newer complex instruments and the heritage cost model predictions for their electronics components seems irreconcilable. This paper addresses that conflict and an approach to its resolution by determining complexity parameters and a complexity index, and by describing their use in an enhanced cost model.
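To make the idea of a complexity-adjusted parametric estimate concrete, here is a purely illustrative sketch: a heritage cost estimate scaled by a weighted complexity index. The functional form, weights, and exponent are assumptions for illustration, not the model proposed in the paper.

```python
# Illustrative parametric cost adjustment: scale a heritage cost estimate
# by a complexity index. The functional form and coefficients here are
# hypothetical assumptions, not the enhanced model from the paper.

def complexity_index(weights: dict, params: dict) -> float:
    """Weighted sum of normalized complexity parameters (each in [0, 1])."""
    return sum(weights[k] * params[k] for k in weights)

def adjusted_cost(heritage_cost: float, index: float, exponent: float = 1.5) -> float:
    """Assumed super-linear growth of cost with complexity."""
    return heritage_cost * (1.0 + index) ** exponent

params = {
    "onboard_processing": 0.9,   # reconfigurable-hardware data processing
    "data_volume": 0.8,          # multi-stream compression and packetization
    "power_constraint": 0.7,     # single-digit-watt budget
}
weights = {"onboard_processing": 0.5, "data_volume": 0.3, "power_constraint": 0.2}

ci = complexity_index(weights, params)
print(f"complexity index: {ci:.2f}")
print(f"adjusted cost: {adjusted_cost(10.0, ci):.1f} (heritage estimate = 10.0)")
```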
Testing the Substrate-Envelope Hypothesis with Designed Pairs of Compounds
Acquired resistance to therapeutic agents is a significant barrier to the development of clinically effective treatments for diseases in which evolution occurs on clinical time scales, frequently arising from target mutations. We previously reported a general strategy to design effective inhibitors for rapidly mutating enzyme targets, which we demonstrated for HIV-1 protease inhibition [Altman et al. J. Am. Chem. Soc. 2008, 130, 6099–6113]. Specifically, we developed a computational inverse design procedure with the added constraint that designed inhibitors bind entirely inside the substrate envelope, a consensus volume occupied by natural substrates. The rationale for the substrate-envelope constraint is that it prevents designed inhibitors from making interactions beyond those required by substrates and thus limits the availability of mutations tolerated by substrates but not by designed inhibitors. The strategy resulted in subnanomolar inhibitors that bind robustly across a clinically derived panel of drug-resistant variants. To further test the substrate-envelope hypothesis, here we have designed, synthesized, and assayed derivatives of our original compounds that are larger and extend outside the substrate envelope. Our designs resulted in pairs of compounds that are very similar to one another, but one respects and one violates the substrate envelope. The envelope-respecting inhibitor demonstrates robust binding across a panel of drug-resistant protease variants, whereas the envelope-violating one binds tightly to wild type but loses affinity to at least one variant. This study provides strong support for the substrate-envelope hypothesis as a design strategy for inhibitors that reduce susceptibility to resistance mutations.
Funding: National Science Foundation (NSF grant 0821391); National Institute of General Medical Sciences (NIH grants GM066524, GM065418, GM082209); NIH grants AI41404 and AI43198.
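Computationally, the substrate-envelope constraint is a volume-containment test: a candidate inhibitor pose is acceptable only if its atoms lie inside the consensus volume occupied by the natural substrates. The grid-based sketch below is an assumed simplification (point atoms, a fixed voxel size, strict intersection), not the authors' implementation.

```python
# Grid-based substrate-envelope test: mark voxels occupied by every
# substrate structure, then require inhibitor atoms to fall only in
# marked voxels. Illustrative sketch; real implementations account
# for atomic radii and partial-consensus definitions of the envelope.

import numpy as np

VOXEL = 1.0  # grid spacing in Angstroms (assumed)

def voxelize(coords: np.ndarray) -> set:
    """Map atom coordinates (N x 3) to integer voxel indices."""
    return set(map(tuple, np.floor(coords / VOXEL).astype(int)))

def substrate_envelope(substrates: list) -> set:
    """Consensus volume: voxels occupied by all substrate structures."""
    envelope = voxelize(substrates[0])
    for coords in substrates[1:]:
        envelope &= voxelize(coords)
    return envelope

def respects_envelope(inhibitor: np.ndarray, envelope: set) -> bool:
    """True if every inhibitor atom lies inside the envelope."""
    return voxelize(inhibitor) <= envelope

# Hypothetical toy coordinates (real use would load aligned structures).
rng = np.random.default_rng(0)
subs = [rng.uniform(0, 3, size=(100, 3)) for _ in range(3)]
env = substrate_envelope(subs)
probe = rng.uniform(0, 3, size=(5, 3))
print(respects_envelope(probe, env))
```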
Interferometric imaging with the 32 element Murchison Wide-field Array
The Murchison Wide-field Array (MWA) is a low frequency radio telescope,
currently under construction, intended to search for the spectral signature of
the epoch of re-ionisation (EOR) and to probe the structure of the solar
corona. Sited in Western Australia, the full MWA will comprise 8192 dipoles
grouped into 512 tiles, and be capable of imaging the sky south of 40 degrees
declination, from 80 MHz to 300 MHz with an instantaneous field of view that is
tens of degrees wide and a resolution of a few arcminutes. A 32-station
prototype of the MWA has been recently commissioned and a set of observations
taken that exercise the whole acquisition and processing pipeline. We present
Stokes I, Q, and U images from two ~4 hour integrations of a field 20 degrees
wide centered on Pictor A. These images demonstrate the capacity and
stability of a real-time calibration and imaging technique employing the
weighted addition of warped snapshots to counter extreme wide field imaging
distortions.
Comment: Accepted for publication in PASP. This is the draft before journal typesetting corrections and proofs, so it contains formatting and journal-style errors, and has lower-quality figures to meet space requirements.
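The "weighted addition of warped snapshots" mentioned above can be sketched as follows: each snapshot image is warped to a common frame and accumulated together with its weight map, then the accumulator is normalized. In this numpy sketch an identity warp stands in for the real coordinate regridding, so it is illustrative only.

```python
# Weighted addition of warped snapshots: accumulate warp(image * weight)
# and warp(weight), then divide. The warp here is a placeholder identity;
# a real imager would regrid each snapshot onto a common celestial frame.

import numpy as np

def warp(img: np.ndarray) -> np.ndarray:
    """Placeholder for the snapshot-to-sky coordinate transformation."""
    return img  # identity stand-in for illustration

def integrate(snapshots, weights):
    """Return the weighted mean of warped snapshots."""
    num = np.zeros_like(snapshots[0], dtype=float)
    den = np.zeros_like(snapshots[0], dtype=float)
    for img, w in zip(snapshots, weights):
        num += warp(img * w)
        den += warp(w)
    # Avoid division by zero where no snapshot contributed.
    return np.divide(num, den, out=np.zeros_like(num), where=den > 0)

rng = np.random.default_rng(1)
snaps = [rng.normal(size=(64, 64)) for _ in range(8)]
wgts = [np.full((64, 64), 1.0) for _ in snaps]  # e.g., primary-beam weights
image = integrate(snaps, wgts)
print(image.shape, image.mean())
```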
The Murchison Widefield Array
It is shown that the excellent Murchison Radio-astronomy Observatory site
allows the Murchison Widefield Array to employ a simple RFI blanking scheme and
still calibrate visibilities and form images in the FM radio band. The
techniques described are running autonomously in our calibration and imaging
software, which is currently being used to process an FM-band survey of the
entire southern sky.
Comment: Accepted for publication in Proceedings of Science [PoS(RFI2010)016]. 6 pages and 3 figures. Presented at RFI2010, the Third Workshop on RFI Mitigation in Radio Astronomy, 29-31 March 2010, Groningen, The Netherlands.
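A "simple RFI blanking scheme" of the kind referenced can be sketched as threshold-based flagging: samples whose power exceeds a robust estimate of the quiescent level are zeroed before further processing. The statistics and the 5-sigma threshold below are assumed choices for illustration, not the MWA pipeline's actual parameters.

```python
# Simple threshold-based RFI blanking: flag spectral channels whose power
# exceeds a robust (median/MAD) estimate of the quiescent level, then
# zero them. Threshold choice is an assumption for illustration.

import numpy as np

def blank_rfi(power: np.ndarray, nsigma: float = 5.0) -> np.ndarray:
    """power: 1-D spectrum. Returns a copy with RFI channels zeroed."""
    med = np.median(power)
    mad = np.median(np.abs(power - med))
    sigma = 1.4826 * mad  # MAD -> std equivalent for Gaussian noise
    flagged = power > med + nsigma * sigma
    cleaned = power.copy()
    cleaned[flagged] = 0.0
    return cleaned

rng = np.random.default_rng(2)
spectrum = rng.chisquare(df=4, size=1024)  # noise-like band
spectrum[300:305] += 50.0                  # injected narrow-band RFI
print(int(np.count_nonzero(blank_rfi(spectrum) == 0.0)), "channels blanked")
```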
A Process for Co-Designing Educational Technology Systems for Refugee Children
There is a growing interest in the potential for technology to facilitate emergency education of refugee children. However, designing in this space requires knowledge of the displaced population and the contextual dynamics surrounding it. Design should therefore be informed both by existing research across relevant disciplines and by the practical experience of those who are on the ground facing the problem in real life. This paper describes a process for designing appropriate technology for these settings. The process draws on literature from emergency education, student engagement and motivation, educational technology, and participatory design. We emphasise a thorough understanding of the problem definition, the nature of the emergency, and the socio-cultural aspects that can inform the design process. We describe how this process was implemented, leading to the design of a digital learning space for children living in a refugee camp in Greece. The process involved different groups of participants, such as social workers, parents, and children.
The Murchison Widefield Array: Design Overview
The Murchison Widefield Array (MWA) is a dipole-based aperture array
synthesis telescope designed to operate in the 80-300 MHz frequency range. It
is capable of a wide range of science investigations, but is initially focused
on three key science projects. These are detection and characterization of
3-dimensional brightness temperature fluctuations in the 21cm line of neutral
hydrogen during the Epoch of Reionization (EoR) at redshifts from 6 to 10,
solar imaging and remote sensing of the inner heliosphere via propagation
effects on signals from distant background sources, and high-sensitivity
exploration of the variable radio sky. The array design features 8192
dual-polarization broad-band active dipoles, arranged into 512 tiles comprising
16 dipoles each. The tiles are quasi-randomly distributed over an aperture
1.5km in diameter, with a small number of outliers extending to 3km. All
tile-tile baselines are correlated in custom FPGA-based hardware, yielding a
Nyquist-sampled instantaneous monochromatic uv coverage and unprecedented point
spread function (PSF) quality. The correlated data are calibrated in real time
using novel position-dependent self-calibration algorithms. The array is
located in the Murchison region of outback Western Australia. This region is
characterized by extremely low population density and a superbly radio-quiet
environment, allowing full exploitation of the instrumental capabilities.
Comment: 9 pages, 5 figures, 1 table. Accepted for publication in Proceedings of the IEEE.
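Correlating all tile-tile baselines scales quadratically with tile count, which is what drives the custom FPGA hardware requirement. A quick sketch of the arithmetic:

```python
# Number of cross-correlation baselines for an N-element array: N(N-1)/2.
# Each baseline contributes one instantaneous uv sample per polarization
# product and frequency channel.

def n_baselines(n_tiles: int) -> int:
    return n_tiles * (n_tiles - 1) // 2

print(n_baselines(32))    # 496 for the 32-tile prototype
print(n_baselines(512))   # 130816 for the full 512-tile array
```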
Cluster Lenses
Clusters of galaxies are the most recently assembled, massive, bound
structures in the Universe. As predicted by General Relativity, given their
masses, clusters strongly deform space-time in their vicinity. Clusters act as
some of the most powerful gravitational lenses in the Universe. Light rays
traversing through clusters from distant sources are hence deflected, and the
resulting images of these distant objects therefore appear distorted and
magnified. Lensing by clusters occurs in two regimes, each with unique
observational signatures. The strong lensing regime is characterized by effects
readily seen by eye, namely, the production of giant arcs, multiple-images, and
arclets. The weak lensing regime is characterized by small deformations in the
shapes of background galaxies only detectable statistically. Cluster lenses
have been exploited successfully to address several important current questions
in cosmology: (i) the study of the lens(es) - understanding cluster mass
distributions and issues pertaining to cluster formation and evolution, as well
as constraining the nature of dark matter; (ii) the study of the lensed objects
- probing the properties of the background lensed galaxy population - which is
statistically at higher redshifts and of lower intrinsic luminosity thus
enabling the probing of galaxy formation at the earliest times right up to the
Dark Ages; and (iii) the study of the geometry of the Universe - as the
strength of lensing depends on the ratios of angular diameter distances between
the lens, source and observer, lens deflections are sensitive to the value of
cosmological parameters and offer a powerful geometric tool to probe Dark
Energy. In this review, we present the basics of cluster lensing and provide a
current status report of the field.
Comment: About 120 pages. Published in Open Access at: http://www.springerlink.com/content/j183018170485723/ . arXiv admin note: text overlap with arXiv:astro-ph/0504478 and arXiv:1003.3674 by other authors.
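The statement that lensing strength depends on ratios of angular diameter distances can be made concrete with the standard critical surface density of lensing; this is textbook lensing theory rather than a formula quoted from the review:

```latex
\[
\Sigma_{\mathrm{cr}} = \frac{c^{2}}{4\pi G}\,\frac{D_{s}}{D_{l}\,D_{ls}},
\qquad
\kappa(\vec{\theta}) = \frac{\Sigma(\vec{\theta})}{\Sigma_{\mathrm{cr}}}
\]
```

Here D_l, D_s, and D_ls are the angular diameter distances to the lens, to the source, and between lens and source; the convergence κ, and hence the strength of the lensing effect, depends on their ratio. Because these distances are functions of the cosmological parameters, measured lens deflections become a geometric probe of dark energy, as the review describes.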
Relationship between qualitative physics and fuzzy logic in natural subsystems
The purpose of this research is to compare two approaches to representing and controlling imprecisely specified systems: qualitative physics and fuzzy logic. It describes the conceptual connection between the two and identifies the key features that make each approach significant. In the early stages of product development and forecasting, a large number of imprecise inputs must be used, and such inputs also arise naturally in physical systems. This research therefore develops the relationship between qualitative physics and fuzzy logic in terms of producing predictive outputs from limited, logical resources. Finally, the relationship between qualitative physics and fuzzy logic processes is demonstrated with the support of a selected natural subsystem.
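To ground the fuzzy-logic side of the comparison, here is a minimal sketch of a fuzzy membership function and a one-rule evaluation. The variable, breakpoints, and rule are hypothetical illustrations, not drawn from the paper.

```python
# Minimal fuzzy-logic fragment: a triangular membership function and a
# single rule evaluated on a crisp input. All variables and breakpoints
# are hypothetical illustrations.

def triangular(x: float, a: float, b: float, c: float) -> float:
    """Membership rising from a to peak b, falling to c; 0 outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Fuzzy sets for a hypothetical "water level" variable (in metres).
low = lambda x: triangular(x, 0.0, 0.001, 1.0)
high = lambda x: triangular(x, 1.0, 1.999, 2.0)

level = 1.4
# Rule: IF level is high THEN open valve (firing degree = mu_high(level)).
print(f"mu_low({level}) = {low(level):.2f}, mu_high({level}) = {high(level):.2f}")
```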
