Exponential dichotomies of evolution operators in Banach spaces
This paper considers three dichotomy concepts (exponential dichotomy, uniform
exponential dichotomy and strong exponential dichotomy) in the general context
of non-invertible evolution operators in Banach spaces. Connections between
these concepts are illustrated. Using the notion of Green function, we give
necessary conditions and sufficient conditions for strong exponential dichotomy.
Some illustrative examples are presented to show that the converses of some
implication-type theorems do not hold
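For context, the following is the standard textbook formulation of a uniform exponential dichotomy for an evolution operator; the paper's three dichotomy variants refine this condition, and the exact constants and projection families used there may differ.

```latex
% Standard uniform exponential dichotomy (textbook formulation; the paper's
% variants may differ in detail): an evolution operator U(t,s), t >= s, on a
% Banach space X admits a uniform exponential dichotomy if there exist
% projections P(t) compatible with U, i.e. U(t,s)P(s) = P(t)U(t,s), and
% constants N >= 1, nu > 0 such that, for all t >= s and all x in X,
\begin{align*}
  \|U(t,s)P(s)x\|     &\le N\, e^{-\nu (t-s)}\, \|P(s)x\|,\\
  \|U(t,s)(I-P(s))x\| &\ge \tfrac{1}{N}\, e^{\nu (t-s)}\, \|(I-P(s))x\|.
\end{align*}
```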
Effect of maleic anhydride–aniline derivative buffer layer on the properties of flexible substrate heterostructures: Indium tin oxide/nucleic acid base/metal
This paper presents investigations of the properties of guanine (G)- and cytosine (C)-based heterostructures deposited on flexible substrates. The effects of two types of maleic anhydride–aniline derivative buffer layers (maleic anhydride-cyano aniline or maleic anhydride-2,4 dinitroaniline), deposited between the indium tin oxide and the (G) or (C) layer, on the optical and electrical properties of the heterostructures have been identified. The heterostructures containing a film of maleic anhydride-2,4 dinitroaniline showed good transparency and low photoluminescence in the visible range. This buffer layer increased the conductance only in the heterostructures based on (G) and (C) deposited on biaxially-oriented polyethylene terephthalate substrates
Choreographies in Practice
Choreographic Programming is a development methodology for concurrent
software that guarantees correctness by construction. The key to this paradigm
is to disallow mismatched I/O operations in programs, called choreographies,
and then mechanically synthesise distributed implementations in terms of
standard process models via a mechanism known as EndPoint Projection (EPP).
Despite the promise of choreographic programming, there is still a lack of
practical evaluations that illustrate the applicability of choreographies to
concrete computational problems with standard concurrent solutions. In this
work, we explore the potential of choreographies by using Procedural
Choreographies (PC), a model that we recently proposed, to write distributed
algorithms for sorting (Quicksort), solving linear equations (Gaussian
elimination), and computing the Fast Fourier Transform. We discuss the lessons
learned from this experiment and give possible directions for the use and
future improvement of choreography languages
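To make the idea of EndPoint Projection concrete, the toy sketch below treats a choreography as a single global list of communications and mechanically derives each process's local send/receive program. This is an illustrative Python sketch only; the process names and the project function are assumptions, and it is not the Procedural Choreographies (PC) language used in the paper.

```python
# Toy illustration of EndPoint Projection (EPP): a choreography is a single
# global program of communications; local send/receive code for each process
# is derived mechanically from it.  NOT the PC language from the paper.
from collections import defaultdict

# A choreography as an ordered list of communications: (sender, receiver, variable).
choreography = [
    ("alice", "bob", "x"),
    ("bob", "carol", "y"),
    ("carol", "alice", "z"),
]

def project(choreography):
    """Derive a local program (list of send/recv actions) for each process."""
    local = defaultdict(list)
    for sender, receiver, var in choreography:
        # Each global communication becomes one matched send and one matched
        # receive, so mismatched I/O cannot arise in the projected code.
        local[sender].append(f"send {var} to {receiver}")
        local[receiver].append(f"recv {var} from {sender}")
    return dict(local)

if __name__ == "__main__":
    for process, actions in project(choreography).items():
        print(process, "->", actions)
```

Running the sketch prints one local action list per process, mirroring how EPP synthesises a process-model implementation for each participant.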
Effect of nano-patterning on the properties of the organic heterostructures prepared on Si substrate
The EuroHeart Failure survey programme—a survey on the quality of care among patients with heart failure in Europe Part 1: patient characteristics and diagnosis
The European Society of Cardiology (ESC) has published guidelines for the investigation of patients with suspected heart failure and, if the diagnosis is proven, their subsequent management. Hospitalisation provides a key point of care at which time diagnosis and treatment may be refined to improve outcome for a group of patients with a high morbidity and mortality. However, little international data exists to describe the features and management of such patients. Accordingly, the EuroHeart Failure survey was conducted to ascertain if appropriate tests were being performed with which to confirm or refute a diagnosis of heart failure and how this influenced subsequent management.
Methods The survey screened consecutive deaths and discharges during 2000-2001 predominantly from medical wards over a 6-week period in 115 hospitals from 24 countries belonging to the ESC, to identify patients with known or suspected heart failure.
Results A total of 46,788 deaths and discharges were screened from which 11,327 (24%) patients were enrolled with suspected or confirmed heart failure. Forty-seven percent of those enrolled were women. Fifty-one percent of women and 30% of men were aged >75 years. Eighty-three percent of patients had a diagnosis of heart failure made on or prior to the index admission. Heart failure was the principal reason for admission in 40%. The great majority of patients (>90%) had had an ECG, chest X-ray, haemoglobin and electrolytes measured as recommended in ESC guidelines, but only 66% had ever had an echocardiogram. Left ventricular ejection fraction had been measured in 57% of men and 41% of women, usually by echocardiography (84%) and was <40% in 51% of men but only in 28% of women. Forty-five percent of women and 22% of men were reported to have normal left ventricular systolic function by qualitative echocardiographic assessment. A substantial proportion of patients had alternative explanations for heart failure other than left ventricular systolic or diastolic dysfunction, including valve disease. Within 12 weeks of discharge, 24% of patients had been readmitted. A total of 1408 of 10,434 (13.5%) patients died between admission and 12 weeks follow-up.
Conclusions Known or suspected heart failure comprises a large proportion of admissions to medical wards and such patients are at high risk of early readmission and death. Many of the basic investigations recommended by the ESC were usually carried out, although it is not clear whether this was by design or part of a general routine for all patients being admitted regardless of diagnosis. The investigation most specific for patients with suspected heart failure (echocardiography) was performed less frequently, suggesting that the diagnosis of heart failure is still relatively neglected. Most men but a minority of women who underwent investigation of cardiac function had evidence of moderate or severe left ventricular dysfunction, the main target of current advances in the treatment of heart failure. Considerable diagnostic uncertainty remains for many patients with suspected heart failure, even after echocardiography, which must be resolved in order to target existing and new therapies and services effectively.
The EuroHeart Failure Survey programme—a survey on the quality of care among patients with heart failure in Europe Part 2: treatment
National surveys suggest that treatment of heart failure in daily practice differs from guidelines and is characterized by underuse of recommended medications. Accordingly, the EuroHeart Failure Survey was conducted to ascertain how patients hospitalized for heart failure are managed in Europe and if national variations occur in the treatment of this condition.
Methods The survey screened discharge summaries of 11,304 patients over a 6-week period in 115 hospitals from 24 countries belonging to the ESC to study their medical treatment.
Results Diuretics (mainly loop diuretics) were prescribed in 86.9% of patients, followed by ACE inhibitors (61.8%), beta-blockers (36.9%), cardiac glycosides (35.7%), nitrates (32.1%), calcium channel blockers (21.2%) and spironolactone (20.5%). 44.6% of the population used four or more different drugs. Only 17.2% received the combination of a diuretic, an ACE inhibitor and a beta-blocker. Important local variations were found in the rate of prescription of ACE inhibitors and particularly beta-blockers. Daily dosage of ACE inhibitors and particularly of beta-blockers was on average below the recommended target dose. Modelling analysis of the prescription of treatments indicated that the aetiology of heart failure, age, co-morbid factors and type of hospital ward influenced the rate of prescription. Prescription of these agents was decreased in patients aged over 70 years and in those with respiratory disease, and increased in cardiology wards, in ischaemic heart failure and in male subjects. Prescription of cardiac glycosides was significantly increased in patients with supraventricular tachycardia/atrial fibrillation. Finally, the rate of prescription of antithrombotic agents was increased in the presence of supraventricular arrhythmia, ischaemic heart disease and in male subjects, but was decreased in patients over 70.
Conclusion Our results suggest that the prescription of recommended medications including ACE inhibitors and beta-blockers remains limited and that the daily dosage remains low, particularly for beta-blockers. The survey also identifies several important factors, including age, gender, type of hospital ward and co-morbid factors, which influence the prescription of heart failure medication at discharge
New Mexico Trauma System Funding Strategy
Trauma is the leading cause of death among individuals 1 to 44 years of age. Nationally, one individual dies of traumatic injuries every three minutes. In the United States, the financial impact of trauma is estimated to be approximately $671 billion annually, spent on direct trauma care and associated costs such as lost productive days and rehabilitation. The New Mexico trauma system has developed dramatically over the past 10 years. In 2007, the state had only three designated trauma centers; today there are 12. However, over the same period, trauma system funding saw an equally dramatic decrease of approximately 70%. Having a functional trauma system in New Mexico is an absolute necessity. The purpose of this project was to identify potential sources of sustainable revenue for New Mexico's trauma system and to take the initial steps towards introducing legislation that will secure trauma system funding for the future. The work on this project resulted in the initiation of the first legislative step of this process
Data augmentation and transfer learning to classify malware images in a deep learning context
In the past few years, malware classification techniques have shifted from shallow traditional machine learning models to deeper neural network architectures. A key benefit of some of these architectures is their ability to work with raw data, thanks to automatic feature extraction; this reduces the technical expertise and initial pre-processing effort needed to build the models. Nevertheless, this advantage comes with a drawback: deep learning models require huge quantities of data to generalize well, and the amount of data needed to train a deep network without overfitting is often unobtainable for malware analysts. We take inspiration from image-based data augmentation techniques and apply a sequence of semantics-preserving syntactic code transformations (obfuscations) to a small dataset of programs to generate a larger one. We then design two learning models, a convolutional neural network and a bi-directional long short-term memory network, and train them on images extracted from compiled binaries of the newly generated dataset. Through transfer learning we then take the features learned from the obfuscated binaries and train the models on two state-of-the-art malware datasets, each containing around 10,000 samples. Our models achieve up to 98.5% accuracy on the test set, on par with or better than current state-of-the-art approaches, validating the approach
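As a rough illustration of the pipeline described above, the sketch below pre-trains a small convolutional network on an obfuscation-augmented image dataset and then reuses its frozen feature extractor on a target malware dataset. The image size, layer sizes, class counts and the load_augmented_dataset/load_target_dataset loaders are assumptions made for illustration; they are not the authors' exact architectures or data.

```python
# Minimal transfer-learning sketch (illustrative assumptions, not the
# paper's exact models): pre-train a CNN on obfuscation-augmented malware
# images, then reuse its frozen features on a smaller target dataset.
from tensorflow.keras import layers, models

IMG_SHAPE = (64, 64, 1)  # assumed grayscale "binary-as-image" resolution

def build_cnn(num_classes):
    """Small CNN for malware images (sizes are illustrative)."""
    return models.Sequential([
        layers.Input(shape=IMG_SHAPE),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu", name="features"),
        layers.Dense(num_classes, activation="softmax"),
    ])

def transfer(base, num_target_classes):
    """Reuse the pre-trained feature extractor; retrain only a new head."""
    feature_extractor = models.Model(base.input,
                                     base.get_layer("features").output)
    feature_extractor.trainable = False  # freeze features learned on augmented data
    return models.Sequential([
        feature_extractor,
        layers.Dense(num_target_classes, activation="softmax"),
    ])

# 1) Pre-train on the obfuscation-augmented dataset (hypothetical loader):
# base = build_cnn(num_classes=20)
# base.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
#              metrics=["accuracy"])
# base.fit(load_augmented_dataset(), epochs=10)         # assumption
# 2) Transfer to a target malware dataset of ~10,000 samples:
# head = transfer(base, num_target_classes=9)
# head.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
#              metrics=["accuracy"])
# head.fit(load_target_dataset(), epochs=10)            # assumption
```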
- …
