
    A geometric comparison of aerofoil shape parameterisation methods


    Using coloured filters to reduce the symptoms of visual stress in children with reading delay

    Background: Meares-Irlen Syndrome (MIS), otherwise known as “visual stress”, is one condition that can cause difficulties with reading. Aim: This study aimed to compare the effect of two coloured-filter systems on the symptoms of visual stress in children with reading delay. Methods: The study design was a pre-test, post-test, randomised head-to-head comparison of two filter systems on the symptoms of visual stress in schoolchildren. A total of 68 UK mainstream schoolchildren with significant impairment in reading ability completed the study. Results: The filter systems appeared to have a large effect on reported symptoms between the pre-test and the three-month post-test (d = 2.5, r = 0.78). Both filter types appeared to have large effects (Harris d = 1.79, r = 0.69; DRT d = 3.22, r = 0.85). Importantly, 35% of participants reported that their symptoms had resolved completely, and 72% of the 68 children appeared to gain improvements in three or more visual stress symptoms. Conclusion and significance: The reduction in symptoms, which appeared to be brought about by the use of coloured filters, eased the visual discomfort experienced by these children when reading. This type of intervention therefore has the potential to facilitate occupational engagement.
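    The paired effect sizes quoted above (each Cohen's d with a corresponding r) are consistent with the standard conversion between d and the point-biserial correlation, r = d / sqrt(d^2 + 4), which assumes equal group sizes. A minimal sketch reproducing the reported pairs, using only the values given in the abstract:

```python
import math

def d_to_r(d: float) -> float:
    """Convert Cohen's d to a correlation coefficient r, assuming
    equal group sizes: r = d / sqrt(d^2 + 4)."""
    return d / math.sqrt(d * d + 4)

# Effect sizes as reported in the abstract
for label, d in [("overall", 2.5), ("Harris", 1.79), ("DRT", 3.22)]:
    print(f"{label}: d = {d} -> r = {d_to_r(d):.2f}")
# overall: d = 2.5  -> r = 0.78
# Harris:  d = 1.79 -> r = 0.67  (abstract reports 0.69; unequal
#                                 group sizes would shift this slightly)
# DRT:     d = 3.22 -> r = 0.85
```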

    A timeband framework for modelling real-time systems

    Complex real-time systems must integrate physical processes with digital control, human operation and organisational structures. New scientific foundations are required for specifying, designing and implementing these systems. One key challenge is to cope with the wide range of time scales and dynamics inherent in such systems. To exploit the unique properties of time, with the aim of producing more dependable computer-based systems, it is desirable to explicitly identify distinct time bands in which the system is situated. Such a framework enables the temporal properties and associated dynamic behaviour of existing systems to be described, and the requirements for new or modified systems to be specified. A system model based on a finite set of distinct time bands is motivated and developed in this paper.
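    To make the notion of time bands concrete, one might model each band as a named temporal granularity, with durations converted between bands. This is an illustrative sketch only, not the paper's formal model; all names and values are hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TimeBand:
    """A band is characterised by the granularity of its time unit:
    activities within a band take a whole number of these units,
    while events are treated as instantaneous at this granularity."""
    name: str
    granularity_s: float  # size of one time unit, in seconds

    def to_units(self, seconds: float) -> float:
        return seconds / self.granularity_s

# A finite set of distinct bands spanning a system's dynamics
BANDS = [
    TimeBand("hardware", 1e-9),       # nanosecond-scale circuit events
    TimeBand("software", 1e-3),       # millisecond-scale task scheduling
    TimeBand("operator", 1.0),        # second-scale human actions
    TimeBand("organisation", 86400),  # day-scale procedures
]

# An activity that is instantaneous in a coarse band acquires
# measurable duration when viewed from a finer band below it.
print(BANDS[2].to_units(0.25))  # a 250 ms software activity spans
                                # 0.25 of an operator-band unit
```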

    Manipulating infrared photons using plasmons in transparent graphene superlattices

    Superlattices are artificial periodic nanostructures which can control the flow of electrons. Their operation typically relies on the periodic modulation of the electric potential in the direction of electron wave propagation. Here we demonstrate transparent graphene superlattices which can manipulate infrared photons using the collective oscillations of carriers, i.e., plasmons of the ensemble of multiple graphene layers. The superlattice is formed by depositing alternating wafer-scale graphene sheets and thin insulating layers, then patterning them all together into 3-dimensional photonic-crystal-like structures. We demonstrate experimentally that the collective oscillation of Dirac fermions in such graphene superlattices is unambiguously nonclassical: compared with doping single-layer graphene, distributing carriers into multiple graphene layers strongly enhances the plasmonic resonance frequency and magnitude, which is fundamentally different from the behaviour of a conventional semiconductor superlattice. This property allows us to construct widely tunable far-infrared notch filters with an 8.2 dB rejection ratio and terahertz linear polarizers with a 9.5 dB extinction ratio, using a superlattice with merely five graphene atomic layers. Moreover, an unpatterned superlattice shields up to 97.5% of electromagnetic radiation below 1.2 terahertz. This demonstration also opens an avenue for the realization of other transparent mid- and far-infrared photonic devices such as detectors, modulators, and 3-dimensional meta-material systems.
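    The quoted figures of merit are decibel power ratios; a worked check of the numbers in the abstract (not part of the paper) converts them to linear factors:

```python
import math

def db_to_ratio(db: float) -> float:
    """Linear power ratio corresponding to a decibel value: 10^(dB/10)."""
    return 10 ** (db / 10)

def fraction_blocked_to_db(frac: float) -> float:
    """Attenuation in dB when a fraction `frac` of power is blocked."""
    return -10 * math.log10(1 - frac)

print(db_to_ratio(8.2))               # notch filter: ~6.6x power suppression
print(db_to_ratio(9.5))               # polarizer: ~8.9x ratio between polarizations
print(fraction_blocked_to_db(0.975))  # 97.5% shielding ~= 16 dB attenuation
```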

    Long term time variability of cosmic rays and possible relevance to the development of life on Earth

    An analysis is made of the manner in which the cosmic ray intensity at Earth has varied over its existence, and of its possible relevance to both the origin and the evolution of life. Much of the analysis relates to 'high energy' cosmic rays (E > 10^14 eV = 0.1 PeV) and their variability due to the changing proximity of the solar system to supernova remnants, which are generally believed to be responsible for most cosmic rays up to PeV energies. It is pointed out that, on a statistical basis, there will have been considerable variations in the likely 100 My between the Earth's biosphere reaching reasonable stability and the onset of very elementary life. Interestingly, there is the increasingly strong possibility that PeV cosmic rays are responsible for the initiation of terrestrial lightning strokes, and the possibility arises of considerable increases in the frequency of lightning and thereby in the formation of some of the complex molecules which are the 'building blocks of life'. Attention is also given to the well-known generation by lightning strokes of oxides of nitrogen, which are poisonous to animal life but helpful to plant growth; here, too, the violent swings of cosmic ray intensities may have had relevance to evolutionary changes. A particular variant of the cosmic ray acceleration model, put forward by us, predicts an increase in the lightning rate in the past, and this has been sought in Korean historical records. Finally, the time dependence of the overall cosmic ray intensity, which manifests itself mainly at sub-10 GeV energies, has been examined. The relevance of cosmic rays to the 'global electrical circuit' points to the importance of this concept.

    Determination of optimal drug dose and light dose index to achieve minimally invasive focal ablation of localised prostate cancer using WST11-vascular-targeted photodynamic (VTP) therapy

    Objective: To determine the optimal drug and light dose for prostate ablation using WST11 (TOOKAD® Soluble) vascular-targeted photodynamic (VTP) therapy in men with low-risk prostate cancer. Patients and Methods: In all, 42 men with low-risk prostate cancer were enrolled in the study, but two who underwent anaesthesia for the procedure did not receive the drug or light dose. Thus, 40 men received a single dose of 2, 4 or 6 mg/kg WST11 activated by 200 J/cm of 753 nm light. WST11 was given as a 10-min intravenous infusion. The light dose was delivered using cylindrical diffusing fibres within hollow plastic needles positioned in the prostate under transrectal ultrasonography (TRUS) guidance with a brachytherapy template. Magnetic resonance imaging (MRI) was used to assess treatment effect at 7 days, with assessment of urinary function (International Prostate Symptom Score [IPSS]), sexual function (International Index of Erectile Function [IIEF]) and adverse events at 7 days and at 1, 3 and 6 months after VTP. TRUS-guided biopsies were taken at 6 months. Results: In all, 39 of the 40 treated men completed the follow-up. The Day-7 MRI showed maximal treatment effect (95% of the planned treatment volume) in men who had a WST11 dose of 4 mg/kg, a light dose of 200 J/cm and a light density index (LDI) of >1. In the 12 men treated with these parameters, the negative biopsy rate at 6 months was 10/12 (83%), compared with 10/26 (45%) for the men who had either a different drug dose (10 men) or an LDI of <1 (16 men). Transient urinary symptoms were seen in most of the men, with no significant difference in IPSS score between baseline and 6 months after VTP. IIEF scores were not significantly different between baseline and 6 months after VTP. Conclusion: Treatment with 4 mg/kg TOOKAD Soluble activated by 753 nm light at a dose of 200 J/cm and an LDI of >1 resulted in a treatment effect in 95% of the planned treatment volume and a negative biopsy rate at 6 months of 10/12 men (83%).

    A tri-dimensional approach for auditing brand loyalty

    Over the past twenty years, brand loyalty has been an important topic for both marketing practitioners and academics. While practitioners have produced proprietary brand-loyalty audit models, there has been little academic research to make transparent the methodology that underpins these audits and to enable practitioners to understand, develop and conduct their own audits. In this paper, we propose a framework for a brand loyalty audit that uses a tri-dimensional approach to brand loyalty, comprising behavioural loyalty and the two components of attitudinal loyalty: emotional and cognitive loyalty. In allowing for different levels and intensities of brand loyalty, this tri-dimensional approach is important from a managerial perspective: loyalty strategies arising from a brand audit can be made more effective by targeting the market segments that demonstrate the most appropriate combination of brand loyalty components. We propose a matrix with three dimensions (emotional, cognitive and behavioural loyalty) and two levels (high and low loyalty) to facilitate a brand loyalty audit. To demonstrate this matrix, we use the example of financial services, in particular a rewards-based credit card.
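    Crossing three loyalty dimensions with two levels each yields 2^3 = 8 possible segments in the audit matrix. A minimal sketch enumerating them (the segment encoding and example interpretation are illustrative, not taken from the paper):

```python
from itertools import product

DIMENSIONS = ("emotional", "cognitive", "behavioural")
LEVELS = ("high", "low")

# Every combination of high/low across the three dimensions
# defines one cell of the audit matrix: 2**3 = 8 segments.
for combo in product(LEVELS, repeat=len(DIMENSIONS)):
    segment = dict(zip(DIMENSIONS, combo))
    print(segment)
# e.g. {'emotional': 'high', 'cognitive': 'high', 'behavioural': 'low'}
# might describe cardholders who like and trust the brand but rarely
# use the card, suggesting a usage-stimulation strategy for that cell.
```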

    Bell Correlations and the Common Future

    Reichenbach's principle states that in a causal structure, correlations of classical information can stem from a common cause in the common past or from a direct influence from one of the correlated events to the other. The difficulty of explaining Bell correlations through a mechanism in that spirit can be read as questioning either the principle or even its basis: causality. In the former case, the principle can be replaced by its quantum version, accepting an entangled state as a common cause, which leaves the phenomenon as mysterious as ever on the classical level (on which, after all, it occurs). If, more radically, the causal structure is questioned in principle, closed space-time curves may become possible that, as is argued in the present note, can give rise to non-local correlations if to-be-correlated pieces of classical information meet in the common future, which they must do if the correlation is to be detected in the first place. The result is a view resembling Brassard and Raymond-Robichaud's parallel-lives variant of Hermann's and Everett's relative-state formalism, avoiding "multiple realities".

    Control of hyperglycaemia in paediatric intensive care (CHiP): study protocol.

    BACKGROUND: There is increasing evidence that tight blood glucose (BG) control improves outcomes in critically ill adults. Children show similar hyperglycaemic responses to surgery or critical illness. However, it is not known whether tight control will benefit children, given maturational differences and a different disease spectrum. METHODS/DESIGN: The study is a randomised open trial with two parallel groups. It enrols children undergoing intensive care in the UK who are aged ≤16 years, are ventilated, have an arterial line in situ, and are receiving vasoactive support following injury, major surgery or critical illness, where such treatment is anticipated to continue for at least 12 hours. The trial will assess whether tight control increases the number of days alive and free of mechanical ventilation at 30 days, leads to improvement in a range of complications associated with intensive care treatment, and is cost-effective. Children in the tight control group will receive insulin by intravenous infusion, titrated to maintain BG between 4.0 and 7.0 mmol/L. Children in the control group will be treated according to a standard current approach to BG management. Children will be followed up to determine vital status and healthcare resource usage between discharge and 12 months post-randomisation. Information regarding overall health status, global neurological outcome, and attention and behavioural status will be sought from a subgroup with traumatic brain injury (TBI). A difference of 2 days in the number of ventilator-free days within the first 30 days post-randomisation is considered clinically important. Conservatively assuming a standard deviation of a week across both trial arms and a type I error of 1% (two-sided test), and allowing for non-compliance, a total sample size of 1,000 patients would give 90% power to detect this difference. To detect effect differences between cardiac and non-cardiac patients, a target sample size of 1,500 is required. An economic evaluation will assess whether the costs of achieving tight BG control are justified by subsequent reductions in hospitalisation costs. DISCUSSION: The relevance of tight glycaemic control in this population needs to be assessed formally before being accepted into standard practice.
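    The quoted sample size can be reproduced with a standard normal-approximation power calculation for comparing two means (difference = 2 ventilator-free days, SD = 7 days, two-sided alpha = 0.01, power = 0.90); the inflation from roughly 730 to 1,000 for non-compliance is the protocol's own allowance. A minimal sketch, assuming the protocol used this conventional formula:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(delta: float, sd: float, alpha: float, power: float) -> int:
    """Normal-approximation sample size per arm for a two-sample
    comparison of means: n = 2 * ((z_{1-a/2} + z_{1-b}) * sd / delta)^2."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # two-sided type I error
    z_beta = z.inv_cdf(power)           # type II error complement
    return ceil(2 * ((z_alpha + z_beta) * sd / delta) ** 2)

n = n_per_group(delta=2, sd=7, alpha=0.01, power=0.90)
print(n, 2 * n)  # ~365 per arm, ~730 in total before the
                 # non-compliance allowance that takes it to 1,000
```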

    The Road to Quantum Computational Supremacy

    We present an idiosyncratic view of the race for quantum computational supremacy. Google's approach and IBM's challenge are examined. An unexpected side-effect of the race is significant progress in designing fast classical algorithms. Quantum supremacy, if achieved, won't make classical computing obsolete.