A GPU-based survey for millisecond radio transients using ARTEMIS
Astrophysical radio transients are excellent probes of extreme physical
processes originating from compact sources within our Galaxy and beyond. Radio
frequency signals emitted from these objects provide a means to study the
intervening medium through which they travel. Next generation radio telescopes
are designed to explore the vast unexplored parameter space of high time
resolution astronomy, but require High Performance Computing (HPC) solutions to
process the enormous volumes of data that are produced by these telescopes. We
have developed a combined software/hardware solution (code-named ARTEMIS) for
real-time searches for millisecond radio transients, which uses GPU technology
to remove interstellar dispersion and detect millisecond radio bursts from
astronomical sources in real-time. Here we present an introduction to ARTEMIS.
We give a brief overview of the software pipeline, then focus specifically on
the intricacies of performing incoherent de-dispersion. We present results from
two brute-force algorithms. The first is a GPU based algorithm, designed to
exploit the L1 cache of the NVIDIA Fermi GPU. Our second algorithm is CPU based
and exploits the new AVX units in Intel Sandy Bridge CPUs.
Comment: 4 pages, 7 figures. To appear in the proceedings of ADASS XXI, ed. P. Ballester and D. Egret, ASP Conf. Ser.
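As a point of reference for the brute-force approach described above, the following is a minimal NumPy sketch of incoherent de-dispersion for a single trial dispersion measure. It illustrates the technique only; it is not the ARTEMIS GPU (L1-cache) or AVX implementation, and the array names and usage shown are assumptions.

```python
import numpy as np

# Dispersion constant in MHz^2 pc^-1 cm^3 s (standard value used in pulsar work).
K_DM = 4.148808e3

def dedisperse(dynspec, freqs_mhz, dm, tsamp):
    """Brute-force incoherent de-dispersion for one trial DM.

    dynspec   : (nchan, nsamp) array of detected power vs. channel and time
    freqs_mhz : (nchan,) array of channel centre frequencies in MHz
    dm        : trial dispersion measure in pc cm^-3
    tsamp     : sampling time in seconds
    Returns the de-dispersed time series (sum over channels).
    """
    f_ref = freqs_mhz.max()  # reference: the highest frequency arrives first
    delays = K_DM * dm * (freqs_mhz ** -2 - f_ref ** -2)  # extra delay in seconds
    shifts = np.round(delays / tsamp).astype(int)         # delay in samples
    out = np.zeros(dynspec.shape[1])
    for chan, shift in enumerate(shifts):
        # shift each channel earlier by its dispersion delay, then accumulate
        out += np.roll(dynspec[chan], -shift)
    return out

# A brute-force search simply repeats this over a grid of trial DMs, e.g.:
# peaks = [dedisperse(dynspec, freqs_mhz, dm, tsamp).max() for dm in dm_trials]
```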
Quantifying Information Leaks Using Reliability Analysis
We report on our work in progress on the use of reliability analysis to quantify information leaks. In recent work we proposed a software reliability analysis technique that uses symbolic execution and model counting to quantify the probability of reaching designated program states, e.g. assertion violations, under uncertain environmental conditions. The technique has many applications beyond reliability analysis, ranging from program understanding and debugging to the analysis of cyber-physical systems. In this paper we report on a novel application of the technique, namely Quantitative Information Flow (QIF) analysis. The goal of QIF is to measure the information leakage of a program using information-theoretic metrics such as Shannon entropy or Rényi entropy. We exploit the model counting engine of the reliability analyzer over symbolic program paths to compute an upper bound on the maximum leakage over all possible distributions of the confidential data. We have implemented our approach in a prototype tool, called QILURA, and explore its effectiveness on a number of case studies.
Keywords: Model Counting, Quantitative Information Flow, Reliability Analysis, Symbolic Execution. Location: San Jose, CA, USA.
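For context, here is a minimal Python sketch of the kind of channel-capacity bound the abstract describes: the secret inputs are partitioned by the observable output they produce, and the maximum leakage over all input distributions is bounded by log2 of the number of output classes. In QILURA the partition sizes come from symbolic execution and model counting; in this sketch the enumeration is concrete and the example program is hypothetical.

```python
import math
from collections import defaultdict

def leakage_upper_bound(program, secrets):
    """Channel-capacity style bound on leakage for a deterministic program.

    Secrets are grouped by the observable output they produce; the size of
    each group plays the role of a model count. The maximum leakage over all
    input distributions is bounded by log2 of the number of distinguishable
    outputs.
    """
    classes = defaultdict(int)
    for s in secrets:
        classes[program(s)] += 1  # per-output-class count (the "model count")
    return math.log2(len(classes))

# Hypothetical toy program: a PIN check leaks at most 1 bit per query,
# because it induces only two output classes (accept / reject).
check = lambda pin: pin == 1234
print(leakage_upper_bound(check, range(10000)))  # -> 1.0 bit
```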
Correlating low energy impact damage with changes in modal parameters: diagnosis tools and FE validation
This paper presents a basic experimental technique and simplified FE-based models for the detection, localization and quantification of impact damage in composite beams around the BVID level. Damage is detected through shifts in modal parameters. Localization of damage is done by a topology optimization tool, which showed that correct damage locations can be found rather efficiently for low-level damage. The novelty of this paper is that we develop an All In One (AIO) package dedicated to impact identification by modal analysis. The damaged zones in the FE models are updated by reducing the most sensitive material property in order to improve the experimental/numerical correlation of the frequency response functions. These approximate damage models (in terms of equivalent rigidity) give a simple degradation factor that can serve as a warning regarding structural safety.
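To illustrate the underlying idea of detecting damage from frequency shifts and of a stiffness degradation factor, here is a minimal lumped mass-spring sketch in Python. It is not the paper's FE model or updating procedure, and all values are illustrative.

```python
import numpy as np

def natural_frequencies(stiffness, mass):
    """Natural frequencies (Hz) of a fixed-free lumped mass-spring chain.

    stiffness : (n,) spring stiffnesses in N/m (spring 0 ties mass 0 to ground,
                spring i ties mass i-1 to mass i)
    mass      : (n,) lumped masses in kg
    """
    n = len(mass)
    K = np.zeros((n, n))
    K[0, 0] += stiffness[0]
    for i in range(1, n):
        k = stiffness[i]
        K[i - 1, i - 1] += k
        K[i, i] += k
        K[i - 1, i] -= k
        K[i, i - 1] -= k
    # symmetric generalized eigenproblem via M^{-1/2} K M^{-1/2}
    m_inv_sqrt = np.diag(1.0 / np.sqrt(mass))
    omega_sq = np.linalg.eigvalsh(m_inv_sqrt @ K @ m_inv_sqrt)
    return np.sqrt(np.abs(omega_sq)) / (2.0 * np.pi)

# Pristine vs. "damaged" model: a degradation factor applied to one element's
# stiffness stands in for the reduced material property in the FE update.
k_pristine = np.full(5, 1.0e6)
masses = np.full(5, 1.0)
k_damaged = k_pristine.copy()
k_damaged[2] *= 0.8  # 20 % local stiffness loss

shift = 1.0 - natural_frequencies(k_damaged, masses) / natural_frequencies(k_pristine, masses)
print(shift)  # relative drop of each natural frequency: the damage indicator
```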
ScotGrid: Providing an Effective Distributed Tier-2 in the LHC Era
ScotGrid is a distributed Tier-2 centre in the UK with sites in Durham,
Edinburgh and Glasgow. ScotGrid has undergone a huge expansion in hardware in
anticipation of the LHC and now provides more than 4MSI2K and 500TB to the LHC
VOs. Scaling up to this level of provision has brought many challenges to the
Tier-2 and we show in this paper how we have adopted new methods of organising
the centres, from fabric management and monitoring to remote management of
sites to management and operational procedures, to meet these challenges. We
describe how we have coped with different operational models at the sites,
where the Glasgow and Durham sites are managed "in house" but resources at
Edinburgh are managed as a central university resource. This required the
adoption of a different fabric management model at Edinburgh and a special
engagement with the cluster managers. Challenges arose from the different job
models of local and grid submission that required special attention to resolve.
We show how ScotGrid has successfully provided an infrastructure for ATLAS and
LHCb Monte Carlo production. Special attention has been paid to ensuring that
user analysis functions efficiently, which has required optimisation of local
storage and networking to cope with the demands of user analysis. Finally,
although these Tier-2 resources are pledged to the whole VO, we have
established close links with our local physics user communities as the
best way to ensure that the Tier-2 functions effectively as a part of the LHC
grid computing framework.
Comment: Preprint for 17th International Conference on Computing in High Energy and Nuclear Physics, 7 pages, 1 figure.
A Toy Model for Testing Finite Element Methods to Simulate Extreme-Mass-Ratio Binary Systems
Extreme mass ratio binary systems, binaries involving stellar mass objects
orbiting massive black holes, are considered to be a primary source of
gravitational radiation to be detected by the space-based interferometer LISA.
The numerical modelling of these binary systems is extremely challenging
because the scales involved span several orders of magnitude. One needs
to handle large wavelength scales comparable to the size of the massive black
hole and, at the same time, to resolve the scales in the vicinity of the small
companion where radiation reaction effects play a crucial role. Adaptive finite
element methods, in which quantitative control of errors is achieved
automatically by finite element mesh adaptivity based on a posteriori error
estimation, are a natural choice that has great potential for achieving the
high level of adaptivity required in these simulations. To demonstrate this, we
present the results of simulations of a toy model, consisting of a point-like
source orbiting a black hole under the action of a scalar gravitational field.
Comment: 29 pages, 37 figures. RevTeX 4.0. Minor changes to match the published version.
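As a toy illustration of finite element adaptivity driven by a posteriori error estimation (the general strategy the abstract advocates, not the paper's actual scheme), the following 1D Poisson sketch refines the elements where a residual-based indicator is largest. The peaked source term is a hypothetical stand-in for the disparate length scales near the small companion.

```python
import numpy as np

def solve_poisson_1d(nodes, f):
    """Piecewise-linear FEM for -u'' = f on [0, 1] with u(0) = u(1) = 0."""
    n = len(nodes)
    h = np.diff(nodes)
    A = np.zeros((n, n))
    b = np.zeros(n)
    for e in range(n - 1):
        A[e:e + 2, e:e + 2] += (1.0 / h[e]) * np.array([[1.0, -1.0], [-1.0, 1.0]])
        xm = 0.5 * (nodes[e] + nodes[e + 1])
        b[e:e + 2] += 0.5 * h[e] * f(xm)       # midpoint-rule load
    A[0, :] = 0.0; A[0, 0] = 1.0; b[0] = 0.0   # Dirichlet boundary conditions
    A[-1, :] = 0.0; A[-1, -1] = 1.0; b[-1] = 0.0
    return np.linalg.solve(A, b)

def error_indicators(nodes, f):
    """Residual-based a posteriori indicator ~ h_e^2 |f| per element
    (the discrete solution is piecewise linear, so its second derivative vanishes)."""
    h = np.diff(nodes)
    xm = 0.5 * (nodes[:-1] + nodes[1:])
    return h * h * np.abs(f(xm))

# A sharply peaked source is a crude stand-in for the short length scales
# near the small companion embedded in the large-scale background.
f = lambda x: 1.0 / ((x - 0.3) ** 2 + 1.0e-3)

nodes = np.linspace(0.0, 1.0, 11)
for _ in range(6):                        # adaptive loop: estimate, mark, refine
    eta = error_indicators(nodes, f)
    marked = eta > 0.5 * eta.max()        # mark the worst elements
    mids = 0.5 * (nodes[:-1] + nodes[1:])[marked]
    nodes = np.sort(np.concatenate([nodes, mids]))

u = solve_poisson_1d(nodes, f)
print(len(nodes), "nodes; smallest element:", np.diff(nodes).min())
```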
Reduced-Order Modeling of Turbulent Reacting Flows with Application to Ramjets and Scramjets
Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/90621/1/AIAA-50272-117.pdf
Automatic procedure for realistic 3D finite element modelling of human brain for bioelectromagnetic computations
Realistic computer modelling of biological objects requires building very accurate models based on geometric and material data and on the type and accuracy of the intended numerical analyses. This paper presents some of the automatic tools and algorithms that were used to build an accurate and realistic 3D finite element (FE) model of the whole brain. These models were used to solve the forward problem in magnetic field tomography (MFT) based on magnetoencephalography (MEG). The forward problem involves modelling and computing the magnetic fields produced by the human brain during cognitive processing. The geometric parameters of the model were obtained from accurate Magnetic Resonance Imaging (MRI) data and the material properties from Diffusion Tensor MRI (DTMRI) data. The 3D FE models of the brain built using this approach have been shown to be very accurate in terms of both geometric and material properties. The model is stored in Computer-Aided Parametrical Design (CAD) format, which allows it to be used with a wide range of analysis methods, such as the finite element method (FEM), the boundary element method (BEM), Monte Carlo simulations, etc. The generic model-building approach presented here could be used for accurate and realistic modelling of the human brain and many other biological objects.
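For orientation on the forward problem mentioned above, the sketch below evaluates the textbook primary magnetic field of a current dipole in an unbounded homogeneous conductor. The paper's FE models replace this analytic term with a field computed over the MRI/DTMRI-derived geometry and conductivities, so this is only a reference formula with illustrative numbers.

```python
import numpy as np

MU0 = 4.0 * np.pi * 1.0e-7  # vacuum permeability, T m / A

def dipole_primary_field(r, r_q, q):
    """Primary magnetic field of a current dipole q (A m) located at r_q,
    evaluated at sensor position r, for an unbounded homogeneous conductor:

        B(r) = mu0 / (4 pi) * q x (r - r_q) / |r - r_q|^3

    A realistic forward model replaces this with a field computed over the
    subject-specific geometry and conductivity distribution; this is only
    the analytic reference term.
    """
    d = np.asarray(r, dtype=float) - np.asarray(r_q, dtype=float)
    return MU0 / (4.0 * np.pi) * np.cross(q, d) / np.linalg.norm(d) ** 3

# Illustrative numbers: a 10 nA m tangential dipole 7 cm below a sensor.
B = dipole_primary_field(r=[0.0, 0.0, 0.10], r_q=[0.0, 0.0, 0.03],
                         q=[10e-9, 0.0, 0.0])
print(B)  # tesla; on the order of 1e-13 T, the typical MEG signal scale
```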
Pushouts in software architecture design
A classical approach to program derivation is to progressively extend a simple specification and then incrementally refine it to an implementation. We claim this approach is hard or impractical when reverse engineering legacy software architectures. We present a case study that shows optimizations and pushouts, in addition to refinements and extensions, are essential for practical stepwise development of complex software architectures.
Funding: NSF CCF 0724979; NSF CNS 0509338; NSF CCF 0917167; NSF DGE-1110007; FCT SFRH/BD/47800/2008; FCT UTAustin/CA/0056/200
General practitioners’ perspectives on campaigns to promote rapid help-seeking behaviour at the onset of rheumatoid arthritis
Objective. To explore general practitioners’ (GPs’) perspectives on public health campaigns to encourage people with the early symptoms of rheumatoid arthritis (RA) to seek medical help rapidly. Design. Nineteen GPs participated in four semi-structured focus groups. Focus groups were audio-recorded, transcribed verbatim, and analysed using thematic analysis. Results. GPs recognised the need for the early treatment of RA and identified that facilitating appropriate access to care was important. However, not all held the view that a delay in help-seeking was a clinically significant issue. Furthermore, many were concerned that the early symptoms of RA were often non-specific, and that current knowledge about the nature of symptoms at disease onset was inadequate to inform the content of a help-seeking campaign. They argued that a campaign might not be able to specifically target those who need to present urgently. Poorly designed campaigns were thought to have a negative impact on GPs’ workloads and would “clog up” the referral pathway for genuine cases of RA. Conclusions. GPs were supportive of strategies to improve access to rheumatological care and increase public awareness of RA symptoms. However, they identified important issues that need to be considered in developing a public health campaign that forms part of an overall strategy to reduce time to treatment for patients with new-onset RA. This study highlights the value of gaining GPs’ perspectives before launching health promotion campaigns.
Precision measurements of the top quark mass from the Tevatron in the pre-LHC era
The top quark is the heaviest of the six quarks of the Standard Model.
Precise knowledge of its mass is important for imposing constraints on a number
of physics processes, including interactions of the as yet unobserved Higgs
boson. The Higgs boson is the only missing particle of the Standard Model,
central to the electroweak symmetry breaking mechanism and generation of
particle masses. In this Review, experimental measurements of the top quark
mass accomplished at the Tevatron, a proton-antiproton collider located at the
Fermi National Accelerator Laboratory, are described. Topologies of top quark
events and methods used to separate signal events from background sources are
discussed. Data analysis techniques used to extract information about the top
mass value are reviewed. The combination of several most precise measurements
performed with the two Tevatron particle detectors, CDF and D0, yields a value of M_t = 173.2 ± 0.9 GeV/c².
Comment: This version contains the most up-to-date top quark mass average.
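As a simplified illustration of how several measurements are combined, the sketch below performs an inverse-variance weighted average. The actual Tevatron combination uses the BLUE method, which also propagates correlated systematic uncertainties between channels and experiments; the input numbers here are made up for illustration and are not the published CDF/D0 results.

```python
import numpy as np

def combine(values, errors):
    """Inverse-variance weighted average of independent measurements.

    A simplified stand-in for the full combination: correlations between
    systematic uncertainties are ignored here.
    """
    w = 1.0 / np.asarray(errors, dtype=float) ** 2
    mean = np.sum(w * np.asarray(values, dtype=float)) / np.sum(w)
    err = 1.0 / np.sqrt(np.sum(w))
    return mean, err

# Made-up inputs for illustration only (not the published measurements).
m_top, sigma = combine([173.0, 173.9, 172.5], [1.2, 1.5, 1.8])
print(f"combined m_t = {m_top:.1f} +/- {sigma:.1f} GeV/c^2")
```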
