Quasi-Periodic Oscillations in magnetars: linking variability and emission
I present recent results on flare emission in magnetars. Strong
quasi-periodic oscillations (QPOs) observed in the tails of giant magnetar
flares are frequently interpreted as evidence for global seismic oscillations.
I demonstrate that such a global oscillation is not directly observable in the
light curve. New work suggests that the amplitude of the strongest QPO stays
nearly constant across the rotation phases where it is observed, which I argue
suggests it is produced by an additional emission process from the star.
Comment: Proceedings of IAUS 291 "Neutron Stars and Pulsars: Challenges and Opportunities after 80 years", J. van Leeuwen (ed.); 4 pages, 3 figures
Low Energy Neutrino Measurements
Low-energy solar neutrino detection plays a fundamental role in understanding
both solar astrophysics and particle physics. After introducing the open
questions in both fields, we review the major results of the last two years,
and the expectations for the near future, from the Borexino, Super-Kamiokande,
SNO and KamLAND experiments, as well as from the upcoming (SNO+) and planned
(LENA) experiments. Scintillator neutrino detectors are also powerful detectors
of antineutrinos, such as those emitted by the Earth's crust and mantle. The
first measurements of geo-neutrinos have been made, and they can make a
fundamental contribution to understanding the geophysics of the planet.
Comment: 18 pages, 36 figures, proceedings of XXV Lepton Photon, 22 to 27 August 2011, published on 2012-10-0
Parallel and Distributed Simulation from Many Cores to the Public Cloud (Extended Version)
In this tutorial paper, we first review some basic simulation concepts and
then introduce parallel and distributed simulation techniques in view of some
new challenges of today and tomorrow. In particular, in recent years many-core
architectures have become widespread, and we can expect this trend to continue.
On the other hand, the success of cloud computing is strongly promoting the
"everything as a service" paradigm. Is parallel and distributed simulation
ready for these new challenges? The current approaches present many limitations
in terms of usability and adaptivity: there is a strong need for new evaluation
metrics and for revising the currently implemented mechanisms. In the last part
of the paper, we propose a new approach based on multi-agent systems for the
simulation of complex systems. It is possible to implement advanced techniques,
such as the migration of simulated entities, in order to build mechanisms that
are both adaptive and very easy to use. Adaptive mechanisms can significantly
reduce the communication cost in parallel/distributed architectures, implement
load-balancing techniques, and cope with execution environments that are both
variable and dynamic. Finally, such mechanisms can be used to build simulations
on top of unreliable cloud services.
Comment: Tutorial paper published in the Proceedings of the International Conference on High Performance Computing and Simulation (HPCS 2011). Istanbul (Turkey), IEEE, July 2011. ISBN 978-1-61284-382-
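The entity-migration idea in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the function name and the naive "move one entity at a time" policy are illustrative assumptions, showing only the basic adaptive step of rebalancing simulated entities across logical processes (LPs).

```python
def migrate_for_balance(partitions, threshold=2):
    """Toy adaptive load-balancing step for an agent-based parallel simulation.

    `partitions` maps an LP id to the list of entity ids it hosts. While the
    gap between the most- and least-loaded LP exceeds `threshold`, one entity
    is migrated from the heaviest LP to the lightest one. Returns the (mutated)
    partition map once the load is balanced within the threshold.
    """
    while True:
        heavy = max(partitions, key=lambda lp: len(partitions[lp]))
        light = min(partitions, key=lambda lp: len(partitions[lp]))
        if len(partitions[heavy]) - len(partitions[light]) <= threshold:
            return partitions
        # Migrate one simulated entity from the overloaded LP to the idle one.
        partitions[light].append(partitions[heavy].pop())

# Example: ten entities all start on one LP; balancing spreads them out.
parts = {"lp0": list(range(10)), "lp1": []}
migrate_for_balance(parts)
```

A real mechanism would also weigh communication locality (keeping chatty entities co-located), which is what lets migration reduce communication cost and not just even out CPU load.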
A Study of Perennial Philosophy and Psychedelic Experience, with a Proposal to Revise W. T. Stace’s Core Characteristics of Mystical Experience
©Ed D’Angelo 2018
Abstract
According to the prevailing paradigm in psychedelic research today, when used within an appropriate set and setting, psychedelics can reliably produce an authentic mystical experience. According to the prevailing paradigm, an authentic mystical experience is one that possesses the common or universal characteristics of mystical experience as identified by the philosopher W. T. Stace in his 1960 work Mysticism and Philosophy. Stace’s common characteristics of mystical experience are the basis for the Hood Mysticism Questionnaire, which is the most widely used quantitative measure of mystical experience in experimental studies of psychedelic experience. In this paper, I trace the historical roots of Stace’s common characteristics of mystical experience back to Christian Neoplatonism and apophatic theology, and I trace those, in turn, back to Plato’s concept of the Good and to Aristotle’s concept of God as active intellect. I argue that Stace’s common characteristics of mystical experience are not universal or culturally invariant but are the product of a specifically Christian religious and moral tradition that has its roots in ancient Greek metaphysics. My paper concludes with a revised list of common characteristics of psychedelic experience that is a better candidate for a list of invariant structures of psychedelic experience than Stace’s common characteristics of Christian mystical experience.
Highly intensive data dissemination in complex networks
This paper presents a study of data dissemination in unstructured
Peer-to-Peer (P2P) network overlays. The absence of structure in unstructured
overlays eases network management, at the cost of non-optimal mechanisms for
spreading messages in the network. Thus, dissemination schemes must be employed
that cover a large portion of the network with high probability (e.g.,
gossip-based approaches). We identify the principal metrics, provide a
theoretical model, and perform an assessment using a high-performance simulator
based on a parallel and distributed architecture. A main point of this study is
that our simulation model considers implementation details, such as the use of
caching and Time To Live (TTL) in message dissemination, that are usually
neglected in simulations due to the additional overhead they cause. The
outcomes confirm that these technical details have an important influence on
the performance of dissemination schemes, and that the studied schemes are
quite effective at spreading information in P2P overlay networks, whatever
their topology. Moreover, the practical use of such dissemination mechanisms
requires fine tuning of many parameters, a choice between different network
topologies, and the assessment of behaviors such as free riding. All this can
be done only with efficient simulation tools that support both the network
design phase and, in some cases, runtime decisions.
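The two implementation details the abstract singles out, caching and TTL, can be sketched in a few lines. This is a generic push-gossip sketch, not the paper's simulator: the function name, fanout parameter, and fixed seed are illustrative assumptions.

```python
import random

def gossip_push(graph, source, ttl=4, fanout=3, rng=None):
    """Push-style gossip dissemination over an unstructured overlay.

    `graph` maps a node to its list of neighbours. Each node that receives
    the message forwards it to at most `fanout` random neighbours; a "seen"
    cache suppresses re-forwarding of duplicates, and a TTL bounds the
    propagation depth. Returns the set of nodes the message reached.
    """
    rng = rng or random.Random(42)   # fixed seed for a reproducible run
    seen = set()                     # cache: nodes that already saw the message
    frontier = [(source, ttl)]
    while frontier:
        node, t = frontier.pop()
        if node in seen:
            continue                 # cache hit: drop the duplicate copy
        seen.add(node)
        if t == 0:
            continue                 # TTL expired: deliver but stop forwarding
        peers = graph[node]
        for p in rng.sample(peers, min(fanout, len(peers))):
            frontier.append((p, t - 1))
    return seen

# Example: a ring of 8 peers; fanout 2 and TTL 8 cover the whole overlay,
# while TTL 1 reaches only the source's direct neighbourhood.
ring = {i: [(i - 1) % 8, (i + 1) % 8] for i in range(8)}
reached = gossip_push(ring, source=0, ttl=8, fanout=2)
```

Varying `ttl` and `fanout` against the overlay topology is exactly the kind of parameter tuning the abstract says such schemes require.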
Shape-based defect classification for Non Destructive Testing
The aim of this work is to classify aerospace structure defects detected by
eddy-current non-destructive testing. The proposed method is based on the
assumption that the defect is bound to the reaction of the probe-coil impedance
during the test. Impedance-plane analysis is used to extract a feature vector
from the shape of the coil impedance in the complex plane, through the use of
some geometric parameters. Shape recognition is tested with three different
machine-learning classifiers: decision trees, neural networks and Naive Bayes.
The performance of the proposed detection system is measured in terms of
accuracy, sensitivity, specificity, precision and Matthews correlation
coefficient. Several experiments are performed on a dataset of eddy-current
signal samples for aircraft structures. The obtained results demonstrate the
usefulness of our approach and its competitiveness against existing descriptors.
Comment: 5 pages, IEEE International Workshop
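The abstract does not specify which geometric parameters make up the feature vector, so the three below (peak excursion, its phase, and the signed area of the impedance loop) are illustrative assumptions, a minimal sketch of extracting shape features from an impedance trajectory in the complex plane.

```python
import cmath
import math

def impedance_shape_features(z):
    """Geometric shape features of a probe-impedance trajectory.

    `z` is a sequence of complex impedance samples recorded during a scan.
    Returns (peak_amplitude, peak_phase, signed_area): the largest excursion
    from the origin, its phase angle, and the signed area enclosed by the
    trajectory (shoelace formula on the Re/Im polygon). A classifier such as
    a decision tree or Naive Bayes would consume vectors like this one.
    """
    peak = max(z, key=abs)
    n = len(z)
    # Shoelace formula: positive for a counter-clockwise loop.
    area = 0.5 * sum(
        z[i].real * z[(i + 1) % n].imag - z[(i + 1) % n].real * z[i].imag
        for i in range(n)
    )
    return abs(peak), cmath.phase(peak), area

# Example: a synthetic elliptical impedance loop (semi-axes 2 and 1).
loop = [complex(2 * math.cos(t), math.sin(t))
        for t in (2 * math.pi * k / 100 for k in range(100))]
amp, phase, area = impedance_shape_features(loop)
```

The appeal of such descriptors is that they are low-dimensional and invariant to where the loop sits along the scan, which suits the small, labelled datasets typical of eddy-current testing.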
Parallel Sort-Based Matching for Data Distribution Management on Shared-Memory Multiprocessors
In this paper we consider the problem of identifying intersections between
two sets of d-dimensional axis-parallel rectangles. This is a common problem
that arises in many agent-based simulation studies, and is of central
importance in the context of High Level Architecture (HLA), where it is at the
core of the Data Distribution Management (DDM) service. Several realizations of
the DDM service have been proposed; however, many of them are either
inefficient or inherently sequential. These are serious limitations since
multicore processors are now ubiquitous, and DDM algorithms -- being
CPU-intensive -- could benefit from additional computing power. We propose a
parallel version of the Sort-Based Matching algorithm for shared-memory
multiprocessors. Sort-Based Matching is one of the most efficient serial
algorithms for the DDM problem, but is quite difficult to parallelize due to
data dependencies. We describe the algorithm and compute its asymptotic running
time; we complete the analysis by assessing its performance and scalability
through extensive experiments on two commodity multicore systems based on a
dual-socket Intel Xeon processor and a single-socket Intel Core i7 processor.
Comment: Proceedings of the 21st ACM/IEEE International Symposium on Distributed Simulation and Real Time Applications (DS-RT 2017). Best Paper Award @ DS-RT 2017
LUNES: Agent-based Simulation of P2P Systems (Extended Version)
We present LUNES, an agent-based Large Unstructured NEtwork Simulator that
makes it possible to simulate complex networks composed of a large number of
nodes. LUNES is modular, since it separates the three phases of network
topology creation, protocol simulation and performance evaluation. This makes
it easy to integrate external software tools into the main software
architecture. The simulation of the interaction protocols among network nodes
is performed via a simulation middleware that supports both the sequential and
the parallel/distributed simulation approaches. In the latter case, a specific
mechanism for communication-overhead reduction is used; this guarantees high
levels of performance and scalability. To demonstrate the efficiency of LUNES,
we test the simulator with gossip protocols executed on top of networks
(representing peer-to-peer overlays) generated with different topologies. The
results demonstrate the effectiveness of the proposed approach.
Comment: Proceedings of the International Workshop on Modeling and Simulation of Peer-to-Peer Architectures and Systems (MOSPAS 2011). As part of the 2011 International Conference on High Performance Computing and Simulation (HPCS 2011).
