Is attending a mental process?
The nature of attention has been the topic of a lively research programme in psychology for over a century. But there is widespread agreement that none of the theories on offer manages to fully capture the nature of attention. Recently, philosophers have become interested in the debate again after a prolonged period of neglect. This paper contributes to the project of explaining the nature of attention. It starts off by critically examining Christopher Mole’s prominent “adverbial” account of attention, which traces the failure of extant psychological theories to their assumption that attending is a kind of process. It then defends an alternative, process-based view of the metaphysics of attention, on which attention is understood as an activity and not, as psychologists seem to implicitly assume, an accomplishment. The entrenched distinction between accomplishments and activities is shown to shed new light on the metaphysics of attention. It also provides a novel diagnosis of the empirical state of play.
Compressed sensing imaging techniques for radio interferometry
Radio interferometry probes astrophysical signals through incomplete and
noisy Fourier measurements. The theory of compressed sensing demonstrates that
such measurements may actually suffice for accurate reconstruction of sparse or
compressible signals. We propose new generic imaging techniques based on convex
optimization for global minimization problems defined in this context. The
versatility of the framework notably allows introduction of specific prior
information on the signals, which offers the possibility of significant
improvements of reconstruction relative to the standard local matching pursuit
algorithm CLEAN used in radio astronomy. We illustrate the potential of the
approach by studying reconstruction performances on simulations of two
different kinds of signals observed with very generic interferometric
configurations. The first kind is an intensity field of compact astrophysical
objects. The second kind is the imprint of cosmic strings in the temperature
field of the cosmic microwave background radiation, of particular interest for
cosmology.
Comment: 10 pages, 1 figure. Version 2 matches the version accepted for publication in MNRAS. Changes include: writing corrections, clarifications of arguments, a figure update, and a new subsection 4.1 commenting on the exact compliance of radio interferometric measurements with compressed sensing.
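The convex-optimization reconstruction described above can be made concrete with a small, self-contained sketch. The code below is not the paper's implementation (which targets interferometric imaging and the comparison with CLEAN); it solves a toy 1-D problem, recovering a sparse signal from an incomplete set of noisy Fourier coefficients by l1-regularized least squares. The signal size, sparsity, sampling mask, noise level and regularization weight are all assumptions chosen for illustration.

```python
import numpy as np

# Toy sketch (not the paper's code): recover a sparse signal from an
# incomplete set of noisy Fourier coefficients by l1-regularized least
# squares, solved with iterative soft thresholding (ISTA).  Signal length,
# sparsity, sampling mask, noise level and the weight `lam` are all
# illustrative assumptions.
rng = np.random.default_rng(0)
n, k, m = 256, 8, 64                    # signal length, nonzeros, measurements

x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.choice([-1.0, 1.0], size=k)

mask = np.sort(rng.choice(n, m, replace=False))   # incomplete Fourier sampling

def A(x):                      # measurement operator: rows of the unitary DFT
    return np.fft.fft(x)[mask] / np.sqrt(n)

def At(y):                     # adjoint of A
    z = np.zeros(n, dtype=complex)
    z[mask] = y
    return np.fft.ifft(z) * np.sqrt(n)

y = A(x_true) + 0.01 * (rng.normal(size=m) + 1j * rng.normal(size=m))

lam, x = 0.02, np.zeros(n)
for _ in range(300):           # ISTA: gradient step, then soft threshold
    g = x + np.real(At(y - A(x)))        # unit step is safe: ||A^H A|| <= 1
    x = np.sign(g) * np.maximum(np.abs(g) - lam, 0.0)

print("recovered support:", np.flatnonzero(np.abs(x) > 0.1))
print("true support:     ", np.flatnonzero(x_true))
```

The interferometric problem in the abstract has the same shape: a linear operator sampling the Fourier plane, its adjoint, and a sparsity or compressibility prior on the signal, with a convex solver taking the place of CLEAN's local matching pursuit.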
The 74MHz System on the Very Large Array
The Naval Research Laboratory and the National Radio Astronomy Observatory
completed implementation of a low frequency capability on the VLA at 73.8 MHz
in 1998. This frequency band offers unprecedented sensitivity (~25 mJy/beam)
and resolution (~25 arcsec) for low-frequency observations. We review the
hardware, the calibration and imaging strategies, comparing them to those at
higher frequencies, including aspects of interference excision and wide-field
imaging. Ionospheric phase fluctuations pose the major difficulty in
calibrating the array. Over restricted fields of view or at times of extremely
quiescent ionospheric "weather", an angle-invariant calibration strategy can
be used. In this approach a single phase correction is devised for each
antenna, typically via self-calibration. Over larger fields of view or at times
of more normal ionospheric "weather" when the ionospheric isoplanatic patch
size is smaller than the field of view, we adopt a field-based strategy in
which the phase correction depends upon location within the field of view. This
second calibration strategy was implemented by modeling the ionosphere above
the array using Zernike polynomials. Images of 3C sources of moderate strength
are provided as examples of routine, angle-invariant calibration and imaging.
Flux density measurements indicate that the 74 MHz flux scale at the VLA is
stable to a few percent, and tied to the Baars et al. value of Cygnus A at the
5 percent level. We also present an example of a wide-field image, devoid of
bright objects and containing hundreds of weaker sources, constructed from the
field-based calibration. We close with a summary of lessons the 74 MHz system
offers as a model for new and developing low-frequency telescopes. (Abridged)
Comment: 73 pages, 46 JPEG figures, to appear in ApJ
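The two calibration strategies contrasted above lend themselves to a compact structural illustration. The sketch below is not the actual VLA 74 MHz pipeline: the antenna layout, screen height, polynomial terms and coefficients are invented, and the real system fits Zernike polynomials to measured ionospheric data. It only shows the difference between a single self-calibration phase per antenna and a direction-dependent phase drawn from a low-order screen above the array.

```python
import numpy as np

# Toy sketch of the two calibration strategies described above.  The antenna
# layout, screen height, polynomial terms and coefficients are invented for
# illustration; the actual 74 MHz system fits Zernike polynomials to measured
# ionospheric phase data, which is not reproduced here.
rng = np.random.default_rng(1)
n_ant = 27
ant_xy = rng.uniform(-15.0, 15.0, size=(n_ant, 2))   # antenna positions [km]

# Angle-invariant strategy: a single self-calibration phase per antenna.
selfcal_phase = rng.normal(scale=0.3, size=n_ant)    # [rad]

# Field-based strategy: a low-order phase screen above the array; the
# correction for an antenna depends on where its line of sight to a given
# sky direction (l, m) pierces the screen.
coeffs = [0.0, 0.8, -0.5, 0.3, 0.1]                  # fitted screen terms (assumed)

def screen(x, y):
    basis = [1.0, x, y, x * y, x**2 - y**2]          # low-order polynomial terms
    return sum(c * b for c, b in zip(coeffs, basis))

def ionospheric_phase(ant, l, m, height_km=300.0, scale_km=50.0):
    px = (ant_xy[ant, 0] + height_km * l) / scale_km # normalized pierce point
    py = (ant_xy[ant, 1] + height_km * m) / scale_km
    return screen(px, py)

def correct_visibility(vis, p, q, l=0.0, m=0.0, field_based=False):
    """Remove the modelled per-antenna phases from a (p, q) visibility."""
    if field_based:
        dphi = ionospheric_phase(p, l, m) - ionospheric_phase(q, l, m)
    else:
        dphi = selfcal_phase[p] - selfcal_phase[q]
    return vis * np.exp(-1j * dphi)

print(correct_visibility(1.0 + 0.0j, 0, 5, l=0.02, m=-0.01, field_based=True))
```

In the field-based scheme the correction varies across the field of view, which is what allows it to cope with an isoplanatic patch smaller than the image.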
A Brief History of AGN
Astronomers knew early in the twentieth century that some galaxies have
emission-line nuclei. However, even the systematic study by Seyfert (1943) was
not enough to launch active galactic nuclei (AGN) as a major topic of
astronomy. The advances in radio astronomy in the 1950s revealed a new universe
of energetic phenomena, and inevitably led to the discovery of quasars. These
discoveries demanded the attention of observers and theorists, and AGN have
been a subject of intense effort ever since. Only a year after the recognition
of the redshifts of 3C 273 and 3C 48 in 1963, the idea of energy production by
accretion onto a black hole was advanced. However, acceptance of this idea came
slowly, encouraged by the discovery of black hole X-ray sources in our Galaxy
and, more recently, supermassive black holes in the center of the Milky Way and
other galaxies. Many questions remain as to the formation and fueling of the
hole, the geometry of the central regions, the detailed emission mechanisms,
the production of jets, and other aspects. The study of AGN will remain a
vigorous part of astronomy for the foreseeable future.
Comment: 37 pages, no figures. Uses aaspp4.sty. To be published in Publications of the Astronomical Society of the Pacific, 1999 Jun
Judging the impact of leadership-development activities on school practice
The nature and effectiveness of professional-development activities should be judged in a way that takes account of
both the achievement of intended outcomes and the unintended consequences that may result. Our research project set out to create a robust approach that school staff members could use to assess the impact of
professional-development programs on leadership and management practice without being constrained in this judgment by the stated aims of the program. In the process,
we identified a number of factors and requirements relevant to a wider audience than that concerned with the development of leadership and management in England.
Such an assessment has to rest upon a clear understanding of educational leadership, a clearly articulated model of practice, and a clear model of potential forms of impact.
Such foundations, suitably adapted to the subject being addressed, are appropriate for assessing all teacher professional development.
Managing digital coordination of design: emerging hybrid practices in an institutionalized project setting
What happens when digital coordination practices are introduced into the institutionalized setting of an engineering project? This question is addressed through an interpretive study that examines how a shared digital model becomes used in the late design stages of a major station refurbishment project. The paper contributes by mobilizing the idea of ‘hybrid practices’ to understand the diverse patterns of activity that emerge to manage digital coordination of design. It articulates how engineering and architecture professions develop different relationships with the shared model; the design team negotiates paper-based practices across organizational boundaries; and diverse practitioners probe the potential and limitations of the digital infrastructure. While different software packages and tools have become linked together into an integrated digital infrastructure, these emerging hybrid practices contrast with the interactions anticipated in practice and policy guidance, presenting new opportunities and challenges for managing project delivery. The study has implications for researchers working in the growing field of empirical work on engineering project organizations as it shows the importance of considering, and suggests new ways to theorise, the introduction of digital coordination practices into these institutionalized settings.
Models and metaphors: complexity theory and through-life management in the built environment
Complexity thinking may have both modelling and metaphorical applications in the through-life management of the built environment. These two distinct approaches are examined and compared. In the first instance, some of the sources of complexity in the design, construction and maintenance of the built environment are identified. The metaphorical use of complexity in management thinking and its application in the built environment are briefly examined. This is followed by an exploration of modelling techniques relevant to built environment concerns. Non-linear and complex mathematical techniques, such as fuzzy logic, cellular automata and attractors, may be applicable to their analysis. Existing software tools are identified and examples of successful built environment applications of complexity modelling are given. Some issues that arise include the definition of phenomena in a mathematically usable way, the functionality of available software and the possibility of going beyond representational modelling. Further questions arising from the application of complexity thinking are discussed, including the possibilities for confusion that arise from the use of metaphor. The metaphor of a 'commentary machine' is proposed as a possible way forward, and it is suggested that an appropriate linguistic analysis can in certain situations reduce perceived complexity.
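As a generic illustration of one of the techniques named above, and not an example taken from the paper, the sketch below runs an elementary cellular automaton: a minimal model in which a simple, local update rule produces complex global patterns, which is the kind of behaviour the modelling strand of complexity thinking tries to capture. The rule number, grid width and number of steps are arbitrary choices.

```python
import numpy as np

# Generic illustration, not an example from the paper: an elementary cellular
# automaton, one of the techniques named above, in which a simple local update
# rule produces complex global patterns.  Rule number, width and number of
# steps are arbitrary choices.
def step(cells, rule=110):
    table = [(rule >> i) & 1 for i in range(8)]      # Wolfram rule lookup table
    left, right = np.roll(cells, 1), np.roll(cells, -1)
    idx = 4 * left + 2 * cells + right               # neighbourhood as a 3-bit code
    return np.array([table[i] for i in idx])

cells = np.zeros(64, dtype=int)
cells[32] = 1                                        # single seed cell
for _ in range(16):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```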
Beyond persons: extending the personal / subpersonal distinction to non-rational animals and artificial agents
The distinction between personal-level explanations and subpersonal ones has been subject to much debate in philosophy. We understand it as one between explanations that focus on an agent’s interaction with its environment, and explanations that focus on the physical or computational enabling conditions of such an interaction. The distinction, understood this way, is necessary for a complete account of any agent, rational or not, biological or artificial. In particular, we review some recent research in Artificial Life that purports to do without the distinction altogether, while using agent-centered concepts throughout. It is argued that the rejection of agent-level explanations in favour of mechanistic ones stems from an unmotivated need to choose between representationalism and eliminativism. The dilemma is a false one if the possibility of a radical form of externalism is considered.
The WEBT Campaign on the Blazar 3C279 in 2006
The quasar 3C279 was the target of an extensive multiwavelength monitoring
campaign from January through April 2006, including an optical-IR-radio
monitoring campaign by the Whole Earth Blazar Telescope (WEBT) collaboration.
In this paper we focus on the results of the WEBT campaign. The source
exhibited substantial variability of optical flux and spectral shape, with a
characteristic time scale of a few days. The variability patterns throughout
the optical BVRI bands were very closely correlated with each other. In
intriguing contrast to other (in particular, BL Lac type) blazars, we find a
lag of shorter- behind longer-wavelength variability throughout the RVB ranges,
with a time delay increasing with increasing frequency. Spectral hardening
during flares appears delayed with respect to a rising optical flux. This, in
combination with the very steep IR-optical continuum spectral index of ~ 1.5 -
2.0, may indicate a highly oblique magnetic field configuration near the base
of the jet. An alternative explanation through a slow (time scale of several
days) acceleration mechanism would require an unusually low magnetic field of <
0.2 G, about an order of magnitude lower than inferred from previous analyses
of simultaneous SEDs of 3C279 and other FSRQs with similar properties.
Comment: Accepted for publication in Ap
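The inter-band lag reported above, with shorter wavelengths trailing longer ones, can in the simplest case be estimated by cross-correlating two light curves over a grid of trial lags. The sketch below uses an invented, evenly sampled pair of light curves with a built-in 1.5-day lag; real WEBT light curves are unevenly sampled and are normally analysed with a discrete correlation function, so this is a structural illustration only.

```python
import numpy as np

# Toy sketch of estimating an inter-band lag by cross-correlation on a grid
# of trial lags.  The light curves, sampling, noise and the 1.5-day lag are
# invented for illustration; real WEBT light curves are unevenly sampled and
# are normally analysed with a discrete correlation function.
rng = np.random.default_rng(2)
t = np.arange(0.0, 60.0, 0.5)                       # time [days]
flare = np.exp(-0.5 * ((t - 30.0) / 4.0) ** 2)      # smooth flare template

true_lag = 1.5                                       # B lags R by 1.5 days (assumed)
flux_R = flare + 0.02 * rng.normal(size=t.size)
flux_B = np.interp(t - true_lag, t, flare) + 0.02 * rng.normal(size=t.size)

def xcorr_lag(a, b, t, max_lag=10.0):
    """Trial lag (same units as t) that maximizes the correlation of b,
    shifted back by the lag, with a; a positive value means b lags a."""
    dt = t[1] - t[0]
    lags = np.arange(-max_lag, max_lag + dt, dt)
    a0, b0 = a - a.mean(), b - b.mean()
    cc = [np.corrcoef(a0, np.interp(t + lag, t, b0))[0, 1] for lag in lags]
    return lags[int(np.argmax(cc))]

print("recovered lag [days]:", xcorr_lag(flux_R, flux_B, t))
```

With real data, the uneven sampling and measurement errors make the lag estimate and its uncertainty considerably more delicate than this grid search suggests.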
Disk-Jet Connection in the Radio Galaxy 3C 120
We present the results of extensive multi-frequency monitoring of the radio
galaxy 3C 120 between 2002 and 2007 at X-ray, optical, and radio wave bands, as
well as imaging with the Very Long Baseline Array (VLBA). Over the 5 yr of
observation, significant dips in the X-ray light curve are followed by
ejections of bright superluminal knots in the VLBA images. Consistent with
this, the X-ray flux and 37 GHz flux are anti-correlated, with the X-rays leading the radio variations. This implies that, in this radio galaxy, the radiative state of the accretion disk plus corona system, where the X-rays are produced, has a
direct effect on the events in the jet, where the radio emission originates.
The X-ray power spectral density of 3C 120 shows a break, with a steeper slope at shorter timescales, and the break timescale is commensurate with the mass of the
central black hole based on observations of Seyfert galaxies and black hole
X-ray binaries. These findings provide support for the paradigm that black hole
X-ray binaries and active galactic nuclei are fundamentally similar systems,
with characteristic time and size scales linearly proportional to the mass of
the central black hole. The X-ray and optical variations are strongly
correlated in 3C 120, which implies that the optical emission in this object
arises from the same general region as the X-rays, i.e., in the accretion
disk-corona system. We numerically model multi-wavelength light curves of 3C
120 from such a system with the optical-UV emission produced in the disk and
the X-rays generated by scattering of thermal photons by hot electrons in the
corona. From the comparison of the temporal properties of the model light curves to those of the observed variability, we constrain the physical size of
the corona and the distances of the emitting regions from the central BH.
Comment: Accepted for publication in the Astrophysical Journal. 28 pages, 21 figures, 2 tables
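A break in the X-ray power spectral density, as described above, is typically characterized by fitting a broken power law to a periodogram. The sketch below fits such a model to a mock periodogram; the frequency grid, slopes and break frequency used to generate the mock data are assumptions for illustration, not the values measured for 3C 120, and the chi-squared scatter stands in for the distribution of real periodogram estimates.

```python
import numpy as np
from scipy.optimize import curve_fit

# Toy sketch of characterizing a PSD break: fit a broken power law to a mock
# periodogram in log space.  The frequency grid, slopes and break frequency
# used to build the mock data are assumptions for illustration, not the
# values measured for 3C 120.
rng = np.random.default_rng(3)
freq = np.logspace(-3, 0, 200)                       # [1/day]

def broken_power_law(f, norm, f_break, slope_lo, slope_hi):
    """Flatter slope below the break, steeper above (shorter timescales)."""
    return norm * (f / f_break) ** np.where(f < f_break, -slope_lo, -slope_hi)

true_pars = (1.0, 0.05, 1.0, 2.5)                    # norm, break, low/high slopes
mock_psd = broken_power_law(freq, *true_pars) * rng.chisquare(2, size=freq.size) / 2.0

popt, _ = curve_fit(
    lambda f, n, fb, a, b: np.log(broken_power_law(f, n, fb, a, b)),
    freq, np.log(mock_psd), p0=[1.0, 0.1, 1.0, 2.0],
    bounds=([1e-3, 1e-3, 0.0, 0.0], [1e2, 1.0, 5.0, 5.0]),
)
print("fitted break frequency [1/day]:", popt[1],
      "-> break timescale [days]:", 1.0 / popt[1])
```

The break timescale recovered this way is the quantity the abstract compares against the black-hole-mass scaling seen in Seyfert galaxies and black hole X-ray binaries.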
