Mass of the black hole in the Seyfert 1.5 galaxy H 0507+164 from reverberation mapping
We present the results of our optical monitoring campaign of the X-ray source
H 0507+164, a low-luminosity Seyfert 1.5 galaxy at a redshift z = 0.018.
Spectroscopic observations were carried out during 22 nights in 2007, from 21
November to 26 December. Photometric observations in the R band
for 13 nights were also obtained during the same period. The continuum and
broad line fluxes of the galaxy were found to vary during our monitoring
period. The R-band differential light curve with respect to a companion star
also shows similar variability. Using cross-correlation analysis, we estimated
a time delay of 3.01 days (in the rest frame) in the response of the broad
H-beta line fluxes to variations in the optical continuum at 5100 angstroms.
Using this time delay and the width of the H-beta line, we estimated a Broad
Line Region (BLR) radius of 2.53 x 10^{-3} parsec and a black hole mass of
9.62 x 10^{6} solar masses. Comment: 7 pages, 8 figures, Accepted for
publication in MNRAS
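A minimal sketch of the standard reverberation-mapping virial estimator, M_BH = f c tau (Delta V)^2 / G, that underlies measurements of this kind. Only the 3.01-day rest-frame lag comes from the abstract; the H-beta line width and the virial factor f below are illustrative assumptions, not values quoted in the paper.

    # Sketch of the virial black hole mass estimate from a reverberation lag.
    # The 3.01-day rest-frame lag is from the abstract; the H-beta FWHM and
    # the virial factor f are illustrative assumptions.

    G = 4.301e-3                             # G in pc (km/s)^2 / M_sun
    DAY_TO_PC = 2.998e5 * 86400 / 3.086e13   # one light-day in parsec

    tau_days = 3.01      # rest-frame H-beta lag [days]
    fwhm_kms = 3000.0    # assumed H-beta FWHM [km/s]
    f_virial = 1.0       # assumed virial (geometry) factor

    r_blr_pc = tau_days * DAY_TO_PC                  # R_BLR = c * tau ~ 2.5e-3 pc
    m_bh = f_virial * r_blr_pc * fwhm_kms ** 2 / G   # virial mass in M_sun

    print(f"R_BLR ~ {r_blr_pc:.2e} pc, M_BH ~ {m_bh:.2e} M_sun")

With the line width measured from the spectrum and an appropriate virial factor, this estimator yields masses of the order quoted above.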
Socio-economic Impact Assessment of Livelihood Security in Agriculture, Animal Husbandry and Aquaculture on the Tsunami-hit Lands of Andaman
The Indian subcontinent is highly vulnerable to major natural disasters such as earthquakes, cyclones, floods, droughts, landslides and bushfires. The tsunami, a recent addition to this list, struck in the early morning of 26 December 2004, following a massive earthquake of magnitude 9.2 on the Richter scale in the Andaman & Nicobar Islands, resulting in the submergence of large areas of farmland and the subsequent drying up of water bodies. It caused moisture stress for the standing crops, livestock and fisheries and affected the livelihood of the people to a large extent. In this context, the present study has been carried out to assess the socio-economic impact on livelihood security in agriculture, animal husbandry and aquaculture on the tsunami-hit lands of Andaman. Data have been collected from 150 sample respondents and the survey has been conducted for two periods, pre-tsunami and post-tsunami. The results have indicated that the tsunami ravaged households, standing crops, farm inputs such as seed, feed and implements, livestock and poultry populations and their sheds, fish ponds, etc., thereby affecting the basic livelihood security of the people in Andaman. The rehabilitation measures taken by the government and NGOs have improved livelihoods by reviving agriculture considerably in the subsequent years and by creating employment opportunities in various farm and non-farm activities. The paper has suggested creating profitable livelihood security for vulnerable sections of society in the existing socio-economic penury through holistic intervention of the community, government and NGOs.
Agricultural and Food Policy
Compact steep-spectrum sources from the S4 sample
We present the results of 5-GHz observations with the VLA A-array of a sample
of candidate Compact Steep Spectrum sources (CSSs) selected from the S4 survey.
We also estimate the symmetry parameters of high-luminosity CSSs selected from
different samples of radio sources, and compare these with the larger sources
of similar luminosity to understand their evolution and the consistency of the
CSSs with the unified scheme for radio galaxies and quasars. The majority of
CSSs are likely to be young sources advancing outwards through a dense
asymmetric environment. The radio properties of CSSs are found to be consistent
with the unified scheme, in which the axes of the quasars are observed close to
the line of sight, while radio galaxies are observed close to the plane of the
sky. Comment: accepted for publication in MNRAS; 8 pages, figure 1 with 21
images, and two additional figures; 2 tables
Mass of the black hole in the Seyfert 1.5 galaxy H 0507+164 from reverberation mapping
We present the results of our optical monitoring campaign of the X-ray source H 0507+164, a low-luminosity Seyfert 1.5 galaxy at a redshift z = 0.018. Spectroscopic observations were carried out during 22 nights in 2007, from 2007 November 21 to 2007 December 26. Photometric observations in the R band for 13 nights were also obtained during the same period. The continuum and broad-line fluxes of the galaxy were found to vary during our monitoring period. The R-band differential light curve with respect to a companion star also shows similar variability. Using cross-correlation analysis, we estimated a time delay of τ_cen = 3.01^{+0.42}_{-1.84} d (in the rest frame) in the response of the broad Hβ line fluxes to variations in the optical continuum at 5100 Å. Using this time delay and the width of the Hβ line, we estimated a broad-line region radius of 2.53^{+0.35}_{-1.55} x 10^{-3} pc and a black hole mass of 9.62^{+0.33}_{-3.73} x 10^{6} M_⊙.
Long-Term Optical Flux and Colour Variability in Quasars
We have used optical V- and R-band observations from the Massive Compact Halo Object (MACHO) project on a sample of 59 quasars behind the Magellanic Clouds to study their long-term optical flux and colour variations. These quasars, lying in the redshift range 0.2 < z < 2.8 and having apparent V-band magnitudes between 16.6 and 20.1 mag, have observations ranging from 49 to 1353 epochs spanning over 7.5 yr, with sampling intervals between 2 and 10 days. All the quasars show variability during the observing period. The normalised excess variances (F_var) in the V and R bands are in the ranges 0.2% < F^V_var < 1.6% and 0.1% < F^R_var < 1.5%, respectively. In a large fraction of the sources, F_var is larger in the V band than in the R band. From the z-transformed discrete cross-correlation function analysis, we find that there is no lag between the V- and R-band variations. Adopting the Markov Chain Monte Carlo (MCMC) approach, and properly taking into account the correlation between the errors in colours and magnitudes, we find that the majority of sources show a bluer-when-brighter trend, while a minor fraction of quasars show the opposite behaviour. This is similar to the results obtained from two other independent algorithms, namely the weighted linear least-squares fit (FITEXY) and the bivariate correlated errors and intrinsic scatter regression (BCES). However, the ordinary least-squares (OLS) fit, normally used in colour variability studies of quasars, indicates that all the quasars studied here show a bluer-when-brighter trend. It is therefore clear that the OLS algorithm cannot be used for the study of colour variability in quasars.
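The normalised excess variance quoted above is the standard fractional rms variability amplitude. A minimal sketch of that definition follows; the light curve is synthetic data invented for illustration, not MACHO photometry.

    import numpy as np

    # Normalised excess variance (fractional rms variability amplitude):
    # F_var = sqrt((S^2 - <sigma_err^2>) / <flux>^2), the standard definition.
    # The light curve below is made-up illustrative data.

    def fractional_variability(flux, flux_err):
        mean = flux.mean()
        s2 = flux.var(ddof=1)               # sample variance of the light curve
        mean_err2 = np.mean(flux_err ** 2)  # mean squared measurement error
        excess = s2 - mean_err2             # variance in excess of the noise
        return np.sqrt(excess) / mean if excess > 0 else 0.0

    rng = np.random.default_rng(0)
    flux = 1.0 + 0.01 * rng.standard_normal(200)   # ~1% intrinsic scatter
    err = np.full_like(flux, 0.003)                # 0.3% photometric errors
    print(f"F_var ~ {100 * fractional_variability(flux, err):.2f}%")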
NUMFabric: Fast and Flexible Bandwidth Allocation in Datacenters
We present NUMFabric, a novel datacenter transport design that provides flexible and fast bandwidth allocation control. NUMFabric is flexible: it enables operators to specify how bandwidth is allocated amongst contending flows to optimize for different service-level objectives such as minimizing flow completion times, weighted allocations, different notions of fairness, etc. NUMFabric is also very fast: it converges to the specified allocation one to two orders of magnitude faster than prior schemes. Underlying NUMFabric is a novel distributed algorithm that uses in-network packet scheduling to rapidly solve general network utility maximization problems for bandwidth allocation. We evaluate NUMFabric using realistic datacenter topologies and highly dynamic workloads and show that it is able to provide flexibility and fast convergence in such stressful environments.
Google Faculty Research Award
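For reference, the network utility maximization (NUM) problems mentioned above have the flavor of the toy instance below: weighted proportional fairness subject to link capacities, solved here with textbook dual gradient ascent on link prices. This is an illustrative centralized solver, not the paper's in-network mechanism, and the two-link topology, weights and capacities are invented.

    import numpy as np

    # Toy NUM instance: maximize sum_i w_i * log(x_i) subject to link
    # capacities, solved by dual gradient ascent on per-link prices.
    # Topology, weights and capacities are made up for illustration.

    routes = np.array([[1, 0],      # flow 0 uses link 0
                       [1, 1],      # flow 1 uses links 0 and 1
                       [0, 1]])     # flow 2 uses link 1
    weights = np.array([1.0, 2.0, 1.0])
    capacity = np.array([10.0, 10.0])

    prices = np.full(2, 0.1)        # dual variables, one per link
    step = 0.01
    for _ in range(5000):
        path_price = routes @ prices                      # price seen by each flow
        rates = weights / np.maximum(path_price, 1e-9)    # x_i = w_i / path price
        load = routes.T @ rates                           # traffic offered per link
        prices = np.maximum(prices + step * (load - capacity), 1e-9)

    print("converged rates:", np.round(rates, 2))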
Fastpass: A Centralized “Zero-Queue” Datacenter Network
An ideal datacenter network should provide several properties, including low median and tail latency, high utilization (throughput), fair allocation of network resources between users or applications, deadline-aware scheduling, and congestion (loss) avoidance. Current datacenter networks inherit the principles that went into the design of the Internet, where packet transmission and path selection decisions are distributed among the endpoints and routers. Instead, we propose that each sender should delegate to a centralized arbiter control of when each packet should be transmitted and what path it should follow. This paper describes Fastpass, a datacenter network architecture built using this principle. Fastpass incorporates two fast algorithms: the first determines the time at which each packet should be transmitted, while the second determines the path to use for that packet. In addition, Fastpass uses an efficient protocol between the endpoints and the arbiter and an arbiter replication strategy for fault-tolerant failover. We deployed and evaluated Fastpass in a portion of Facebook's datacenter network. Our results show that Fastpass achieves high throughput comparable to current networks at a 240x reduction in queue lengths (4.35 Mbytes reduced to 18 Kbytes), achieves much fairer and more consistent flow throughputs than the baseline TCP (a 5200x reduction in the standard deviation of per-flow throughput with five concurrent connections), scales from 1 to 8 cores in the arbiter implementation with the ability to schedule 2.21 Terabits/s of traffic in software on eight cores, and yields a 2.5x reduction in the number of TCP retransmissions in a latency-sensitive service at Facebook.
National Science Foundation (U.S.) (grant IIS-1065219); Irwin Mark Jacobs and Joan Klein Jacobs Presidential Fellowship; Hertz Foundation (Fellowship)
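To make the arbiter's first task concrete, the sketch below treats timeslot allocation as repeated matching of senders to receivers, with each host sending and receiving at most one packet per timeslot. It uses a simple greedy maximal matching rather than the paper's pipelined algorithm, and the demand list (source, destination, packet count) is made up.

    from collections import deque

    # Centralized timeslot allocation in the spirit of Fastpass: per timeslot,
    # each host may send at most one packet and receive at most one packet, so
    # the arbiter computes a matching of sources to destinations.  Greedy
    # maximal matching, illustrative demands.

    demands = deque([("A", "C", 2), ("B", "C", 1), ("A", "D", 1), ("B", "D", 2)])

    timeslot = 0
    while demands:
        busy_src, busy_dst, leftovers = set(), set(), deque()
        schedule = []
        while demands:
            src, dst, n = demands.popleft()
            if src not in busy_src and dst not in busy_dst:
                schedule.append((src, dst))        # grant one packet this slot
                busy_src.add(src); busy_dst.add(dst)
                if n > 1:
                    leftovers.append((src, dst, n - 1))
            else:
                leftovers.append((src, dst, n))    # retry in a later slot
        demands = leftovers
        print(f"timeslot {timeslot}: {schedule}")
        timeslot += 1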
Three episodes of jet activity in the FRII radio galaxy B0925+420
We present Very Large Array images of a "Double-Double Radio Galaxy", a class
of objects in which two pairs of lobes are aligned either side of the nucleus.
In this object, B0925+420, we discover a third pair of lobes, close to the core
and again in alignment with the other lobes. This first-known "Triple-Double"
object strongly increases the likelihood that these lobes represent multiple
episodes of jet activity, as opposed to knots in an underlying jet. We model
the lobes in terms of their dynamical evolution. We find that the inner pair of
lobes is consistent with the outer pair having been displaced buoyantly by the
ambient medium. The middle pair of lobes is more problematic - to the extent
that an alternative model interpreting the middle and inner "lobes" as
additional bow shocks within the outer lobes may be more appropriate - and we
discuss the implications of this for our understanding of the density of the
ambient medium. Comment: Accepted for publication in MNRAS. Figure 2 is best
viewed in colour
CSSs in a sample of B2 radio sources of intermediate strength
We present radio observations of 19 candidate compact steep-spectrum (CSS)
objects selected from a well-defined, complete sample of 52 B2 radio sources of
intermediate strength. These observations were made with the VLA A-array at
4.835 GHz. The radio structures of the entire sample are summarised and the
brightness asymmetries within the compact sources are compared with those of
the more extended ones, as well as with those in the 3CRR sample and the CSSs
from the B3-VLA sample. About 25 per cent of the CSS sources exhibit large
brightness asymmetries, with a flux density ratio for the opposing lobes of
≥ 5, possibly due to interaction of the jets with infalling material. The
corresponding percentage for the larger-sized objects is only about 5 per cent.
We also investigate possible dependence of the flux density asymmetry of the
lobes on redshift, since this might be affected by more interactions and
mergers in the past. No such dependence is found. A few individual objects of
interest are discussed in the paper. Comment: 10 pages, 7 figures, 2 tables;
accepted for publication in Astronomy and Astrophysics
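The asymmetry statistic used above is simple bookkeeping; a minimal sketch with invented source names and lobe flux densities is shown below, counting a source as strongly asymmetric when the brighter-to-fainter lobe flux density ratio reaches 5.

    # Lobe flux-density asymmetry bookkeeping as described in the abstract.
    # Source names and lobe flux densities (in mJy) are hypothetical.

    sources = {
        "J0001+000": (120.0, 15.0),
        "J0002+000": (80.0, 60.0),
        "J0003+000": (300.0, 45.0),
    }

    RATIO_THRESHOLD = 5.0
    asymmetric = []
    for name, (lobe1, lobe2) in sources.items():
        ratio = max(lobe1, lobe2) / min(lobe1, lobe2)
        if ratio >= RATIO_THRESHOLD:
            asymmetric.append(name)
        print(f"{name}: flux density ratio = {ratio:.1f}")

    frac = len(asymmetric) / len(sources)
    print(f"strongly asymmetric fraction: {100 * frac:.0f}%")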
Algorithm Engineering in Robust Optimization
Robust optimization is a young and emerging field of research that has
received a considerable increase in interest over the last decade. In this
paper, we argue that the algorithm engineering methodology fits very well with
the field of robust optimization and yields a rewarding new perspective on both
the
current state of research and open research directions.
To this end we go through the algorithm engineering cycle of design and
analysis of concepts, development and implementation of algorithms, and
theoretical and experimental evaluation. We show that many ideas of algorithm
engineering have already been applied in publications on robust optimization.
Most work on robust optimization is devoted to the analysis of concepts and
the development of algorithms, some papers deal with the evaluation of a
particular concept in case studies, and work on comparing concepts is only
just starting. A remaining drawback of many papers on robustness is the
missing feedback loop that would carry the results of the experiments back
into the design.
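Since the abstract discusses robustness concepts only in the abstract, the sketch below shows the simplest such concept: a strict (Soyster-style) robust counterpart of a small linear program with interval uncertainty in the constraint coefficients. It is a generic illustration with assumed data, not an example drawn from any of the surveyed papers.

    import numpy as np
    from scipy.optimize import linprog

    # Strict robust counterpart of a small LP:
    #   max c^T x  s.t.  (A + dA) x <= b  for all |dA_ij| <= A_hat_ij,  x >= 0.
    # With x >= 0 the worst case is simply A + A_hat, so the robust problem is
    # again an LP.  All numbers below are made up for illustration.

    c = np.array([3.0, 2.0])            # objective to maximize
    A = np.array([[1.0, 1.0],
                  [2.0, 0.5]])          # nominal constraint coefficients
    A_hat = np.array([[0.1, 0.1],
                      [0.2, 0.0]])      # maximal coefficient deviations
    b = np.array([4.0, 5.0])

    nominal = linprog(-c, A_ub=A, b_ub=b, bounds=[(0, None)] * 2)
    robust = linprog(-c, A_ub=A + A_hat, b_ub=b, bounds=[(0, None)] * 2)

    print("nominal x:", np.round(nominal.x, 3), "objective:", round(-nominal.fun, 3))
    print("robust  x:", np.round(robust.x, 3), "objective:", round(-robust.fun, 3))

The robust objective is lower than the nominal one: immunizing against every admissible coefficient realization shrinks the feasible region, which is the conservatism trade-off that much of the literature on robustness concepts tries to soften.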
