    Indirect Network Effects and Adoption Externalities

    The conventional wisdom is that indirect network effects, unlike direct network effects, do not give rise to externalities. In this paper we show that, under very general conditions, indirect network effects lead to adoption externalities. In particular, we show that in markets where consumption benefits arise from hardware/software systems, adoption externalities occur when there are (i) increasing returns to scale in the production of software, (ii) free entry in software, and (iii) consumer preferences for software variety. The private benefit of the marginal hardware purchaser is less than the social benefit because the marginal purchaser does not internalize the welfare-improving response of the software industry, particularly the increase in software variety, that benefits inframarginal purchasers when the hardware market expands.
    Keywords: Network Externalities, Network Effects
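    The externality argument can be made concrete with a small numerical sketch. This is an illustrative model, not the paper's: the square-root variety function and the benefit parameters are assumptions chosen only to exhibit the mechanism.

```python
# Toy adoption-externality model (assumed functional forms, not the paper's):
# free entry plus increasing returns make equilibrium software variety
# grow with the hardware base, here taken as n ** 0.5.
def variety(n):
    return n ** 0.5

def per_user_benefit(n):
    # Each hardware owner's benefit rises with available software variety.
    return 1.0 + 0.5 * variety(n)

def private_marginal_benefit(n):
    # What the n-th (marginal) purchaser weighs: only their own benefit.
    return per_user_benefit(n)

def social_marginal_benefit(n):
    # Adds the variety gain conferred on the n-1 inframarginal purchasers,
    # which the marginal purchaser does not internalize.
    return per_user_benefit(n) + (n - 1) * (
        per_user_benefit(n) - per_user_benefit(n - 1)
    )

n = 100
gap = social_marginal_benefit(n) - private_marginal_benefit(n)  # strictly > 0
```

    In this toy economy the marginal adopter counts only their own benefit, while the social benefit also includes the extra software variety their adoption induces for everyone already on the platform, so the social marginal benefit is strictly larger.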

    Low-Mass X-ray Binaries and Globular Clusters in Early-Type Galaxies. I. Chandra Observations

    We present a Chandra survey of LMXBs in 24 early-type galaxies. Correcting for detection incompleteness, the X-ray luminosity function (XLF) of each galaxy is consistent with a power law with negative logarithmic differential slope beta ~ 2.0. However, beta strongly correlates with incompleteness, indicating that the XLF flattens at low Lx. The composite XLF is well fitted by a broken power law with a break at 2.21(+0.65,-0.56)E38 erg/s and beta = 1.40(+0.10,-0.13) and 2.84(+0.39,-0.30) below and above it, respectively. The break is close to the Eddington limit for a 1.4 Msun neutron star, but the XLF shape rules out its representing the division between neutron-star and black-hole systems. Although the XLFs are similar, we find evidence of some variation between galaxies. The high-Lx XLF slope does not correlate with age, but may correlate with [alpha/Fe]. Considering only LMXBs with Lx > 1E37 erg/s and matching the LMXBs with globular clusters (GCs) identified in HST observations of 19 of the galaxies, we find that the probability a GC hosts an LMXB is proportional to L_GC^alpha Z_Fe^gamma, where alpha = 1.01+/-0.19 and gamma = 0.33+/-0.11. Correcting for GC luminosity and colour effects, and for detection incompleteness, we find no evidence that the fraction of LMXBs with Lx > 1E37 erg/s in GCs (40%), or the fraction of GCs hosting LMXBs (~6.5%), varies between galaxies. The spatial distribution of LMXBs resembles that of GCs, and the specific frequency of LMXBs is proportional to the GC specific luminosity, consistent with the hypothesis that all LMXBs form in GCs. If the LMXB lifetime is tau and the duty cycle is Fd, our results imply that ~1.5 (tau/1E8 yr)^-1 / Fd LMXBs are formed per Gyr per GC, and we place an upper limit of 1 active LMXB in the field per 3.4E9 Lsun of V-band luminosity.
    Comment: 24 pages, 17 figures and 6 tables. Accepted for publication in the Astrophysical Journal. Expanded discussion and various minor revisions to improve robustness of results. Conclusions unchanged.
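    The fitted composite XLF and the GC-hosting scaling can be written down directly from the central values quoted in the abstract; in the sketch below the normalizations are arbitrary (they are not given in the abstract) and the quoted uncertainties are ignored.

```python
# Sketch of the broken power-law XLF and GC-hosting scaling, using the
# central values quoted in the abstract; normalizations are arbitrary.
L_BREAK = 2.21e38              # break luminosity, erg/s
BETA_LO, BETA_HI = 1.40, 2.84  # slopes below and above the break

def xlf(L):
    """Differential XLF dN/dL (arbitrary units), continuous at the break."""
    beta = BETA_LO if L <= L_BREAK else BETA_HI
    return (L / L_BREAK) ** (-beta)

def gc_host_prob(L_gc, Z_fe, p0=1.0):
    """Probability a GC hosts an LMXB: p proportional to L_GC^1.01 * Z_Fe^0.33.

    p0 is an arbitrary normalization (not given in the abstract).
    """
    return p0 * L_gc ** 1.01 * Z_fe ** 0.33
```

    With alpha near unity, doubling a cluster's luminosity roughly doubles its chance of hosting a bright LMXB, while metallicity enters more weakly.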

    An empirical study of software design practices

    Software engineers have developed a large body of software design theory and folklore, much of which has never been validated. The results of an empirical study of software design practices in one specific environment are presented. The practices examined affect module size, module strength, data coupling, descendant span, unreferenced variables, and software reuse. Measures characteristic of these practices were extracted from 887 FORTRAN modules developed for five flight dynamics software projects monitored by the Software Engineering Laboratory (SEL). The relationship of these measures to cost and fault rate was analyzed using a contingency table procedure. The results show that some recommended design practices, despite their intuitive appeal, are ineffective in this environment, whereas others are very effective.
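    The contingency-table procedure cross-tabulates a design measure against an outcome and tests for association; a minimal sketch with invented module counts (the SEL study's actual data are not reproduced here):

```python
# Invented 2x2 example of a contingency-table test: module strength
# (high/low) cross-tabulated against fault rate (low/high). The counts
# are hypothetical, not the SEL study's data.
def chi2_2x2(table):
    """Pearson chi-square statistic (no continuity correction) for a 2x2 table."""
    (a, b), (c, d) = table
    n = a + b + c + d
    rows = (a + b, c + d)
    cols = (a + c, b + d)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = rows[i] * cols[j] / n
            stat += (obs - expected) ** 2 / expected
    return stat

#              fault rate:  low  high
table = [[60, 20],   # high-strength modules
         [30, 50]]   # low-strength modules
stat = chi2_2x2(table)
# A statistic above 3.84 (the 5% critical value for 1 degree of freedom)
# would lead the procedure to call the practice effective in this sample.
```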

    Defending Australia: a history of Australia’s defence white papers

    This paper provides a summary of each of Australia’s defence white papers issued between 1976 and 2013 and seeks to draw out common themes that emerge in some or all of them.
    Executive summary
    Australia published defence white papers in 1976, 1987, 1994, 2000, 2009 and 2013, and a new white paper is expected in 2015. A community consultation process was undertaken as part of the 2000 and 2009 defence white papers, and a similar process is being carried out for the upcoming 2015 white paper. The need to defend Australia against a major aggressor remains the primary driver of Australian defence policy. Regional security and contributing to the global order have been secondary, but still important, priorities in Australian defence planning. Each of the white papers has been framed on the basis that Australia should be able to defend itself against a potential aggressor without outside assistance (the principle of self-reliance), while at the same time stressing the importance of the alliance with the United States. Threat perceptions have shifted from the Cold War influences reflected in the 1976 and 1987 white papers to a contemporary focus on terrorism, while also incorporating emerging threats such as cyber attacks and the rise of China. Defence white papers are not produced in a vacuum but are informed by key reviews of Australia’s strategic situation, industry policy and force posture. Defence policy is subject to the broader economic conditions of the time, and the Department of Defence must contend with many other priorities for government funding. The financial plans set out in the various white papers are often ambitious and rarely brought to fruition. On the whole, capability choices have displayed continuity between the different white papers regardless of changes in government, which is understandable given the length of time required for major capital equipment acquisitions. Recent white papers have placed a greater emphasis on regional engagement. The contribution of the Australian Defence Force (ADF) to humanitarian assistance and disaster relief operations, as well as to border protection activities, has also been included in the most recent white papers.

    Designing with Ada for satellite simulation: A case study

    A FORTRAN-oriented and an Ada-oriented design for the same system are compared to learn whether an essentially different design was produced using Ada. The designs were produced in an experiment involving the parallel development of software for a spacecraft dynamics simulator. Design differences are identified in the use of abstractions, system structure, and simulator operations. Although the designs were significantly different, this result may be influenced by some special characteristics that are discussed.

    Comparing external ventricular drains-related ventriculitis surveillance definitions

    OBJECTIVE: To evaluate the agreement between the current National Healthcare Safety Network (NHSN) definition for ventriculitis and others found in the literature among patients with an external ventricular drain (EVD).
    DESIGN: Retrospective cohort study from January 2009 to December 2014.
    SETTING: Neurology and neurosurgery intensive care unit of a large tertiary-care center.
    PATIENTS: Patients with an EVD were included. Patients with an infection prior to EVD placement or a permanent ventricular shunt were excluded.
    METHODS: We reviewed the charts of patients with positive cerebrospinal fluid (CSF) cultures and/or abnormal CSF results while they had an EVD in place and applied various ventriculitis definitions.
    RESULTS: We identified 48 patients with a total of 52 cases of ventriculitis (41 CSF culture-positive cases and 11 cases based on abnormal CSF test results) using the NHSN definition. The most common organisms causing ventriculitis were gram-positive commensals (79.2%); however, 45% showed growth of only 1 colony on 1 piece of media. Approximately 60% of the ventriculitis cases by the NHSN definition met the Honda criteria, approximately 56% met the Gozal criteria, and 23% met Citerio’s definition. Cases defined using the Honda versus Gozal definitions had moderate agreement (κ=0.528; P<.05), whereas comparisons of the Honda versus Citerio definitions (κ=0.338; P<.05) and the Citerio versus Gozal definitions (κ=0.384; P<.05) had only fair agreement.
    CONCLUSIONS: The agreement between published ventriculostomy-associated infection (VAI) definitions in this cohort was moderate to fair. A VAI surveillance definition that better defines contaminants is needed for more homogeneous application of surveillance definitions between institutions and better comparison of rates.
    Infect Control Hosp Epidemiol 2017;38:574–579
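    The kappa statistics quoted above measure chance-corrected agreement between two definitions applied to the same cases. A minimal implementation for binary ratings follows; the case-by-case labels are invented for illustration, not the study's data.

```python
# Cohen's kappa for two binary "ventriculitis yes/no" ratings of the same
# cases; the example labels below are invented, not the study's data.
def cohens_kappa(a, b):
    """Chance-corrected agreement between two binary raters (0/1 labels)."""
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n  # observed agreement
    pa, pb = sum(a) / n, sum(b) / n                # each rater's positive rate
    p_exp = pa * pb + (1 - pa) * (1 - pb)          # agreement expected by chance
    return (p_obs - p_exp) / (1 - p_exp)

honda = [1, 1, 1, 0, 0, 0, 1, 0]   # hypothetical case-by-case calls
gozal = [1, 1, 0, 0, 0, 1, 1, 0]
kappa = cohens_kappa(honda, gozal)  # -> 0.5, "moderate" on the usual scale
```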

    TRIDENT: an Infrared Differential Imaging Camera Optimized for the Detection of Methanated Substellar Companions

    A near-infrared camera in use at the Canada-France-Hawaii Telescope (CFHT) and at the 1.6-m telescope of the Observatoire du Mont-Mégantic is described. The camera is based on a Hawaii-1 1024x1024 HgCdTe array detector. Its main feature is the acquisition of three simultaneous images at three wavelengths across the methane absorption bandhead at 1.6 microns, enabling, in theory, an accurate subtraction of the stellar point spread function (PSF) and the detection of faint, close, methanated companions. The instrument has no coronagraph and features fast data acquisition, yielding high observing efficiency on bright stars. The performance of the instrument is described and illustrated with laboratory tests and CFHT observations of the nearby stars GL526, Ups And and Chi And. TRIDENT can detect (6 sigma) a methanated companion with delta H = 9.5 at 0.5" separation from the star in one hour of observing time. Non-common-path aberrations and amplitude modulation differences between the three optical paths are likely the limiting factors preventing further PSF attenuation. Instrument rotation and reference star subtraction improve the detection limit by factors of 2 and 4, respectively. A PSF noise attenuation model is presented to estimate the effect of non-common-path wavefront differences on PSF subtraction performance.
    Comment: 41 pages, 16 figures, accepted for publication in PASP
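    The core of the differential-imaging idea is that the stellar PSF is, to first order, common to the simultaneous in-band and out-of-band images, while a methanated companion appears only outside the absorption band, so a simple difference suppresses the star. A noiseless numpy sketch follows; the fluxes, positions, and Gaussian PSF model are illustrative assumptions.

```python
import numpy as np

# Noiseless sketch of simultaneous spectral differential imaging: the star
# is present in both filters, the methanated companion only out of band.
def gaussian_psf(shape, center, flux, sigma):
    """Gaussian stand-in for a PSF, peak value `flux` at `center`."""
    y, x = np.indices(shape)
    cy, cx = center
    return flux * np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))

shape = (64, 64)
star = gaussian_psf(shape, (32, 32), flux=1e6, sigma=3.0)
companion = gaussian_psf(shape, (32, 44), flux=50.0, sigma=3.0)

in_band = star                # companion absorbed by methane near 1.6 microns
out_band = star + companion   # companion visible outside the band

diff = out_band - in_band     # stellar PSF cancels; companion remains
peak = np.unravel_index(diff.argmax(), shape)  # location of the companion
```

    In the real instrument the cancellation is imperfect: non-common-path aberrations and amplitude differences between the three optical paths leave residuals, which is why the abstract presents a PSF noise attenuation model rather than exact subtraction.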

    Manipulation of length and lexicality localizes the functional neuroanatomy of phonological processing in adult readers

    In a previous study of single word reading, regions in the left supramarginal gyrus and left angular gyrus showed positive BOLD activity in children but significantly less activity in adults for high-frequency words. This developmental decrease may reflect decreased reliance on phonological processing for familiar stimuli in adults. In the present study, therefore, variables thought to influence phonological demand (string length and lexicality) were manipulated. Length and lexicality effects in the brain were explored using both ROI and whole-brain approaches. In the ROI analysis, the supramarginal and angular regions from the previous study were applied to the present data. The supramarginal region showed a significant positive effect of length, consistent with a role in phonological processing, whereas the angular region showed only negative deflections from baseline, with a strong effect of lexicality and other weaker effects. At the whole-brain level, varying effects of length and lexicality and their interactions were observed in 85 regions throughout the brain. Applying hierarchical clustering analysis to the BOLD time course data derived from these regions revealed seven clusters in potentially revealing anatomical locations; of note, a left angular gyrus region was the sole constituent of one cluster. Taken together, these findings in adult readers (1) provide support for a widespread set of brain regions affected by lexical variables, (2) corroborate a role for phonological processing in the left supramarginal gyrus, and (3) do not support a strong role for phonological processing in the left angular gyrus.
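    The hierarchical clustering step groups regions by the similarity of their BOLD time courses. A small sketch with synthetic time courses follows; the two invented response shapes, the region count, and the noise level are assumptions standing in for the study's 85 regional time courses.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Synthetic stand-ins for region-averaged BOLD time courses: an invented
# positive response and an invented negative deflection, plus noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 40)
positive = np.sin(t)          # e.g. a positive task response
negative = -0.5 * np.sin(t)   # e.g. a negative deflection from baseline

regions = np.array(
    [positive + 0.05 * rng.standard_normal(t.size) for _ in range(5)]
    + [negative + 0.05 * rng.standard_normal(t.size) for _ in range(5)]
)

# Ward linkage on the time courses, then cut the tree into two clusters
# (the study, with many more regions and conditions, recovered seven).
Z = linkage(regions, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
```

    Regions sharing a response shape end up in the same cluster, which is how anatomically interpretable groupings, such as the angular gyrus forming its own cluster, can emerge from the time course data alone.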