Do science-technology interactions pay off when developing technology? An exploratory investigation of 10 science-intensive technology domains.
In this paper we investigate the impact of science-technology (S&T) interactions on the effectiveness of technology development. The number of references in patents to scientific articles is taken as a proxy for the intensity of S&T interaction, whereas a country's technological performance is measured both in terms of its technological productivity (i.e. number of patents per capita) and its relative technological specialization (i.e. RTA index). We use USPTO patent data for eight European countries in ten technological domains. A variance analysis (ANOVA) is applied. Country as an independent variable does not explain a significant portion of the observed variance in science-interaction intensity (p = 0.25). Technology domain, however, does explain a significant portion of the observed variance.
Keywords: Science; Effectiveness; Patents; Country; Performance; Variance analysis
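The core method here is a one-way ANOVA: partitioning the variance of science-link intensity into between-group and within-group components and comparing them via the F-statistic. A minimal stdlib sketch, with entirely made-up intensity values (the study's actual data are not reproduced here):

```python
from statistics import mean

def anova_f(groups):
    """One-way ANOVA F-statistic for a list of sample groups."""
    k = len(groups)                      # number of groups
    n = sum(len(g) for g in groups)      # total observations
    grand = mean(x for g in groups for x in g)
    # between-group sum of squares (variance explained by group membership)
    ssb = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    # within-group sum of squares (residual variance)
    ssw = sum(sum((x - mean(g)) ** 2 for x in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (n - k))

# Hypothetical science-link intensities (refs per patent) in three domains:
# well-separated group means give a large F, as the abstract reports for
# technology domain; overlapping means (as for country) give a small F.
domains = [[1.0, 1.2, 0.9], [3.1, 2.8, 3.3], [0.4, 0.5, 0.6]]
print(anova_f(domains))
```

A large F relative to the F-distribution's critical value yields a small p-value, i.e. the grouping variable explains a significant share of the variance.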
Randomized controlled trials are needed to close the evidence gap in the prevention of preterm birth
Pregnant women have been advised to avoid heavy lifting during pregnancy due to concerns about adverse pregnancy outcomes, including premature delivery. To date there is no evidence on the effectiveness of this advice in preventing preterm birth, as found in a recent systematic search and appraisal of the published literature. This letter employs the findings of the review to inform future studies.
Eliminating Biases in Evaluating Mutual Fund Performance from a Survivorship Free Sample
Poor performing mutual funds are less likely to be observed in the data sets that are typically available. This so-called survivor problem can induce a substantial bias in measures of the performance of the funds and the persistence of this performance. Many studies have recently argued that survivorship bias can be avoided by analyzing a sample that contains returns on each fund up to the period of disappearance using standard techniques. Such data sets are usually referred to as 'survivorship free'. In this paper we show that the use of standard methods of analysis on a 'survivorship free' data-set typically still suffers from a bias and we show how one can easily correct for this using weights based on probit regressions. Using a sample with quarterly returns on U.S. based equity funds, we first of all model how survival probabilities depend upon historical returns, the age of the fund and upon aggregate economy-wide shocks. Subsequently we employ a Monte Carlo study to analyze the size and shape of the survivorship bias in various performance measures that arise when a 'survivorship free database' is used with standard techniques. In particular, we show that survivorship bias induces a spurious U-shape pattern in performance persistence. Finally, we show how a weighting procedure based upon probit regressions can be used to correct for the bias. In this way, we obtain bias-corrected estimates of abnormal performance relative to a one-factor and the Carhart [1997] four-factor model, as well as its persistence. Our results are in accordance with the persistence pattern found by Carhart [1997], and do not support the existence of a hot hand phenomenon in mutual fund performance.
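The correction the abstract describes is an inverse-probability weighting: each surviving fund's return is weighted by the reciprocal of its estimated survival probability, so funds that were likely to drop out (typically poor performers) count for more. A minimal sketch assuming a one-regressor probit, whereas the paper conditions on historical returns, fund age, and aggregate shocks; the coefficients below are invented for illustration:

```python
from statistics import NormalDist

def survival_prob(past_return, a=0.5, b=8.0):
    """Probit model: P(fund survives) = Phi(a + b * past_return).
    Coefficients a and b are made up for illustration."""
    return NormalDist().cdf(a + b * past_return)

def weighted_mean_return(returns):
    """Weight each surviving fund by the inverse of its estimated
    survival probability to undo the over-sampling of good funds."""
    weights = [1.0 / survival_prob(r) for r in returns]
    return sum(w * r for w, r in zip(weights, returns)) / sum(weights)

# Returns of funds that survived (poor performers under-represented)
survivors = [0.08, 0.05, 0.12, -0.02, 0.03]
naive = sum(survivors) / len(survivors)
corrected = weighted_mean_return(survivors)
# corrected < naive: low-return funds receive larger weights
```

The same weights can be carried into any downstream performance regression, which is how the bias-corrected alpha estimates in the paper are obtained.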
Pilot scale pyrolysis - determination of critical moisture content for sustainable organic waste pyrolysis
Economic feasibility of large scale organic waste pyrolysis was investigated for Inghams Enterprise (Waitoa) chicken dissolved air flotation sludge (DAF) and activated sludge (biosolids) from the Hamilton municipal waste water treatment plant. Processing data was obtained from pilot plant trials using the Lakeland Steel (Rotorua) continuous auger pyrolysis plant using feedstock at 15, 30, 45 and ~80% moisture contents. Economics were calculated based on estimated capital and operating costs of a large scale facility, revenue from selling char, savings from landfill diversion (including transportation and gate costs), energy savings by recycling syngas product and using waste heat for drying feedstock.
For DAF, 15% moisture content gave yields of 21% syngas, 27% char, and 52% oil (dry weight basis). This moisture content gave the best processing conditions based on handling properties and degree of autogenesis. The DAF case does not achieve a payback period because of its small scale of operations.
For biosolids, 15% moisture content feedstock gave yields of 46% syngas, 31% char, and 21% oil (wet weight). Plant blockages occurred at 45% and 80% moisture contents. 15% moisture content gave the best processing conditions and the best economic performance, with a payback time of 4.6 years for a facility that could process 11,000 tonnes per year.
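The economic comparison rests on a simple (undiscounted) payback calculation: capital cost divided by annual net benefit, where the benefit pools char revenue, landfill-diversion savings, and energy recovery. A sketch with hypothetical figures chosen only so the result matches the 4.6-year payback quoted above; the actual capital and operating costs are not given in the abstract:

```python
def payback_years(capital_cost, annual_net_benefit):
    """Simple (undiscounted) payback period in years."""
    if annual_net_benefit <= 0:
        return float("inf")   # never pays back, as in the DAF case
    return capital_cost / annual_net_benefit

# Illustrative only: NZ$2.3M capital against NZ$0.5M/yr net benefit
# reproduces the reported 4.6-year biosolids payback.
print(payback_years(2_300_000, 500_000))  # → 4.6
```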
Colored Non-Crossing Euclidean Steiner Forest
Given a set of k-colored points in the plane, we consider the problem of finding k trees such that each tree connects all points of one color class, no two trees cross, and the total edge length of the trees is minimized. For k = 1, this is the well-known Euclidean Steiner tree problem. For general k, a kρ-approximation algorithm is known, where ρ ≈ 1.21 is the Steiner ratio.
We present a PTAS for k = 2, a (5/3)-approximation algorithm for k = 3, and two approximation algorithms for general k, with ratios O(√n log k) and k + ε.
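The Steiner ratio compares the length of a minimum Steiner tree (extra junction points allowed) to that of a minimum spanning tree on the same terminals. A minimal sketch for the equilateral triangle, the conjectured extremal case, where adding one Steiner point at the centroid shortens the tree from 2 to √3:

```python
from math import dist, sqrt

# Equilateral triangle with unit sides: the conjectured worst case
# for the Euclidean Steiner ratio.
pts = [(0.0, 0.0), (1.0, 0.0), (0.5, sqrt(3) / 2)]

# A minimum spanning tree uses two of the three unit sides: length 2.
mst_len = dist(pts[0], pts[1]) + dist(pts[0], pts[2])

# Adding the centroid as a Steiner point joins all three terminals
# with total length sqrt(3) (three edges of length 1/sqrt(3)).
steiner = (0.5, sqrt(3) / 6)
st_len = sum(dist(p, steiner) for p in pts)

ratio = st_len / mst_len   # sqrt(3)/2 ≈ 0.866
```

The best proven bound on this saving across all point sets yields the ρ ≈ 1.21 factor that enters the kρ-approximation guarantee quoted in the abstract.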
AXES at TRECVID 2012: KIS, INS, and MED
The AXES project participated in the interactive instance search task (INS), the known-item search task (KIS), and the multimedia event detection task (MED) for TRECVid 2012. As in our TRECVid 2011 system, we used nearly identical search systems and user interfaces for both INS and KIS. Our interactive INS and KIS systems focused this year on using classifiers trained at query time with positive examples collected from external search engines. Participants in our KIS experiments were media professionals from the BBC; our INS experiments were carried out by students and researchers at Dublin City University. We performed comparatively well in both experiments. Our best KIS run found 13 of the 25 topics, and our best INS runs outperformed all other submitted runs in terms of P@100. For MED, the system presented was based on a minimal number of low-level descriptors, which we chose to be as large as computationally feasible. These descriptors are aggregated to produce high-dimensional video-level signatures, which are used to train a set of linear classifiers. Our MED system achieved the second-best score of all submitted runs in the main track, and the best score in the ad-hoc track, suggesting that a simple system based on state-of-the-art low-level descriptors can give relatively high performance. This paper describes in detail our KIS, INS, and MED systems and the results and findings of our experiments.
On Negotiation as Concurrency Primitive
We introduce negotiations, a model of concurrency close to Petri nets, with multiparty negotiation as primitive. We study the problems of soundness of negotiations and of, given a negotiation with possibly many steps, computing a summary, i.e., an equivalent one-step negotiation. We provide a complete set of reduction rules for sound, acyclic, weakly deterministic negotiations and show that, for deterministic negotiations, the rules compute the summary in polynomial time.
Plasma amyloid-β levels, cerebral atrophy and risk of dementia: A population-based study
Background: Plasma amyloid-β (Aβ) levels are increasingly studied as a potential accessible marker of cognitive impairment and dementia. However, it remains underexplored whether plasma Aβ levels, including the novel Aβ peptide 1-38 (Aβ1-38), relate to preclinical markers of neurodegeneration and risk of dementia. We investigated the association of plasma Aβ1-38, Aβ1-40, and Aβ1-42 levels with imaging markers of neurodegeneration and risk of dementia in a prospective population-based study. Methods: We analyzed plasma Aβ levels in 458 individuals from the Rotterdam Study. Brain volumes, including gray matter, white matter, and hippocampus, were computed on the basis of 1.5-T magnetic resonance imaging (MRI). Dementia and its subtypes were defined on the basis of internationally accepted criteria. Results: A total of 458 individuals (mean age, 67.8 ± 7.7 yr; 232 [50.7%] women) with baseline MRI scans and incident dementia were included. The mean ± SD values of Aβ1-38, Aβ1-40, and Aβ1-42 (in pg/ml) were 19.4 ± 4.3, 186.1 ± 35.9, and 56.3 ± 6.2, respectively, at baseline. Lower plasma Aβ1-42 levels were associated with smaller hippocampal volume (mean difference in hippocampal volume per SD decrease in Aβ1-42 levels, −0.13; 95% CI, −0.23 to −0.04; p = 0.007). After a mean follow-up of 14.8 years (SD, 4.9; range, 4.1-23.5 yr), 79 persons developed dementia, 64 of whom were diagnosed with Alzheimer's disease (AD). Lower levels of Aβ1-38 and Aβ1-42 were associated with increased risk of dementia, specifically AD (HR for AD per SD decrease in Aβ1-38 levels, 1.39; 95% CI, 1.00-2.16; HR for AD per SD decrease in Aβ1-42 levels, 1.35; 95% CI, 1.05-1.75), after adjustment for age, sex, education, cardiovascular risk factors, apolipoprotein E ϵ4 allele carrier status, and other Aβ isoforms. Conclusions: Our results show that lower plasma Aβ levels were associated with risk of dementia and incident AD.
Moreover, lower plasma Aβ1-42 levels were related to smaller hippocampal volume. These results suggest that plasma Aβ1-38 and Aβ1-42 may be useful biomarkers for the identification of individuals at risk of dementia.
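The "HR per SD decrease" figures come from standardizing the biomarker and exponentiating the (sign-flipped) Cox log-hazard coefficient. A minimal sketch of that conversion, with invented plasma levels and an invented coefficient, not the study's data:

```python
from math import exp
from statistics import stdev

def hr_per_sd_decrease(values, beta):
    """Given a Cox log-hazard coefficient `beta` per unit of a biomarker,
    express it as a hazard ratio per one-SD *decrease* in the biomarker.
    `values` and `beta` here are illustrative, not the study's data."""
    sd = stdev(values)
    return exp(-beta * sd)

# Hypothetical plasma A-beta 1-42 levels (pg/ml) and a made-up coefficient
levels = [50.0, 56.0, 62.0, 56.0, 48.0, 64.0]
beta = -0.05   # negative: lower levels -> higher hazard
print(hr_per_sd_decrease(levels, beta))
```

A result above 1 means each SD drop in the biomarker multiplies the hazard by that factor, which is how the reported HRs of 1.39 and 1.35 are read.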
