The Neurophysiology of Functionally Meaningful Categories: Macaque Ventrolateral Prefrontal Cortex Plays a Critical Role in Spontaneous Categorization of Species-Specific Vocalizations
Neurophysiological studies in nonhuman primates have demonstrated that the prefrontal cortex (PFC) plays a critical role in the acquisition of learned categories following training. What is presently unclear is whether this cortical area also plays a role in spontaneous recognition and discrimination of natural categories. Here, we explore this possibility by recording from neurons in the PFC while rhesus monkeys listen to species-specific vocalizations that vary in terms of their social function and acoustic morphology. We found that ventral prefrontal cortex (vPFC) activity, on average, did not differentiate between food calls that were associated with the same functional category, despite having different acoustic properties. In contrast, vPFC activity differentiated between food calls associated with different functional classes and, specifically, carried information about the quality and motivational value of the food. These results suggest that the vPFC is involved in the categorization of socially meaningful signals, thereby both extending its previously conceived role in the acquisition of learned categories and showing the significance of using natural categorical distinctions in the study of neural mechanisms.
Understanding longitudinal bi-ventricular structural and functional changes in a Pulmonary Hypertension Sugen-Hypoxia rat model by Cardiac Magnetic Resonance Imaging
Choosing a Data Model and Query Language for Provenance
The ancestry relationships found in provenance form a directed graph. Many provenance queries require traversal of this graph. The data and query models for provenance should directly and naturally address this graph-centric nature of provenance. To that end, we set out the requirements for a provenance data and query model and discuss why the common solutions (relational, XML, RDF) fall short. A semistructured data model is more suited for handling provenance. We propose a query model based on the Lorel query language, and briefly describe how our query language PQL extends Lorel.
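The abstract does not show PQL syntax, but the graph-centric queries it describes reduce to ancestry traversals over a directed graph. A minimal sketch in Python, assuming a simple adjacency-list representation (the file names are illustrative):

```python
from collections import deque

def ancestors(graph, node):
    """Return all ancestors of `node` in a provenance graph.

    `graph` maps each object to the set of objects it was
    directly derived from.
    """
    seen = set()
    queue = deque(graph.get(node, ()))
    while queue:
        current = queue.popleft()
        if current in seen:
            continue
        seen.add(current)
        queue.extend(graph.get(current, ()))
    return seen

# A tiny derivation history: report.pdf was produced from
# results.csv and plot.png; both ultimately stem from raw.dat.
history = {
    "report.pdf": {"results.csv", "plot.png"},
    "results.csv": {"raw.dat"},
    "plot.png": {"results.csv"},
}
print(sorted(ancestors(history, "report.pdf")))
# -> ['plot.png', 'raw.dat', 'results.csv']
```

This kind of transitive traversal is awkward to express in plain relational SQL (it needs recursion), which is part of the motivation the abstract gives for a graph-oriented, semistructured model.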
Facies Relationships Within the Glens Falls Limestone of Vermont and New York
Guidebook for field trips in Vermont: New England Intercollegiate Geological Conference, 79th annual meeting, October 16, 17 and 18, 1987: Trips A-
Explaining Myanmar's Regime Transition: The Periphery is Central
In 2010, Myanmar (Burma) held its first elections after 22 years of direct military rule. Few compelling explanations for this regime transition have emerged. This article critiques popular accounts and potential explanations generated by theories of authoritarian ‘regime breakdown’ and ‘regime maintenance’. It returns instead to the classical literature on military intervention and withdrawal. Military regimes, when not terminated by internal factionalism or external unrest, typically liberalise once they feel they have sufficiently addressed the crises that prompted their seizure of power. This was the case in Myanmar. The military intervened for fear that political unrest and ethnic-minority separatist insurgencies would destroy Myanmar’s always-fragile territorial integrity and sovereignty. Far from suddenly liberalising in 2010, the regime had twice before sought to create a ‘disciplined democracy’ to safeguard its preferred social and political order, but was thwarted by societal opposition. Its success in 2010 stemmed from a strategy of coercive state-building and economic incorporation via ‘ceasefire capitalism’, which weakened and co-opted much of the opposition. Having altered the balance of forces in its favour, the regime felt sufficiently confident to impose its preferred settlement. However, the transition neither reflected total ‘victory’ for the military nor secured a genuine or lasting peace.
Layering in Provenance Systems
Digital provenance describes the ancestry or history of a digital object. Most existing provenance systems, however, operate at only one level of abstraction: the system call layer, a workflow specification, or the high-level constructs of a particular application. The provenance collectable in each of these layers is different, and all of it can be important. Single-layer systems fail to account for the different levels of abstraction at which users need to reason about their data and processes. These systems cannot integrate data provenance across layers and cannot answer questions that require an integrated view of the provenance.
We have designed a provenance collection structure facilitating the integration of provenance across multiple levels of abstraction, including a workflow engine, a web browser, and an initial runtime Python provenance tracking wrapper. We layer these components atop provenance-aware network storage (NFS) that builds upon a Provenance-Aware Storage System (PASS). We discuss the challenges of building systems that integrate provenance across multiple layers of abstraction, present how we augmented systems in each layer to integrate provenance, and present use cases that demonstrate how provenance spanning multiple layers provides functionality not available in existing systems. Our evaluation shows that the overheads imposed by layering provenance systems are reasonable.
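The layering idea in this abstract can be illustrated with a toy record structure; the layer names and fields below are purely illustrative and are not the actual PASS schema:

```python
from dataclasses import dataclass, field

@dataclass
class ProvRecord:
    """One provenance record at a single abstraction layer.

    `inputs` holds same-layer ancestry; `below` links a record
    to the lower-layer records that realized it, which is what
    lets a query cross abstraction layers.
    """
    obj: str               # object or event this record describes
    layer: str             # e.g. "syscall", "workflow", "python"
    inputs: list = field(default_factory=list)
    below: list = field(default_factory=list)

# A workflow step, layered atop the system calls that realized it:
read_call = ProvRecord("read(raw.dat)", "syscall")
write_call = ProvRecord("write(clean.csv)", "syscall")
step = ProvRecord("clean-data step", "workflow",
                  inputs=["raw.dat"],
                  below=[read_call, write_call])

# An integrated query can descend from the workflow layer to the
# system-call layer; a single-layer system cannot answer this.
print([r.obj for r in step.below])
# -> ['read(raw.dat)', 'write(clean.csv)']
```

The cross-layer link (`below`) is the structural point: without it, the workflow-level and syscall-level provenance are two disconnected graphs.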
Biogeochemistry of manganese in ferruginous Lake Matano, Indonesia
This study explores Mn biogeochemistry in a stratified, ferruginous lake, a modern analogue to ferruginous oceans. Intense Mn cycling occurs in the chemocline, where Mn is recycled at least 15 times before sedimentation. The product of biologically catalyzed Mn oxidation in Lake Matano is birnessite. Although there is evidence for abiotic Mn reduction with Fe(II), Mn reduction likely occurs through a variety of pathways. The flux of Fe(II) is insufficient to balance the reduction of Mn at 125 m depth in the water column, and Mn reduction could be a significant contributor to CH₄ oxidation. By combining results from synchrotron-based X-ray fluorescence and X-ray spectroscopy, extractions of sinking particles, and reaction transport modeling, we find the kinetics of Mn reduction in the lake's reducing waters are sufficiently rapid to preclude the deposition of Mn oxides from the water column to the sediments underlying ferruginous water. This has strong implications for the interpretation of the sedimentary Mn record.
Coronary CT Angiography and 5-Year Risk of Myocardial Infarction.
BACKGROUND: Although coronary computed tomographic angiography (CTA) improves diagnostic certainty in the assessment of patients with stable chest pain, its effect on 5-year clinical outcomes is unknown. METHODS: In an open-label, multicenter, parallel-group trial, we randomly assigned 4146 patients with stable chest pain who had been referred to a cardiology clinic for evaluation to standard care plus CTA (2073 patients) or to standard care alone (2073 patients). Investigations, treatments, and clinical outcomes were assessed over 3 to 7 years of follow-up. The primary end point was death from coronary heart disease or nonfatal myocardial infarction at 5 years. RESULTS: The median duration of follow-up was 4.8 years, which yielded 20,254 patient-years of follow-up. The 5-year rate of the primary end point was lower in the CTA group than in the standard-care group (2.3% [48 patients] vs. 3.9% [81 patients]; hazard ratio, 0.59; 95% confidence interval [CI], 0.41 to 0.84; P=0.004). Although the rates of invasive coronary angiography and coronary revascularization were higher in the CTA group than in the standard-care group in the first few months of follow-up, overall rates were similar at 5 years: invasive coronary angiography was performed in 491 patients in the CTA group and in 502 patients in the standard-care group (hazard ratio, 1.00; 95% CI, 0.88 to 1.13), and coronary revascularization was performed in 279 patients in the CTA group and in 267 in the standard-care group (hazard ratio, 1.07; 95% CI, 0.91 to 1.27). However, more preventive therapies were initiated in patients in the CTA group (odds ratio, 1.40; 95% CI, 1.19 to 1.65), as were more antianginal therapies (odds ratio, 1.27; 95% CI, 1.05 to 1.54). There were no significant between-group differences in the rates of cardiovascular or noncardiovascular deaths or deaths from any cause. 
CONCLUSIONS: In this trial, the use of CTA in addition to standard care in patients with stable chest pain resulted in a significantly lower rate of death from coronary heart disease or nonfatal myocardial infarction at 5 years than standard care alone, without resulting in a significantly higher rate of coronary angiography or coronary revascularization. (Funded by the Scottish Government Chief Scientist Office and others; SCOT-HEART ClinicalTrials.gov number, NCT01149590.)
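The 5-year event rates reported above follow directly from the event counts and the 2073 patients randomized to each arm; a quick arithmetic check (rounding to one decimal place, as in the abstract):

```python
cta_events, standard_events = 48, 81
per_group = 2073  # patients randomized to each arm

# Crude 5-year event proportions, as percentages
cta_rate = 100 * cta_events / per_group
standard_rate = 100 * standard_events / per_group

print(round(cta_rate, 1), round(standard_rate, 1))  # -> 2.3 3.9
```

Note that these are simple proportions; the hazard ratio of 0.59 in the abstract comes from a time-to-event analysis, not from dividing the two percentages (although here the crude ratio happens to be close).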
Systems, interactions and macrotheory
A significant proportion of early HCI research was guided by one very clear vision: that the existing theory base in psychology and cognitive science could be developed to yield engineering tools for use in the interdisciplinary context of HCI design. While interface technologies and heuristic methods for behavioral evaluation have rapidly advanced in both capability and breadth of application, progress toward deeper theory has been modest, and some now believe it to be unnecessary. A case is presented for developing new forms of theory, based around generic “systems of interactors.” An overlapping, layered structure of macro- and microtheories could then serve an explanatory role, and could also bind together contributions from the different disciplines. Novel routes to formalizing and applying such theories provide a host of interesting and tractable problems for future basic research in HCI.
Administering the Auditory Comprehension Test to a group of learning disabled subjects
This study attempted to replicate the finding by Green and Josey (1988) that some groups of learning disabled children comprehend spoken language better in one single ear (monaural condition) than in both ears together (binaural condition). The Auditory Comprehension Test (ACT), which is designed specifically to measure this "binaural deficit", was administered to 36 learning disabled children, from whom a subgroup judged by teachers to have prominent difficulty comprehending everyday speech was later selected, and to a control group of 36 non-learning disabled children individually matched for age, sex, and IQ with the learning disabled children. The ACT involves presenting short news item-style stories via headphones to either ear alone, or to both ears simultaneously. After each story the subject repeats as much of the story as s/he can remember. The resulting three scores (left ear, right ear, and both ears simultaneously) are compared to determine whether listening with either single ear produces better comprehension than listening with both ears together (i.e. whether a binaural deficit exists). Comparisons between the control and learning disabled groups revealed significant differences in the direction of (1) higher average test scores for the control group and (2) higher overall binaural deficits for the learning disabled group, as well as a larger number of subjects in the learning disabled group having a binaural deficit. The control group also performed significantly more poorly in the binaural condition than in either single ear alone, indicating a possible bias in the ACT itself and/or a possible selection bias. The test bias points to the need for revisions to the ACT in its application to children.
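The comparison implied by the three ACT scores can be sketched as follows; the decision rule and score values here are illustrative, not the test's actual scoring procedure:

```python
def binaural_deficit(left, right, both):
    """Return True if either single-ear comprehension score
    exceeds the binaural score -- the pattern described in the
    abstract as a "binaural deficit".

    This simple rule is an illustration only; the ACT's real
    scoring may apply norms or statistical thresholds.
    """
    return max(left, right) > both

print(binaural_deficit(left=14, right=12, both=10))  # -> True
print(binaural_deficit(left=9, right=11, both=13))   # -> False
```

A subject showing the first pattern (better monaural than binaural comprehension) would count toward the group-level deficit comparisons the abstract reports.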
