2,083 research outputs found
Ontologies for the study of neurological disease
We have begun work on two separate but related ontologies for the study of neurological diseases. The first, the Neurological Disease Ontology (ND), is intended to provide a set of controlled, logically connected classes to describe the range of neurological diseases and their associated signs and symptoms, assessments, diagnoses, and interventions that are encountered in the course of clinical practice. ND is built as an extension of the Ontology for General Medical Science, a high-level candidate OBO Foundry ontology that provides a set of general classes for describing aspects of medical science. ND classes carry both textual and axiomatized definitions that formalize the relations their instances bear to instances of other classes, both within the ontology itself and in external ontologies such as the Gene Ontology, Cell Ontology, Protein Ontology, and Chemical Entities of Biological Interest. In addition, references to similar or associated terms in external ontologies, vocabularies, and terminologies are included when possible. Initial work on ND is focused on Alzheimer’s and other diseases associated with dementia, multiple sclerosis, and stroke and cerebrovascular disease. Extensions to additional groups of neurological diseases are planned.

The second ontology, the Neuro-Psychological Testing Ontology (NPT), is intended to provide a set of classes for the annotation of neuropsychological testing data, allowing results to be integrated across the variety of neuropsychological tests that assay similar measures of cognitive functioning. Neuropsychological testing is an important component in developing the clinical picture used in the diagnosis of patients with a range of neurological diseases, such as Alzheimer’s disease and multiple sclerosis, and following stroke or traumatic brain injury. NPT is being developed as an extension of the Ontology for Biomedical Investigations.
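The abstract above mentions classes that carry both textual and axiomatized definitions linking them to classes in other ontologies. As a purely illustrative sketch (the class and property names below are invented for this example and are not taken from ND), such an axiomatized definition can be written in Python with the owlready2 library roughly as follows:

    from owlready2 import get_ontology, Thing, ObjectProperty

    # Toy namespace; not the real ND IRI.
    onto = get_ontology("http://example.org/nd-sketch.owl")

    with onto:
        class Disease(Thing): pass
        class Sign(Thing): pass
        class MemoryImpairment(Sign): pass

        class has_sign(ObjectProperty):
            domain = [Disease]
            range = [Sign]

        class DementiaLikeDisease(Disease): pass

        # An axiomatized (equivalent-class) definition: the class is formally
        # characterized by the relations its instances bear to instances of
        # other classes, here via the has_sign property.
        DementiaLikeDisease.equivalent_to.append(
            Disease & has_sign.some(MemoryImpairment))

    onto.save(file="nd-sketch.owl")

In the real ontology the restriction would point at terms from the Gene Ontology, ChEBI, and similar resources rather than at locally defined toy classes.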
Modularization for the Cell Ontology
One of the premises of the OBO Foundry is that development of an orthogonal set of ontologies will increase domain expert contributions and logical interoperability, and decrease maintenance workload. For these reasons, the Cell Ontology (CL) is being re-engineered. This process requires the extraction of sub-modules from existing OBO ontologies, which presents a number of practical engineering challenges. In addition, applications and resources that make use of the Cell Ontology have particular modularization requirements, such as the ability to extract custom subsets or unions of the Cell Ontology with other OBO ontologies. These extracted modules may be intended to cover a narrow or a broad set of species, which presents unique complications.

We discuss some of these requirements, and present our progress towards a customizable, simple-to-use modularization tool that leverages existing OWL-based tools and opens up their use for the CL and other ontologies.
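As a rough, hypothetical illustration of what extracting a custom subset can involve (this is not the authors' tool, and real module extraction would normally rely on locality-based extraction in OWL tooling), the owlready2 snippet below pulls a single seed class and its superclass closure out of the CL; the seed IRI CL_0000084 ('T cell') is used only as an example:

    from owlready2 import get_ontology

    # Load the Cell Ontology from its OBO PURL (large download; the file and
    # its imports must be reachable over the network).
    cl = get_ontology("http://purl.obolibrary.org/obo/cl.owl").load()

    # Seed the subset with one class and collect its full superclass closure
    # as a crude stand-in for a module.
    seed = cl.search_one(iri="*CL_0000084")
    subset = {seed} | seed.ancestors()

    for cls in sorted(subset, key=lambda c: c.iri):
        print(cls.iri, cls.label.first() if cls.label else "")

A production-quality module would also need the axioms and cross-ontology references that make the subset logically self-contained, which is exactly the kind of requirement the tool described above is meant to address.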
An improved ontological representation of dendritic cells as a paradigm for all cell types
The Cell Ontology (CL) is designed to provide a standardized representation of cell types for data annotation. Currently, the CL employs multiple is_a relations, defining cell types in terms of histological, functional, and lineage properties, and the majority of definitions are written with sufficient generality to hold across multiple species. This approach limits the CL’s utility for cross-species data integration. To address this problem, we developed a method for the ontological representation of cells and applied this method to develop a dendritic cell ontology (DC-CL). DC-CL subtypes are delineated on the basis of surface protein expression, systematically including both species-general and species-specific types and optimizing DC-CL for the analysis of flow cytometry data. This approach brings benefits in the form of increased accuracy, support for reasoning, and interoperability with other ontology resources.
Barry Smith, “Toward a Realistic Science of Environments”, Ecological Psychology, 2009, 21 (2), April-June, 121-130.
Abstract: The perceptual psychologist J. J. Gibson embraces a radically externalistic view of mind and action. We have, for Gibson, not a Cartesian mind or soul, with its interior theater of contents and the consequent problem of explaining how this mind or soul and its psychological environment can succeed in grasping physical objects external to itself. Rather, we have a perceiving, acting organism, whose perceptions and actions are always already tuned to the parts and moments, the things and surfaces, of its external environment. We describe how on this basis Gibson sought to develop a realist science of environments which will be ‘consistent with physics, mechanics, optics, acoustics, and chemistry’.
Becoming the Synthi-Fou: Stockhausen and the new keyboardism
Karlheinz Stockhausen embraced the potential of electronic music to generate new timbres and acoustic typologies early in his career. After first experimenting with magnetic tape in works such as Gesang der Jünglinge (1955) and Kontakte (1958–60), he later embraced other synthesis technologies for the production of large-scale spatial electro-acoustic works such as Sirius (1970) and Oktophonie (1990–91). His interest in technological advances in sound design and sound diffusion also managed to penetrate his highly evolved Klavierstücke
Protein Ontology: A controlled structured network of protein entities
The Protein Ontology (PRO; http://proconsortium.org) formally defines protein entities and explicitly represents their major forms and interrelations. Protein entities in PRO that correspond to single amino acid chains are categorized by level of specificity into family, gene, sequence, and modification metaclasses, and there is a separate metaclass for protein complexes. All metaclasses also have organism-specific derivatives. PRO complements established sequence databases such as UniProtKB and interoperates with other biomedical and biological ontologies such as the Gene Ontology (GO). PRO relates to UniProtKB in that PRO’s organism-specific classes of proteins encoded by a specific gene correspond to entities documented in UniProtKB entries. PRO relates to the GO in that PRO’s representations of organism-specific protein complexes are subclasses of the organism-agnostic protein complex terms in the GO Cellular Component Ontology. The past few years have seen growth and changes to PRO, as well as new points of access to the data and new applications of PRO in immunology and proteomics. Here we describe some of these developments.
Unbiased analysis of CLEO data at NLO and pion distribution amplitude
We discuss different QCD approaches to calculate the form factor
F^{\gamma^*\gamma\pi}(Q^2) of the \gamma^*\gamma\to\pi^{0} transition, giving
preference to the light-cone QCD sum rules (LCSR) approach as the most
adequate. In this context, we revise the previous analysis of the CLEO
experimental data on F^{\gamma^*\gamma\pi}(Q^{2}) by Schmedding and Yakovlev.
Special attention is paid to the sensitivity of the results to the (strong
radiative) \alpha_s-corrections and to the value of the twist-four coupling
\delta^2. We present a full analysis of the CLEO data at the NLO level of
LCSRs, paying particular attention to the extraction of the relevant
parameters to determine the pion distribution amplitude, i.e., the Gegenbauer
coefficients a_2 and a_4. Our analysis confirms our previous results and also
the main findings of Schmedding and Yakovlev: both the asymptotic and the
Chernyak--Zhitnitsky pion distribution amplitudes are completely excluded
by the CLEO data. A novelty of our approach is to use the CLEO data as a means
of determining the value of the QCD vacuum non-locality parameter \lambda^2_q =
\langle\bar{q} D^2 q\rangle / \langle\bar{q} q\rangle = 0.4 GeV^2, which specifies
the average virtuality of the vacuum quarks.
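For orientation, a_2 and a_4 are the lowest Gegenbauer coefficients in the standard conformal expansion of the leading-twist pion distribution amplitude; the formula below is textbook background added here, not a quotation from the abstract:

    \varphi_\pi(x, \mu^2) = 6 x (1 - x) \left[ 1 + a_2(\mu^2)\, C_2^{3/2}(2x - 1)
                                                 + a_4(\mu^2)\, C_4^{3/2}(2x - 1) + \dots \right]

The fit described above thus amounts to constraining the admissible region in the (a_2, a_4) plane.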
First-year Results of Broadband Spectroscopy of the Brightest Fermi-GBM Gamma-Ray Bursts
We present here our results of the temporal and spectral analysis of a sample
of 52 bright and hard gamma-ray bursts (GRBs) observed with the Fermi Gamma-ray
Burst Monitor (GBM) during its first year of operation (July 2008-July 2009).
Our sample was selected from a total of 253 GBM GRBs based on each event's peak
count rate measured between 0.2 and 40 MeV. The final sample comprised 34 long
and 18 short GRBs. These numbers show that the GBM sample contains a much
larger fraction of short GRBs than the CGRO/BATSE data set, which we explain
as the result of our (different) selection criteria and the improved GBM
trigger algorithms, which favor the collection of short, bright GRBs relative to BATSE. A
first by-product of our selection methodology is the determination of a
detection threshold from the GBM data alone, above which GRBs most likely will
be detected in the MeV/GeV range with the Large Area Telescope (LAT) onboard
Fermi. This predictor will be very useful for future multiwavelength GRB
follow-ups with ground- and space-based observatories. Further, we have estimated
burst durations up to 10 MeV and, for the first time, extended the duration-energy
relationship of GRB light curves to high energies. We confirm that GRB
durations decline with energy as a power law with index approximately -0.4, as
was found earlier with the BATSE data, and we also note evidence of a possible
cutoff or break at higher energies. Finally, we performed time-integrated
spectral analysis of all 52 bursts and compared their spectral parameters with
those obtained with the larger BATSE data sample. We find that the
two sets of spectral parameters are similar and confirm that short GRBs are in
general harder than long ones.
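Written out as a formula (added here as a reading aid; T denotes the burst duration in a given energy band and the index is approximate):

    T(E) \propto E^{-0.4}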
CLO: The cell line ontology
Abstract
Background
Cell lines have been widely used in biomedical research. The community-based Cell Line Ontology (CLO) is a member of the OBO Foundry library that covers the domain of cell lines. Since its publication two years ago, significant updates have been made, including new groups joining the CLO consortium, new cell line cells, upper-level alignment with the Cell Ontology (CL) and the Ontology for Biomedical Investigations (OBI), and logical extensions.
Construction and content
Collaboration among the CLO, CL, and OBI has established consensus definitions of cell line-specific terms such as ‘cell line’, ‘cell line cell’, ‘cell line culturing’, and ‘mortal’ vs. ‘immortal cell line cell’. A cell line is a genetically stable cultured cell population that contains individual cell line cells. The hierarchical structure of the CLO is built on the hierarchy of the in vivo cell types defined in CL and of the tissue types (from which cell line cells are derived) defined in the UBERON cross-species anatomy ontology. The new hierarchical structure makes the ontology easier to browse and query and supports automated classification. We have recently added classes representing more than 2,000 cell line cells from the RIKEN BRC Cell Bank to the CLO. Overall, the CLO now contains ~38,000 classes of specific cell line cells derived from over 200 in vivo cell types from various organisms.
Utility and discussion
The CLO has been applied to different biomedical research studies. Example case studies include annotation and analysis of EBI ArrayExpress data, bioassays, and host-vaccine/pathogen interaction. CLO’s utility goes beyond a catalogue of cell line types. The alignment of the CLO with related ontologies combined with the use of ontological reasoners will support sophisticated inferencing to advance translational informatics development.
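As a small illustration of browsing the hierarchy described under Construction and content (a sketch only; it assumes the CLO OWL file is reachable at its OBO PURL and that the label below resolves to a single class), one might use owlready2 as follows:

    from owlready2 import get_ontology

    # Load the Cell Line Ontology; with ~38,000 classes this is not instant.
    clo = get_ontology("http://purl.obolibrary.org/obo/clo.owl").load()

    # Look up the 'immortal cell line cell' term mentioned above by its label
    # and list a handful of its direct subclasses.
    immortal = clo.search_one(label="immortal cell line cell")
    if immortal is not None:
        for cls in list(immortal.subclasses())[:10]:
            print(cls.iri, cls.label.first() if cls.label else "")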
Search for Top Squark Pair Production in the Dielectron Channel
This report describes the first search for top squark pair production in the
channel stop_1 stopbar_1 -> b bbar chargino_1 chargino_1 -> ee+jets+MEt using
74.9 +- 8.9 pb^-1 of data collected with the D0 detector. A 95% confidence
level upper limit on sigma*B is presented. The limit is above the theoretical
expectation for sigma*B for this process, but does show the sensitivity of the
current D0 data set to a particular topology for new physics.
Second Generation Leptoquark Search in p\bar{p} Collisions at \sqrt{s} = 1.8 TeV
We report on a search for second generation leptoquarks with the D\O\
detector at the Fermilab Tevatron collider at \sqrt{s} = 1.8 TeV.
This search is based on 12.7 pb^{-1} of data. Second generation leptoquarks
are assumed to be produced in pairs and to decay into a muon and quark with
branching ratio \beta or to a neutrino and quark with branching ratio
(1-\beta). We obtain cross section times branching ratio limits as a function
of leptoquark mass and set a lower limit on the leptoquark mass of 111
GeV/c^2 for \beta = 1 and 89 GeV/c^2 for \beta = 1/2 at the 95%
confidence level.
