
    Bounds and Inequalities Relating h-Index, g-Index, e-Index and Generalized Impact Factor

    Finding relationships among different indices such as h-index, g-index, e-index, and generalized impact factor is a challenging task. In this paper, we describe some bounds and inequalities relating h-index, g-index, e-index, and generalized impact factor. We derive the bounds and inequalities relating these indexing parameters from their basic definitions and without assuming any continuous model to be followed by any of them.
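    To make these quantities concrete, the sketch below (not taken from the paper) computes the h-index, g-index, and e-index from a list of per-paper citation counts using their standard definitions, and checks the well-known relation g ≥ h; the specific bounds and inequalities derived in the paper are not reproduced here.

        # Minimal sketch of the standard definitions, using one common convention
        # in which the g-index is capped at the actual number of papers.
        def h_index(citations):
            """Largest h such that h papers have at least h citations each."""
            cites = sorted(citations, reverse=True)
            return sum(1 for i, c in enumerate(cites, start=1) if c >= i)

        def g_index(citations):
            """Largest g such that the top g papers together have at least g**2 citations."""
            cites = sorted(citations, reverse=True)
            total, g = 0, 0
            for i, c in enumerate(cites, start=1):
                total += c
                if total >= i * i:
                    g = i
            return g

        def e_index(citations):
            """e**2 is the citation excess in the h-core: its citations minus h**2."""
            cites = sorted(citations, reverse=True)
            h = h_index(citations)
            return (sum(cites[:h]) - h * h) ** 0.5

        citations = [33, 30, 22, 15, 12, 8, 5, 4, 2, 1, 0]
        print(h_index(citations), g_index(citations), round(e_index(citations), 2))  # 6 11 9.17
        assert g_index(citations) >= h_index(citations)  # g >= h holds by definition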

    Ten Simple Rules for Getting Help from Online Scientific Communities

    The increasing complexity of research requires scientists to work at the intersection of multiple fields and to face problems for which their formal education has not prepared them. For example, biologists with little or no background in programming are now often using complex scripts to handle the results from their experiments; vice versa, programmers wishing to enter the world of bioinformatics must know about biochemistry, genetics, and other fields. In this context, communication tools such as mailing lists, web forums, and online communities acquire increasing importance. These tools permit scientists to quickly contact people skilled in a specialized field. A question posed properly to the right online scientific community can help in solving difficult problems, often faster than screening the literature or writing to publication authors. The growth of active online scientific communities, such as those listed in Table S1, demonstrates how these tools are becoming an important source of support for an increasing number of researchers. Nevertheless, making proper use of these resources is not easy. Adhering to the social norms of World Wide Web communication—loosely termed “netiquette”—is both important and non-trivial. In this article, we take inspiration from our experience with Internet-shared scientific knowledge, and from similar documents such as “Asking the Questions the Smart Way” and “Getting Answers”, to provide guidelines and suggestions on how to use online communities to solve scientific problems.

    Randomised controlled trial of video clips and interactive games to improve vision in children with amblyopia using the I-BiT system

    Background Traditional treatment of amblyopia involves either wearing a patch or atropine penalisation of the better eye. A new treatment based on virtual reality technology is being developed, in which either DVD footage or computer games present a common background to both eyes and the foreground, containing the imagery of interest, only to the amblyopic eye. Methods A randomised controlled trial with three arms was performed on patients with amblyopia aged 4–8 years. All three arms had dichoptic stimulation using shutter glass technology. One arm had DVD footage shown to the amblyopic eye and a common background shown to both; the second used a modified shooter game, Nux, with the sprite and targets presented to the amblyopic eye (and the background to both); while the third arm had both background and foreground presented to both eyes (non-interactive binocular treatment (non-I-BiT) games). Results Seventy-five patients were randomised; 67 were residual amblyopes and 70 had an associated strabismus. The visual acuity improved in all three arms by approximately 0.07 logMAR in the amblyopic eye at 6 weeks. In terms of gain in vision (the stated primary outcome), there was no difference between either the I-BiT DVD or the non-I-BiT games arms and the I-BiT games arm. Conclusions There was a modest vision improvement in all three arms. Treatment was well tolerated and safe. There was no difference between the three treatments in terms of the stated primary outcome, but treatment duration was short, and the high proportion of previously treated amblyopia and strabismic amblyopia disadvantaged dichoptic stimulation treatment.

    Integration of a nationally procured electronic health record system into user work practices

    BACKGROUND: Evidence suggests that many small- and medium-scale Electronic Health Record (EHR) implementations encounter problems, often stemming from users' difficulties in accommodating the new technology into their work practices. There is the possibility that these challenges may be exacerbated in the context of the larger-scale, more standardised implementation strategies now being pursued as part of major national modernisation initiatives. We sought to understand how England's centrally procured and delivered EHR software was integrated within the work practices of users in selected secondary and specialist care settings. METHODS: We conducted a qualitative, longitudinal, case study-based investigation drawing on sociotechnical theory in three purposefully selected sites implementing early functionality of a nationally procured EHR system. The complete dataset comprised semi-structured interview data from a total of 66 different participants, 38.5 hours of non-participant observation of use of the software in context, accompanying researcher field notes, and hospital documents (including project initiation and lessons-learnt reports). Transcribed data were analysed thematically using a combination of deductive and inductive approaches, drawing on NVivo8 software to facilitate coding. RESULTS: The nationally led "top-down" implementation and the associated focus on interoperability limited the opportunity to customise the software to local needs. Lack of system usability led users to employ a range of workarounds, unanticipated by management, to compensate for the perceived shortcomings of the system. These had a number of knock-on effects relating to the nature of collaborative work, patterns of communication, the timeliness and availability of records (including paper), and the ability of hospital management to monitor organisational performance. CONCLUSIONS: This work has highlighted the importance of addressing potentially adverse unintended consequences of the workarounds associated with the introduction of EHRs. This can be achieved with customisation, which is inevitably somewhat restricted in the context of attempts to implement national solutions. The tensions and potential trade-offs between achieving large-scale interoperability and meeting local requirements are likely to be the subject of continued debate in England and beyond, with no easy answers in sight.

    X-ray emission from isolated neutron stars

    X-ray emission is a common feature of all varieties of isolated neutron stars (INS) and, thanks to the advent of sensitive instruments with good spectroscopic, timing, and imaging capabilities, X-ray observations have become an essential tool in the study of these objects. Non-thermal X-rays from young, energetic radio pulsars have been detected since the beginning of X-ray astronomy, and the long-sought thermal emission from cooling neutron stars' surfaces can now be studied in detail in many pulsars spanning different ages, magnetic fields, and, possibly, surface compositions. In addition, other manifestations of INS have been discovered with X-ray observations. These new classes of high-energy sources, comprising the nearby X-ray Dim Isolated Neutron Stars, the Central Compact Objects in supernova remnants, the Anomalous X-ray Pulsars, and the Soft Gamma-ray Repeaters, now add up to several tens of confirmed members, plus many candidates, and allow us to study a variety of phenomena unobservable in "standard" radio pulsars.

    Improving biomass production and saccharification in Brachypodium distachyon through overexpression of a sucrose-phosphate synthase from sugarcane

    The substitution of fossil energy sources by renewable ones is a major strategy for reducing CO2 emissions and mitigating climate change. In the transport sector, which is still mainly dependent on liquid fuels, the production of second-generation ethanol from lignocellulosic feedstock is a promising strategy for substituting fossil fuels. The main prerequisites for dedicated biomass crops are high biomass yield and optimized saccharification for subsequent use in fermentation processes. We sought to address these traits through overexpression of a sucrose-phosphate synthase gene (SoSPS) from sugarcane (Saccharum officinarum) in the model grass Brachypodium distachyon. The resulting transgenic B. distachyon lines not only revealed increased plant height at early growth stages but also higher biomass yield from fully senesced plants, increased by up to 52 % compared to wild-type. Additionally, we determined a higher sucrose content in senesced leaf biomass from the transgenic lines, which correlated with improved biomass saccharification after conventional thermo-chemical pretreatment and enzymatic hydrolysis. Combining increased biomass production and saccharification efficiency in the generated B. distachyon SoSPS overexpression lines, we obtained a maximum increase of 74 % in glucose release per plant compared to wild-type. Therefore, we consider SoSPS overexpression a promising approach in the molecular breeding of energy crops for optimizing biomass yields and their utilization in second-generation biofuel production.

    Community assessment to advance computational prediction of cancer drug combinations in a pharmacogenomic screen

    The effectiveness of most cancer targeted therapies is short-lived. Tumors often develop resistance that might be overcome with drug combinations. However, the number of possible combinations is vast, necessitating data-driven approaches to find optimal patient-specific treatments. Here we report AstraZeneca's large drug combination dataset, consisting of 11,576 experiments from 910 combinations across 85 molecularly characterized cancer cell lines, and the results of a DREAM Challenge to evaluate computational strategies for predicting synergistic drug pairs and biomarkers. A total of 160 teams participated, providing a comprehensive methodological development and benchmarking effort. Winning methods incorporate prior knowledge of drug-target interactions. Synergy is predicted with an accuracy matching biological replicates for >60% of combinations. However, 20% of drug combinations are poorly predicted by all methods. Genomic rationales for synergy predictions are identified, including antagonism of ADAM17 inhibitors when combined with PIK3CB/D inhibition, in contrast to synergy when combined with other PI3K-pathway inhibitors, in PIK3CA-mutant cells.
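    One widely used way to quantify combination effects in such screens is the Bliss independence model, sketched below; this is a minimal illustration and not necessarily the synergy scoring used in the challenge. Under Bliss independence, the expected fractional effect of a combination is E_A + E_B - E_A*E_B, and a positive excess of the observed effect over this expectation indicates synergy, while a negative excess indicates antagonism.

        # Minimal sketch of Bliss-independence excess; effects are fractional
        # inhibitions in [0, 1]. The example numbers are hypothetical.
        def bliss_excess(effect_a, effect_b, effect_combo):
            expected = effect_a + effect_b - effect_a * effect_b
            return effect_combo - expected

        print(round(bliss_excess(0.30, 0.40, 0.75), 2))   # +0.17 -> synergistic
        print(round(bliss_excess(0.30, 0.40, 0.50), 2))   # -0.08 -> mildly antagonistic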

    Cervical lymph node metastasis in adenoid cystic carcinoma of the larynx: a collective international review

    Adenoid cystic carcinoma (AdCC) of the head and neck is a well-recognized pathologic entity that rarely occurs in the larynx. Although the 5-year locoregional control rates are high, distant metastasis tends to appear more than 5 years post-treatment. Because AdCC of the larynx is uncommon, it is difficult to standardize a treatment protocol. One of the controversial points is the decision whether or not to perform an elective neck dissection in these patients. Because there is contradictory information about this issue, we have critically reviewed the literature from 1912 to 2015 on all reported cases of AdCC of the larynx in order to clarify it. During the most recent period of our review (1991-2015), with a more exact diagnosis of the tumor histology, 142 cases of AdCC of the larynx were reported, of which 91 had data pertaining to lymph node status. Eleven of the 91 patients (12.1%) had nodal metastasis and, based on this low proportion, routine elective neck dissection is not recommended.

    The Sea State CCI dataset v1: towards a sea state climate data record based on satellite observations

    Sea state data are of major importance for climate studies, marine engineering, safety at sea and coastal management. However, long-term sea state datasets are sparse and not always consistent, and sea state data users still mostly rely on numerical wave models for research and engineering applications. Facing the urgent need for a sea state climate data record, the Global Climate Observing System has listed “Sea State” as an Essential Climate Variable (ECV), fostering the launch in 2018 of the Sea State Climate Change Initiative (CCI). The CCI is a programme of the European Space Agency, whose objective is to realise the full potential of the global Earth observation archives established by ESA and its member states in order to contribute to the ECV database. This paper presents the implementation of the first release of the Sea State CCI dataset, the implementation and benefits of a high-level denoising method, its validation against in situ measurements and numerical model outputs, and the future developments considered within the Sea State CCI project. The Sea State CCI dataset v1 is freely available on the ESA CCI website (http://cci.esa.int/data, last access: 25 August 2020) at ftp://anon-ftp.ceda.ac.uk/neodc/esacci/sea_state/data/v1.1_release/ (last access: 25 August 2020). Three products are available: a multi-mission along-track L2P product (http://dx.doi.org/10.5285/f91cd3ee7b6243d5b7d41b9beaf397e1, Piollé et al., 2020a), a daily merged multi-mission along-track L3 product (http://dx.doi.org/10.5285/3ef6a5a66e9947d39b356251909dc12b, Piollé et al., 2020b) and a multi-mission monthly gridded L4 product (http://dx.doi.org/10.5285/47140d618dcc40309e1edbca7e773478, Piollé et al., 2020c).
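    For readers who want to work with the products, the sketch below shows one typical way to inspect a Sea State CCI NetCDF file with xarray. The file name and the variable name are placeholders rather than confirmed contents of the v1 release, so the product user guide should be consulted for the actual naming conventions.

        # Hedged sketch: CCI products are distributed as NetCDF and can be explored
        # with xarray. The file and variable names below are placeholders only.
        import xarray as xr

        ds = xr.open_dataset("sea_state_cci_l4_example.nc")  # placeholder file name
        print(ds)                     # list dimensions, coordinates and variables
        swh = ds["swh"]               # placeholder name for significant wave height
        print(float(swh.mean()))      # e.g. a global mean significant wave height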

    A Pilot Study on Automatic Three-Dimensional Quantification of Barrett’s Esophagus for Risk Stratification and Therapy Monitoring

    Background & Aims Barrett’s epithelium measurement using the widely accepted Prague C&M classification is highly operator-dependent. We propose a novel methodology for measuring this risk score automatically. The method also enables quantification of the area of Barrett’s epithelium (BEA) and islands, which was not possible before. Furthermore, it allows 3-dimensional (3D) reconstruction of the esophageal surface, enabling interactive 3D visualization. We aimed to assess the accuracy of the proposed artificial intelligence system on both phantom and endoscopic patient data. Methods Using advanced deep learning, a depth estimator network is used to predict the endoscope camera's distance from the gastric folds. By segmenting the BEA and the gastroesophageal junction and projecting them to the estimated distances in mm, we measure the C&M scores, including the BEA. The derived endoscopy artificial intelligence system was tested on a purpose-built 3D-printed esophagus phantom with varying BEAs and on 194 high-definition videos from 131 patients with C&M values scored by expert endoscopists. Results The endoscopic phantom video data demonstrated 97.2% accuracy with a marginal ±0.9 mm average deviation for C&M and island measurements, while for the BEA we achieved 98.4% accuracy with only ±0.4 cm² average deviation compared with the ground truth. On patient data, the C&M measurements provided by our system concurred with expert scores, with a marginal overall relative error (mean difference) of 8% (3.6 mm) and 7% (2.8 mm) for the C and M scores, respectively. Conclusions The proposed methodology automatically extracts Prague C&M scores with high accuracy. Quantification and 3D reconstruction of the entire Barrett’s area provide new opportunities for risk stratification and assessment of therapy response.
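    As a rough illustration of the underlying geometric idea (projecting segmented pixels into physical units using an estimated depth map and camera intrinsics), the sketch below approximates the area of a segmented region in cm². It is a simplified pinhole-camera, fronto-parallel approximation, not the authors' pipeline, and all names and values are hypothetical.

        # Simplified sketch of the geometric idea only, not the authors' method:
        # with a pinhole camera of focal length f (in pixels), a pixel at depth
        # Z (in mm) covers roughly (Z / f)**2 mm^2 of a fronto-parallel surface.
        import numpy as np

        def segmented_area_cm2(mask, depth_mm, focal_px):
            """mask: boolean segmentation; depth_mm: per-pixel depth estimate in mm."""
            per_pixel_mm2 = (depth_mm / focal_px) ** 2
            return float(np.sum(per_pixel_mm2[mask])) / 100.0  # 100 mm^2 = 1 cm^2

        # Hypothetical frame: uniform 40 mm depth, 500 px focal length, 20,000 masked pixels.
        mask = np.zeros((512, 512), dtype=bool)
        mask[100:200, 150:350] = True
        depth = np.full((512, 512), 40.0)
        print(segmented_area_cm2(mask, depth, 500.0))  # ~1.28 cm^2 under these assumptions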