The poverty of journal publishing
The article opens with a critical analysis of the dominant business model of for-profit, academic publishing, arguing that the extraordinarily high profits of the big publishers are dependent upon a double appropriation that exploits both academic labour and universities’ financial resources. Against this model, we outline four possible responses: the further development of open access repositories, a fair trade model of publishing regulation, a renaissance of the university presses, and, finally, a move away from private, for-profit publishing companies toward autonomous journal publishing by editorial boards and academic associations.
Where should I publish? A library handout for researchers
5 warning signs of a predatory journal
7 essential questions to ask when evaluating a journal
Checklist to determine whether a journal is reputable
Tools to find journals based on various selection criteria
Identifying the right title isn’t easy. By reviewing and applying the dos and don’ts within these pages, you will increase the likelihood of publishing in the right journal for your work. In addition to using this guide, consider booking a consultation with a librarian to help you identify and apply your selection criteria.
Industrial strategy and the UK regions: Sectorally narrow and spatially blind
The UK government's new Industrial Strategy could have a significant impact on the country's regions and localities. However, this has received little attention to date. The analysis presented here examines the existing location of the sectors targeted by the first phase of the Industrial Strategy Challenge Fund and the location of the R&D laboratories likely to be first in line for funding. In focusing on an extremely narrow range of sectors, the Fund is likely to have limited impact on the UK's persistent regional inequalities. The activities eligible for support account for relatively little of manufacturing or the rest of the economy, and the basis of this targeting and its potential distributional consequences are spatially blind. As such, it runs the risk of widening regional divides in prosperity.
The Accuracy of Confidence Intervals for Field Normalised Indicators
This is an accepted manuscript of an article published by Elsevier in Journal of Informetrics on 07/04/2017, available online: https://doi.org/10.1016/j.joi.2017.03.004
The accepted version of the publication may differ from the final published version.

When comparing the average citation impact of research groups, universities and countries, field normalisation reduces the influence of discipline and time. Confidence intervals for these indicators can help with attempts to infer whether differences between sets of publications are due to chance factors. Although both bootstrapping and formulae have been proposed for these, their accuracy is unknown. In response, this article uses simulated data to systematically compare the accuracy of confidence limits in the simplest possible case, a single field and year. The results suggest that the MNLCS (Mean Normalised Log-transformed Citation Score) confidence interval formula is conservative for large groups but almost always safe, whereas bootstrap MNLCS confidence intervals tend to be accurate but can be unsafe for smaller world or group sample sizes. In contrast, bootstrap MNCS (Mean Normalised Citation Score) confidence intervals can be very unsafe, although their accuracy increases with sample sizes.
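The bootstrap procedure the abstract refers to can be sketched as follows. This is a minimal illustration, not the article's own implementation: it assumes the standard definition of MNLCS as the group mean of ln(1 + citations) divided by the world mean of ln(1 + citations), and uses a simple percentile bootstrap that resamples both the group and the world set.

```python
import math
import random

def mnlcs(group, world):
    """Mean Normalised Log-transformed Citation Score (assumed definition):
    group mean of ln(1 + c) divided by the world mean of ln(1 + c)."""
    g = sum(math.log(1 + c) for c in group) / len(group)
    w = sum(math.log(1 + c) for c in world) / len(world)
    return g / w

def bootstrap_mnlcs_ci(group, world, reps=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for MNLCS.

    Resamples both samples with replacement, recomputes the statistic
    each time, and takes the alpha/2 and 1 - alpha/2 percentiles."""
    rng = random.Random(seed)
    stats = []
    for _ in range(reps):
        g = [rng.choice(group) for _ in group]
        w = [rng.choice(world) for _ in world]
        stats.append(mnlcs(g, w))
    stats.sort()
    return stats[int((alpha / 2) * reps)], stats[int((1 - alpha / 2) * reps)]
```

As the abstract notes, intervals built this way can be unsafe when the group or world sample is small: resampling a handful of highly skewed citation counts underestimates the true sampling variability.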
Oxidative addition of organic halides to zerovalent palladium complexes containing rigid bidentate nitrogen ligands
Microsoft Academic: A multidisciplinary comparison of citation counts with Scopus and Mendeley for 29 journals
Microsoft Academic is a free citation index that allows large-scale data collection. This combination makes it useful for scientometric research. Previous studies have found that its citation counts tend to be slightly larger than those of Scopus but smaller than Google Scholar, with disciplinary variations. This study reports the largest and most systematic analysis so far, of 172,752 articles in 29 large journals chosen from different specialisms. From Scopus citation counts, Microsoft Academic citation counts and Mendeley reader counts for articles published 2007-2017, Microsoft Academic found slightly more citations (6%) than Scopus overall and especially for the current year (51%). It found fewer citations than Mendeley readers overall (59%), and only 7% as many for the current year. Differences between journals were probably due to field preprint-sharing cultures or journal policies rather than broad disciplinary differences.
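The percentage gaps quoted above are simple comparisons of aggregate counts. A minimal sketch, using hypothetical totals rather than the study's actual data, shows the arithmetic behind a figure like "6% more citations than Scopus":

```python
def pct_diff(total, baseline):
    """Percentage by which one source's total exceeds (positive)
    or trails (negative) a baseline source's total."""
    return 100 * (total - baseline) / baseline

# Hypothetical aggregate counts, for illustration only (not the study's data)
scopus_total = 100_000
ma_total = 106_000  # would correspond to a ~6% gap over Scopus

print(round(pct_diff(ma_total, scopus_total)))  # 6
```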
