An investigation of whether factors associated with short-term attrition change or persist over ten years: data from the Medical Research Council Cognitive Function and Ageing Study (MRC CFAS).
BACKGROUND: Factors associated with the loss of participants in long-term longitudinal studies of ageing, due to refusal or moves, have been discussed less than those with short-term follow-up. METHODS: In a population-based study of cognition and ageing (the Medical Research Council Cognitive Function and Ageing Study (MRC CFAS)), factors associated with dropout due to refusal and moving in the first follow-up period (over two years) are compared with factors associated with dropout over ten years. Participants at 10-year follow-up are compared with their age-standardised baseline contemporaries. RESULTS: Some consistent trends are found over the longer term. Refusers tended to have poorer cognition, fewer years of education and no family history of dementia, and to be women. Characteristics of people who moved differed between waves, but the oldest and those in worse health moved more. When surviving and responding individuals at ten years are compared with those of the same age at baseline, many differences are found. Individuals of lower social class, education and cognitive ability, those in residential care, and those with sight/hearing problems or poor/fair self-reported health are less likely to be seen after 10 years of follow-up. Individuals report more health problems when they participate in multiple interviews. CONCLUSION: The characteristics of refusers in the longer term are similar to those refusing to participate over the shorter term. Long-term follow-up studies will under-represent the disadvantaged and disabled but better represent the full health status of participating individuals. There are advantages and disadvantages to both short-term and long-term follow-up.
The microbiology of impetigo in Indigenous children: associations between Streptococcus pyogenes, Staphylococcus aureus, scabies, and nasal carriage
Background: Impetigo is caused by both Streptococcus pyogenes and Staphylococcus aureus; the relative contributions of each have been reported to fluctuate with time and region. While S. aureus is reportedly on the increase in most industrialised settings, S. pyogenes is still thought to drive impetigo in endemic, tropical regions. However, few studies have utilised high-quality microbiological culture methods to confirm this assumption. We report the prevalence and antimicrobial resistance of impetigo pathogens recovered in a randomised, controlled trial of impetigo treatment conducted in remote Indigenous communities of northern Australia. Methods: One or two sores and the anterior nares were swabbed for each child. All swabs were transported in skim milk tryptone glucose glycogen broth and frozen at –70°C until plated on horse blood agar. S. aureus and S. pyogenes were confirmed with latex agglutination. Results: From 508 children, we collected 872 swabs of sores and 504 swabs from the anterior nares prior to commencement of antibiotic therapy. S. pyogenes and S. aureus were identified together in 503/872 (58%) of sores, with an additional 207/872 (24%) sores having S. pyogenes and 81/872 (9%) S. aureus in isolation. Skin sore swabs taken during episodes with a concurrent diagnosis of scabies were more likely to culture S. pyogenes (OR 2.2, 95% CI 1.1–4.4, p = 0.03). Eighteen percent of children had nasal carriage of skin pathogens. There was no association between the presence of S. aureus in the nose and on the skin. Methicillin resistance was detected in 15% of children who cultured S. aureus from either a sore or their nose. There was no association between the severity of impetigo and the detection of a skin pathogen. Conclusions: S. pyogenes remains the principal pathogen in tropical impetigo; the relatively high contribution of S. aureus as a co-pathogen has also been confirmed. Children with scabies were more likely to have S. pyogenes detected. While clearance of S. pyogenes is the key determinant of treatment efficacy, co-infection with S. aureus warrants consideration of treatment options that are effective against both pathogens where impetigo is severe and prevalent.
The reliability of assigning individuals to cognitive states using the Mini Mental-State Examination: a population-based prospective cohort study.
BACKGROUND: Previous investigations of test-retest reliability of the Mini-Mental State Examination (MMSE) have used correlations and statistics such as Cronbach's α to assess consistency. In practice, the MMSE is usually used to group individuals into cognitive states. The reliability of this grouping (a state-based approach) has not been fully explored. METHODS: MMSE data were collected on a subset of 2,275 older participants (≥ 65 years) from the population-based Medical Research Council Cognitive Function and Ageing Study. Two measurements taken approximately two months apart were used to investigate three state-based categorisations. Descriptive statistics were used to determine how many people remained in the same cognitive group or moved up or down groups. Weighted logistic regression was used to identify characteristics predictive of those who moved group. RESULTS: The proportion of people who remained in the same MMSE group at screen and follow-up assessment ranged from 58% to 78%. The proportion of individuals who went up one or more groups was roughly equal to the proportion that went down one or more groups; most of the change occurred when measurements were close to the cut-points. There was no consistently significant predictor of changing cognitive group. CONCLUSION: A state-based approach to analysing the reliability of the MMSE provided similar results to correlation analyses. State-based models of cognitive change or individual trajectory models using raw scores need multiple waves to help overcome natural variation in MMSE scores and to help identify true cognitive change.
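To make the state-based approach concrete, here is a minimal Python sketch (not code or data from the study) that groups a pair of MMSE measurements into ordered cognitive states and tabulates who stayed in the same group or moved up or down; the cut-points and scores are purely hypothetical illustrations.

    # Hypothetical sketch of a state-based test-retest comparison of MMSE scores.
    # Cut-points and scores are illustrative only, not those used in the study.
    from collections import Counter

    def mmse_state_index(score):
        """Map a 0-30 MMSE score to an ordered state index (hypothetical cut-points)."""
        if score >= 26:
            return 2  # no impairment
        if score >= 22:
            return 1  # mild impairment
        return 0      # moderate/severe impairment

    # Paired measurements roughly two months apart (illustrative values).
    screen    = [29, 27, 24, 23, 21, 30, 25, 22]
    follow_up = [28, 25, 26, 21, 22, 30, 23, 24]

    movement = Counter()
    for s1, s2 in zip(screen, follow_up):
        g1, g2 = mmse_state_index(s1), mmse_state_index(s2)
        if g1 == g2:
            movement["same group"] += 1
        elif g2 > g1:
            movement["moved up"] += 1
        else:
            movement["moved down"] += 1

    print(dict(movement))  # {'same group': 4, 'moved down': 2, 'moved up': 2}

Note that, as in the study's finding, the movements in this toy example occur for scores lying close to a cut-point, which is why multiple waves are needed to separate natural variation from true cognitive change.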
Iron status is inversely associated with dietary iron intakes in patients with inactive or mildly active inflammatory bowel disease.
BACKGROUND: Patients with inflammatory bowel disease (IBD) frequently appear iron deplete, but whether this reflects dietary iron intakes is not known. METHODS: Dietary data were collected from 29 patients with inactive or mildly active IBD and 28 healthy controls using a validated food frequency questionnaire that measured intakes of iron and its absorption modifiers. Non-haem iron availability was estimated using a recently developed algorithm. Subjects were classified for iron status based upon data from a concomitant and separately published study of iron absorption. Absorption was used to define iron status because haematological parameters are flawed for assessing iron status in inflammatory conditions such as IBD. RESULTS: Dietary intakes of total iron, non-haem iron and vitamin C were significantly greater in IBD patients who were iron replete compared to those who were iron deplete (by 48%, 48% and 94% respectively; p ≤ 0.05). The predicted percentage of available non-haem iron did not differ between these groups (19.7 ± 2.0% vs 19.3 ± 2.0% respectively; p = 0.25). However, because of the difference in iron intake, the overall amount of absorbed iron did differ (2.4 ± 0.8 mg/d vs 1.7 ± 0.5 mg/d; p = 0.013). No such differences were observed in the healthy control subjects. CONCLUSIONS: In IBD, iron status is more closely related to the quality and quantity of dietary iron intake than in the general healthy population.
Dietary fortificant iron intake is negatively associated with quality of life in patients with mildly active inflammatory bowel disease.
BACKGROUND: Iron deficiency anaemia and oral iron supplementation have been associated negatively with quality of life, and with adverse effects, respectively, in subjects with inflammatory bowel disease (IBD). Hence, the risk-benefit ratio of oral iron is not understood in this patient group. The present case-control study investigated whether dietary iron intake affects quality of life in IBD patients. METHODS: Quality of life, habitual dietary iron intakes and iron requirements were assessed in 29 patients with inactive or mildly active IBD as well as in 28 healthy control subjects. RESULTS: As expected, quality of life was worse in IBD patients as a whole in comparison to healthy controls according to EuroQol score and EuroQol VAS percentage (6.9 ± 1.6 vs 5.3 ± 0.6; p < 0.0001 and 77 ± 14% vs 88 ± 12%; p = 0.004 respectively). For IBD subjects, 21/29 were iron deplete based upon serum iron responses to oral iron but, overall, were non-anaemic with mean haemoglobin of 13.3 ± 1.5 g/dL, and there was no difference in their quality of life compared to the 8/29 iron-replete subjects (Hb 14.0 ± 0.8 g/dL). Interestingly, total dietary iron intake was significantly negatively associated with quality of life in IBD patients, specifically for non-haem iron and, more specifically, for fortificant iron. Moreover, for total non-haem iron the negative association disappeared when fortificant iron values were subtracted. Finally, further sub-analysis indicated that the negative association between (fortificant) dietary iron intake and quality of life in IBD patients is driven by findings in patients with mildly active disease rather than in patients with quiescent disease. CONCLUSIONS: Iron deficiency per se (i.e. without concomitant anaemia) does not appear to further affect quality of life in IBD patients with inactive or mildly active disease. However, in this preliminary study, dietary iron intake, particularly fortificant iron, appears to be significantly negatively associated with quality of life in patients with mildly active disease.
The color of combat: how subtleties of resource value alter the effects of fighting ability in hermit crab contests
During contests over resources, the intensity of fighting and the outcome are often influenced by an interaction between the difference between opponents in their fighting abilities (resource holding potential [RHP]) and the difference in how they subjectively value the resource. Additionally, because fighting can involve vigorous activity, engaging in a fight can make the opponents more noticeable to predators. Thus, predation risk, alongside resource value (RV), may interact with the effects of RHP difference. During hermit crab contests over the ownership of empty gastropod shells, attackers perform vigorous shell rapping, whereas defenders adopt a protective posture inside their shells. This role asymmetry provides an opportunity to decompose the effects of motivation and predation risk on fighting, as both will be influenced by the visual contrast of the shells against the substrate. If attackers are sensitive to RV, their agonistic behavior should vary with the color of the defender's shell, and if they are sensitive to predation risk, it should be influenced by the color of their own shell, due to the vigorous way it is moved during shell rapping. We found that the most vigorous fights, with the greatest chance of victory for attackers, occurred when both shells were of high contrast, indicating that RHP advantage is modulated by both RV and potential predation risk. Moreover, this shows how even relatively subtle features of a contested resource, such as color, can modify the effects of fighting ability during animal contests.
Reduced middle ear infection with non-typeable Haemophilus influenzae, but not Streptococcus pneumoniae, after transition to 10-valent pneumococcal non-typeable H. influenzae protein D conjugate vaccine
Background: In October 2009, 7-valent pneumococcal conjugate vaccine (PCV7; Prevenar™, Pfizer) was replaced in the Northern Territory childhood vaccination schedule by 10-valent pneumococcal Haemophilus influenzae protein D conjugate vaccine (PHiD-CV10; Synflorix™, GlaxoSmithKline Vaccines). This analysis aims to determine whether the reduced prevalence of suppurative otitis media measured in the PHiD-CV10 era was associated with changes in nasopharyngeal (NP) carriage and middle ear discharge (ED) microbiology in vaccinated Indigenous children. Methods: Swabs of the NP and ED were collected in remote Indigenous communities between September 2008 and December 2012. Swabs were cultured using standardised methods for otitis media pathogens. Children less than 3 years of age who had received a primary course of 2 or more doses of one PCV formulation and not more than one dose of another PCV formulation were included in the primary analysis; children with non-mixed single-formulation PCV schedules were also compared. Results: NP swabs were obtained from 421 of 444 (95%) children in the PCV7 group and 443 of 451 (98%) children in the PHiD-CV10 group. Non-mixed PCV schedules were received by 333 (79%) and 315 (71%) children, respectively. Pneumococcal (Spn) NP carriage was 76% and 82%, and non-typeable Haemophilus influenzae (NTHi) carriage was 68% and 73%, respectively. ED was obtained from 60 children (85 perforations) in the PCV7 group and from 47 children (59 perforations) in the PHiD-CV10 group. Data from bilateral perforations were combined. Spn was cultured from 25% and 18%, respectively, and NTHi was cultured from 61% and 34%, respectively (p = 0.008). Conclusions: The observed reduction in the prevalence of suppurative OM in this population was not associated with reduced NP carriage of OM pathogens. The prevalence of NTHi-infected ED was lower in PHiD-CV10-vaccinated children compared to PCV7-vaccinated children. Changes in clinical severity may be explained by the action of PHiD-CV10 on NTHi infection in the middle ear. Randomised controlled trials are needed to answer this question.
Understanding between-cluster variation in prevalence, and limits for how much variation is plausible
In clinical trials and observational studies of clustered binary data, understanding between-cluster variation is essential: in sample size and power calculations of cluster randomised trials, for example, the intra-cluster correlation coefficient is often specified. However, quantifications of between-cluster variation can be unintuitive, and an intra-cluster correlation coefficient as low as 0.04 may correspond to surprisingly large between-cluster differences. We suggest that understanding is improved through visualising the implied distribution of true cluster prevalences – possibly by assuming they follow a beta distribution – or by calculating their standard deviation, which is more readily interpretable than the intra-cluster correlation coefficient. Even so, the bounded nature of binary data complicates the interpretation of variances as primary measures of uncertainty, and entropy offers an attractive alternative. Appealing to maximum entropy theory, we propose the following rule of thumb: that plausible intra-cluster correlation coefficients and standard deviations of true cluster prevalences are both bounded above by the overall prevalence, its complement, and one third. We also provide corresponding bounds for the coefficient of variation, and for a different standard deviation and intra-cluster correlation defined on the log odds scale. Using previously published data, we observe the quantities defined on the log odds scale to be more transportable between studies with different outcomes with different prevalences than the intra-cluster correlation and coefficient of variation. The latter increase and decrease, respectively, as prevalence increases from 0% to 50%, and the same is true for our bounds. Our work will help clinical trialists better understand between-cluster variation and avoid specifying implausibly high values for the intra-cluster correlation in sample size and power calculations.
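As a numerical companion to this rule of thumb, the following Python sketch (an illustration, not code from the paper) assumes the usual model in which true cluster prevalences have mean equal to the overall prevalence p and variance rho * p * (1 - p), reports their standard deviation, matches a beta distribution to those two moments, and checks the bound quoted above (the prevalence, its complement, and one third).

    # Minimal sketch (not from the paper): implied distribution of true cluster
    # prevalences for clustered binary data, under a beta-distribution assumption.
    import math

    def implied_cluster_prevalence_summary(p, icc):
        """Summarise between-cluster variation for overall prevalence p and ICC rho.

        Assumes true cluster prevalences have mean p and variance rho * p * (1 - p),
        and approximates their distribution with a beta distribution matched to
        those two moments (a common, but not the only possible, choice).
        """
        var = icc * p * (1.0 - p)          # implied variance of true cluster prevalences
        sd = math.sqrt(var)                # more interpretable than the ICC itself
        # Beta distribution with mean p and the implied variance (method of moments).
        nu = 1.0 / icc - 1.0
        alpha, beta = p * nu, (1.0 - p) * nu
        # Rule of thumb from the abstract: plausible ICCs and SDs are bounded above
        # by the prevalence, its complement, and one third.
        bound = min(p, 1.0 - p, 1.0 / 3.0)
        return {
            "sd_of_cluster_prevalences": sd,
            "beta_params": (alpha, beta),
            "plausibility_bound": bound,
            "icc_within_bound": icc <= bound,
            "sd_within_bound": sd <= bound,
        }

    # Example: an ICC of 0.04 with 20% overall prevalence implies an SD of 0.08,
    # i.e. true cluster prevalences commonly ranging from roughly 12% to 28%.
    print(implied_cluster_prevalence_summary(p=0.20, icc=0.04))

This mirrors the abstract's point that an intra-cluster correlation as low as 0.04 can still imply appreciable spread in true cluster prevalences.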
Spatial patterns of scour and fill in dryland sand bed streams
Spatial patterns of scour and fill in two dryland ephemeral stream channels with sandy bed material have been measured with dense arrays of scour chains. Although the depth and areal extent of bed activity increased with discharge, active bed reworking at particular locations within the reaches resulted in downstream patterns of alternate shallower and deeper areas of scour. The variation was such that mean scour depths for individual cross sections varied about the mean for the reach by a factor of 2–4, while the locus of maximum scour traced a sinuous path about the channel centerline. The wavelength of the pattern of scour was about seven times the channel width. During each event, compensating fill returned the streambeds to preflow elevations, indicating that the streams were in approximate steady state over the period of study. Although the patterns of periodically enhanced scour along alternate sides of the channels are consistent with models of periodically reversing helical flow, further work is required to identify the causal relationships between patterns of flow and sediment transport in dryland sand bed channels.
Subject Benchmark Statement: Forensic Science 2012
QAA subject benchmark statement for Forensic Science.
