Shallow landsliding and catchment connectivity within the Houpoto Forest, New Zealand.
Active landslides and their contribution to catchment connectivity have been investigated within the Houpoto Forest, North Island, New Zealand. The aim was to quantify the proportion of buffered versus coupled landslides and to explore how specific physical conditions influenced differences in landslide connectivity. Landsliding and land use changes between 2007 and 2010 were identified and mapped from aerial photography, and preliminary analyses and interpretations of these data are presented here. The data indicate that forest harvesting made some slopes more susceptible to failure, and consequently many landslides were triggered during subsequent heavy rainfall events. Failures were particularly widespread during two high-magnitude (> 200 mm/day) rainfall events, as recorded in 2010 imagery. Connectivity was analysed by quantifying the relative areal extents of coupled and buffered landslides identified in the different images. Approximately 10% of the landslides were identified as being coupled to the local stream network, and thus directly contributing to the sediment budget. Following the liberation of sediment by landslides during high-magnitude events, low-magnitude events are thought to be capable of transferring more of this sediment to the channel. Subsequent re-planting of the slopes appears to have aided recovery by increasing the thresholds for failure, thus reducing the number of landslides during later high-magnitude rainfall events; associated with this is a reduction in slope-channel connectivity. These preliminary results highlight how site-specific preconditioning, preparatory and triggering factors contribute to landslide distribution and connectivity, and how efficient re-afforestation improves the rate of slope recovery.
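To make the connectivity measure concrete, here is a minimal sketch (hypothetical inventory data and field names, not the study's) of how the coupled share of total landslide area could be computed from a mapped inventory:

```python
# Sketch of the connectivity measure described above: the share of total
# landslide area coupled to the stream network. All values are hypothetical.
landslides = [
    {"id": 1, "area_m2": 1200.0, "coupled": True},   # deposit reaches a channel
    {"id": 2, "area_m2": 800.0,  "coupled": False},  # buffered on the hillslope
    {"id": 3, "area_m2": 450.0,  "coupled": False},
]

coupled_area = sum(s["area_m2"] for s in landslides if s["coupled"])
total_area = sum(s["area_m2"] for s in landslides)
print(f"coupled fraction: {100 * coupled_area / total_area:.1f}%")  # 49.0%
```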
Data expansion with Huffman codes
The following topics were dealt with: Shannon theory; universal lossless source coding; CDMA; turbo codes; broadband networks and protocols; signal processing and coding; coded modulation; information theory and applications; universal lossy source coding; algebraic geometry codes; modelling, analysis and stability in networks; trellis structures and trellis decoding; channel capacity; recording channels; fading channels; convolutional codes; neural networks and learning; estimation; Gaussian channels; rate distortion theory; constrained channels; 2D channel coding; nonparametric estimation and classification; data compression; synchronisation and interference in communication systems; cyclic codes; signal detection; group codes; multiuser systems; entropy and noiseless source coding; dispersive channels and equalisation; block codes; cryptography; image processing; quantisation; random processes; wavelets; sequences for synchronisation; iterative decoding; optical communications.
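The title above points to a well-known pitfall: a Huffman code is only optimal for the symbol statistics it was designed for, and applying it to data with different statistics can expand rather than compress. A minimal sketch (not taken from the paper) illustrating the effect:

```python
# Minimal sketch (not from the paper): a Huffman code built for one symbol
# distribution can *expand* data drawn from a different distribution.
import heapq

def huffman_code(freqs):
    """Build a Huffman code (symbol -> bitstring) from a frequency dict."""
    # Heap entries carry a unique counter so dicts are never compared.
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, (f1 + f2, count, merged))
        count += 1
    return heap[0][2]

# Code designed for a source that emits 'a' 70% of the time ...
code = huffman_code({"a": 0.7, "b": 0.1, "c": 0.1, "d": 0.1})
# ... applied to data with roughly the opposite statistics.
data = "d" * 70 + "b" * 10 + "c" * 10 + "a" * 10
huffman_bits = sum(len(code[s]) for s in data)
fixed_bits = 2 * len(data)  # 2 bits/symbol suffice for a 4-symbol alphabet
print(huffman_bits, fixed_bits)  # 210 200 -> the Huffman output is *longer*
```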
Quantifying Stellar Mass Loss with High Angular Resolution Imaging
Mass is constantly being recycled in the universe. One of the most powerful recycling paths is via stellar mass loss. All stars exhibit mass loss, with rates ranging from ~10^-14 to 10^-4 M_sun yr^-1 depending on spectral type, luminosity class, rotation rate, companion proximity, and evolutionary stage. The first generation of stars consisted mostly of hydrogen and helium. These shed material - via massive winds, planetary nebulae and supernova explosions - seeding the interstellar medium with heavier elements. Subsequent generations of stars incorporated this material, changing how stars burn and providing material for planet formation. An understanding of mass loss is critical for modeling individual stars as well as for answering larger astrophysical questions: it is essential for following the evolution of single stars, binaries, star clusters, and galaxies. Mass loss is one of the weakest areas in the modeling of fundamental stellar processes, in large part owing to a lack of confrontation with detailed observations of stellar photospheres and of the mass-loss process itself. High-resolution optical imagery with telescope arrays is beginning to provide these data; combined with spectroscopy and broad infrared and sub-mm coverage, it is supporting more sophisticated models on fast computers and promising a new era in mass-loss studies.
Comment: Science white paper prepared for Astro2010
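As a back-of-envelope illustration of the quoted range of rates (the rate and duration below are illustrative values only, not from the paper):

```python
# Back-of-envelope sketch: total mass shed at a constant mass-loss rate.
def mass_lost(rate_msun_per_yr: float, duration_yr: float) -> float:
    """Total mass lost, in solar masses, for a constant mass-loss rate."""
    return rate_msun_per_yr * duration_yr

# A giant losing 1e-7 M_sun/yr for one million years sheds ~0.1 M_sun.
print(mass_lost(1e-7, 1e6))  # ~0.1
```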
Brain Neprilysin Activity and Susceptibility to Transgene-Induced Alzheimer Amyloids
Neprilysin (NEP) is a zinc metalloproteinase that degrades enkephalins, endothelins, and the Alzheimer's disease amyloid β (Aβ) peptides. NEP-deficient mice possess increased levels of brain Aβ1-40 and Aβ1-42. The objective of this study was to determine whether tissue NEP specific activity differs according to age and/or across mouse strains, especially those strains predisposed toward formation of Aβ-amyloid plaques following overexpression of the human Alzheimer amyloid precursor protein (APP). The C57BL/6J mouse strain appears to be relatively susceptible to cerebral amyloidosis, whereas the Swiss Webster (SW) strain appears more resistant. We investigated whether NEP specific activity in brain and kidney homogenates from SW and C57 mice aged 6, 40, and 80 weeks varied according to mouse strain, age, and gender. Among the variables tested, NEP specific activity varied most dramatically across mouse strain, with the kidney and brain of SW mice displaying the highest activities. Aging was associated with a reduction in brain NEP specific activity in both strains. Gender-specific differences were identified in kidney but not in brain. We conclude that aging- and strain-dependent differences in NEP specific activity may play a role in the differential susceptibility of some mouse strains to developing cerebral amyloidosis following human APP overexpression.
Mandated data archiving greatly improves access to research data
The data underlying scientific papers should be accessible to researchers
both now and in the future, but how best can we ensure that these data are
available? Here we examine the effectiveness of four approaches to data
archiving: no stated archiving policy, recommending (but not requiring)
archiving, and two versions of mandating data deposition at acceptance. We
control for differences between data types by trying to obtain data from papers
that use a single, widespread population genetic analysis, STRUCTURE. At one
extreme, we found that mandated data archiving policies that require the
inclusion of a data availability statement in the manuscript improve the odds
of finding the data online almost a thousand-fold compared to having no policy.
However, archiving rates at journals with less stringent policies were only
very slightly higher than those with no policy at all. We also assessed the effectiveness of asking
for data directly from authors and obtained over half of the requested
datasets, albeit with a delay of about 8 days and some disagreement with authors. Given the long-term benefits of data accessibility to the academic community, we believe that journal-based mandatory data archiving policies and mandatory data availability statements should be more widely adopted.
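The "almost a thousand-fold" figure is an odds ratio; a minimal sketch of that calculation (with invented counts for illustration, not the study's data):

```python
# Odds-ratio sketch for "odds of finding the data online" under two policies.
def odds_ratio(found_a: int, missing_a: int, found_b: int, missing_b: int) -> float:
    """Odds of finding data under policy A relative to policy B."""
    return (found_a / missing_a) / (found_b / missing_b)

# Hypothetical counts: mandate + availability statement vs. no stated policy.
print(odds_ratio(found_a=95, missing_a=5, found_b=2, missing_b=98))  # ~931
```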
Cross-sectional evaluation of a longitudinal consultation skills course at a new UK medical school
Background: Good communication is a crucial element of good clinical care, and it is important to provide appropriate consultation skills teaching in undergraduate medical training to ensure that doctors have the necessary skills to communicate effectively with patients and other key stakeholders. This article aims to provide research evidence of the acceptability of a longitudinal consultation skills strand in an undergraduate medical course, as assessed by a cross-sectional evaluation of students' perceptions of their teaching and learning experiences. Methods: A structured questionnaire was used to collect student views. The questionnaire comprised two parts: 16 closed questions to evaluate the content and process of teaching, and 5 open-ended questions. Questionnaires were completed at the end of each consultation skills session across all year groups during the 2006-7 academic year (5 sessions in Year 1, 3 in Year 2, 3 in Year 3, 10 in Year 4 and 10 in Year 5); 2519 questionnaires were returned in total. Results: Students rated Tutor Facilitation most favourably, followed by Teaching, then Practice & Feedback, with suitability of the Rooms being most poorly rated. All years listed the following as important aspects they had learnt during the session:
• how to structure the consultation
• the importance of patient-centredness
• aspects of professionalism (including recognising one's own limits, being prepared, and generally acting professionally).
All years also noted that the sessions had increased their confidence, particularly through practice. Conclusions: Our results suggest that a longitudinal and integrated approach to teaching consultation skills using a well-structured model such as Calgary-Cambridge facilitates and consolidates learning of desired process skills, increases student confidence, encourages integration of process and content, and reinforces appreciation of patient-centredness and professionalism.
Effects of real versus phantom stock option plans on shareholder wealth
Helicobacter pylori causes chronic gastritis and avoids elimination by the immune system of the infected host. The commensal bacterium Lactobacillus acidophilus has been suggested to exert beneficial effects as a supplement during H. pylori eradication therapy. In the present study, we applied whole-genome microarray analysis to compare the immune responses induced in murine bone marrow-derived macrophages (BMDMs) stimulated with L. acidophilus, H. pylori, or both bacteria in combination. While L. acidophilus induced a Th1-polarizing response characterized by high expression of interferon beta (IFN-β) and interleukin 12 (IL-12), H. pylori strongly induced the innate cytokines IL-1β and IL-1α. In BMDMs prestimulated with L. acidophilus, H. pylori blocked the expression of L. acidophilus-induced IFN-β and IL-12 and suppressed the expression of key regulators of the Rho, Rac, and Cdc42 GTPases. The inhibition of L. acidophilus-induced IFN-β was independent of H. pylori viability and the virulence factor CagPAI; however, a vacuolating cytotoxin (vacA) mutant was unable to block IFN-β. Confocal microscopy demonstrated that the addition of H. pylori to L. acidophilus-stimulated BMDMs redirects intracellular processing, leading to an accumulation of L. acidophilus in the endosomal and lysosomal compartments. Thus, our findings indicate that H. pylori inhibits the development of a strong Th1-polarizing response in BMDMs stimulated with L. acidophilus by blocking the production of IFN-β in a VacA-dependent manner. We suggest that this abrogation is caused by a redirection of the endocytotic pathway in the processing of L. acidophilus.
IMPORTANCE: Approximately half of the world's population is infected with Helicobacter pylori. The factors that allow this pathogen to persist in the stomach and cause chronic infections have not yet been fully elucidated. In particular, how H. pylori avoids killing by macrophages, one of the main types of immune cell underlying the epithelium, remains elusive. Here we have shown that the H. pylori virulence factor VacA plays a key role by blocking the activation of innate cytokines induced by the probiotic Lactobacillus acidophilus in macrophages and suppresses the expression of key regulators required for the organization and dynamics of the intracellular cytoskeleton. Our results identify potential targets for the treatment of H. pylori infection and vaccination, since specific inhibition of the toxin VacA could allow the activation of an efficient immune response and thereby eradication of H. pylori in the host.
A Mixed-Method Evaluation of a College Student Fitness Program Using the RE-AIM Framework
Background: The consistently rising obesity rate in the college student population illustrates the need for organized and effective interventions. The purposes of this study were to evaluate an eight-week fitness program implemented at a university student recreation center using mixed methods along the reach, effectiveness, and implementation dimensions of the RE-AIM framework for evaluating health-promotion programs, and to illustrate how qualitative data can enhance the capabilities of the RE-AIM framework to evaluate such programs by providing recommendations for improving the intervention that would not be possible with a quantitative RE-AIM evaluation alone. Methods: Quantitative measures (participation rate, changes in % body fat, and resting heart rate) and qualitative methods (focus groups, interviews, and surveys) were used in the study. Participants in the evaluation were program users. Results: The program reach (1.5/100) and effectiveness (8.5/100) were low, with moderate implementation on the individual level (45.5/100) and high implementation on the organizational level (79/100). Major qualitative themes illustrated that the program's strong points were in facilitating physique improvements (n = 11), increasing knowledge (n = 10) and motivation (n = 7), while program shortcomings were primarily due to the quality of personal training (n = 52) and the program dietician services (n = 14). Implications: Such programs often suffer from diminished effectiveness when delivered in the real world, as evident in the present study. The results of this evaluation can help in the development of effective health promotion programs for the college student population. Suggestions for practice via the RE-AIM framework in conjunction with qualitative analyses are included.
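For concreteness, a minimal sketch (assumed formula and hypothetical numbers, not the study's scoring rubric) of how a reach score such as 1.5/100 can arise:

```python
# Sketch of a RE-AIM-style reach score on a 0-100 scale: the percentage of
# the eligible target population that participated. Numbers are hypothetical.
def reach_score(participants: int, eligible_population: int) -> float:
    """Participants as a share of the eligible population, per 100."""
    return 100 * participants / eligible_population

print(reach_score(participants=300, eligible_population=20000))  # 1.5
```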
