Examining Facebook practice : the case of New Zealand provincial rugby : a thesis presented in partial fulfilment of the requirements for the degree of Masters in Sport and Exercise at Massey University, Palmerston North, New Zealand
Social media have become a defining feature of 21st-century communications. Conceived in 2004, Facebook has risen from relative obscurity to become the most visited website in the world. While social media use has grown exponentially, so too has its influence. Sport organisations were quick to capitalise on Facebook's popularity, particularly after the introduction of brand pages in 2010. The trend is no different in New Zealand Rugby's (NZR) National Provincial Championship (NPC). However, recent research indicates a lack of understanding and consistency in evaluating effectiveness within the context of Facebook. Scholars have further acknowledged a need to move beyond simple metrics as measures of performance.
Using a mixed-methods approach, this case study of four NPC rugby teams investigated the understanding of effective Facebook practice. Thematic analysis of qualitative questionnaires completed by each page's main administrator explored their understanding of what constitutes effective practice. The researcher also used an auto-ethnographic journal to document his own experience of managing one of the participating brand pages. Page performance was investigated through analysis of Facebook Insights data to establish how such data may be more accurately interpreted to inform best practice.
Results reveal that administrators perceive lack of control, maintaining credibility, guaranteeing reach, and resource allocation to be the most prominent challenges faced by these brand pages. Such issues create further tensions when attempting to justify social media use and demonstrate effectiveness within sport organisations. Furthermore, teams face commercial obligations to post sponsor content that may negatively impact user engagement. In addition, findings suggest that, contrary to popular belief, a greater total network size does not guarantee greater reach and engagement. It is proposed that teams adopt proportional measures of performance when seeking to measure Facebook performance. Holistically, the research sets a platform that future studies can use to tangibly connect Facebook effectiveness to organisational strategy and objectives.
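The "proportional measures" the abstract proposes can be sketched numerically. The snippet below uses invented figures, not data from the study; it only illustrates why normalising engagement by reach, and reach by fan count, lets pages of different sizes be compared fairly.

```python
# Hypothetical illustration of proportional Facebook performance measures.
# All figures below are invented placeholders, not data from the study.

def proportional_metrics(fans, reach, engaged_users):
    """Return reach per fan and engagement per reached user."""
    return {
        "reach_rate": reach / fans,
        "engagement_rate": engaged_users / reach,
    }

# A smaller page can outperform a larger one on proportional measures,
# even though its raw totals are lower:
big = proportional_metrics(fans=50_000, reach=10_000, engaged_users=500)
small = proportional_metrics(fans=5_000, reach=2_000, engaged_users=300)

print(big)    # reach_rate 0.2, engagement_rate 0.05
print(small)  # reach_rate 0.4, engagement_rate 0.15
```

On raw totals the larger page "wins" (10,000 reached vs 2,000), which is exactly the comparison the study argues against.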
Economic Analysis of Using Soybean Meal as a Mushroom Growing Substrate
Mushrooms have been grown commercially on many different substrates for years, usually agricultural by-products such as straw or stover. The increased popularity of specialty mushrooms with consumers has led to increased production and greater demand for economical substrates. Oyster mushrooms are easier to grow than other types of mushrooms, and their production has increased dramatically in recent years. This study examines the economic feasibility of using soybean hulls as a primary substrate for oyster mushrooms, replacing traditional wheat straw. The study uses a cost-benefit analysis to determine an optimal substrate based on yield and the number of crops harvested per year. The study shows that soybean hulls, combined with corn gluten or soybean meal, increase yield 4.5 times, which more than offsets the higher cost of soybean hulls. The soybean substrate also allows a producer to raise about four more crops per year, which in turn uses fixed resources more efficiently and increases profitability.
Keywords: Oyster, Mushrooms, Substrate, Soybean, Hulls, Meal, Economic, Feasibility, Crop Production/Industries
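The cost-benefit structure described above can be sketched in a few lines. All prices, yields, and costs below are invented placeholders; only the structural relationships (roughly 4.5x yield and about four extra crops per year on the soybean substrate) come from the abstract.

```python
# Hypothetical cost-benefit sketch of the substrate comparison. Every number
# here is an invented placeholder; only the 4.5x yield multiplier and the
# ~4 extra crops per year reflect the study's stated findings.

def annual_profit(yield_per_crop_kg, crops_per_year, price_per_kg,
                  substrate_cost_per_crop, fixed_cost_per_year):
    revenue = yield_per_crop_kg * crops_per_year * price_per_kg
    variable = substrate_cost_per_crop * crops_per_year
    return revenue - variable - fixed_cost_per_year

# Wheat straw baseline (placeholder figures):
straw = annual_profit(yield_per_crop_kg=100, crops_per_year=8,
                      price_per_kg=6.0, substrate_cost_per_crop=20,
                      fixed_cost_per_year=2_000)

# Soybean hulls: ~4.5x yield, ~4 more crops per year, costlier substrate:
hulls = annual_profit(yield_per_crop_kg=450, crops_per_year=12,
                      price_per_kg=6.0, substrate_cost_per_crop=60,
                      fixed_cost_per_year=2_000)

print(straw, hulls)  # the yield gain dwarfs the extra substrate cost
```

With any plausible numbers, the 4.5x yield term dominates the higher per-crop substrate cost, which is the study's core result.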
Combining community-based research and local knowledge to confront asthma and subsistence-fishing hazards in Greenpoint/Williamsburg, Brooklyn, New York.
Activists in the environmental justice movement are challenging expert-driven scientific research by taking the research process into their own hands and speaking for themselves by defining, analyzing, and prescribing solutions for the environmental health hazards confronting communities of the poor and people of color. I highlight the work of El Puente and The Watchperson Project--two community-based organizations in the Greenpoint/Williamsburg neighborhood in Brooklyn, New York, that have engaged in community-based participatory research (CBPR) to address asthma and risks from subsistence-fish diets. The CBPR process aims to engage community members as equal partners alongside scientists in problem definition, information collection, and data analysis--all geared toward locally relevant action for social change. In the first case I highlight how El Puente has organized residents to conduct a series of asthma health surveys and tapped into local knowledge of the Latino population to understand potential asthma triggers and to devise culturally relevant health interventions. In the second case I follow The Watchperson Project and their work surveying subsistence anglers, and note how the community-gathered information contributed key data inputs to the U.S. Environmental Protection Agency Cumulative Exposure Project in the neighborhood. In each case I review the processes each organization used to conduct CBPR, some of their findings, and the local knowledge they gathered, all of which were crucial for understanding and addressing local environmental health issues. I conclude with some observations about the benefits and limits of CBPR for helping scientists and communities pursue environmental justice.
CDASH: a cloud-enabled program for structure solution from powder diffraction data
The simulated annealing approach to crystal structure determination from powder diffraction data, as implemented in the DASH program, is readily amenable to parallelization at the individual run level. Very large increases in execution speed can be achieved by distributing individual DASH runs over a network of computers. The CDASH program delivers this by using scalable on-demand computing clusters built on the Amazon Elastic Compute Cloud service. By way of example, a 360 vCPU cluster returned the crystal structure of racemic ornidazole (Z′ = 3, 30 degrees of freedom) ca. 40 times faster than a typical modern quad-core desktop CPU. Whilst used here specifically for DASH, this approach is of general applicability to other packages that are amenable to coarse-grained parallelism strategies.
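The coarse-grained parallelism the abstract exploits can be sketched generically: simulated annealing runs share no state, so they can be mapped onto separate workers and only the best result kept. The toy 1-D objective below is a stand-in, not anything DASH actually computes, and the thread pool stands in for the cloud workers; the same map would scale across processes or machines.

```python
# Toy sketch of run-level (coarse-grained) parallel simulated annealing.
# The objective is a stand-in for a powder-diffraction figure of merit.
import math
import random
from concurrent.futures import ThreadPoolExecutor

def objective(x):
    # Multimodal test function; global minimum is 0 at x = 0.
    return x * x + 10 * (1 - math.cos(2 * math.pi * x))

def anneal(seed, steps=20_000, t0=5.0):
    """One independent simulated-annealing run with its own RNG seed."""
    rng = random.Random(seed)
    x = rng.uniform(-10, 10)
    best_x, best_f = x, objective(x)
    for i in range(steps):
        t = t0 * (1 - i / steps) + 1e-3          # linear cooling schedule
        cand = x + rng.gauss(0, 0.5)
        df = objective(cand) - objective(x)
        # Always accept downhill moves; accept uphill with Boltzmann probability.
        if df < 0 or rng.random() < math.exp(-df / t):
            x = cand
            if objective(x) < best_f:
                best_x, best_f = x, objective(x)
    return best_f, best_x

# Runs share nothing, so they map cleanly onto independent workers;
# threads here stand in for cloud instances.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(anneal, range(8)))
best_f, best_x = min(results)
print(f"best objective {best_f:.3f} at x = {best_x:.3f}")
```

Because no communication happens between runs, the speed-up is close to linear in the number of workers, which is why the approach moves so easily to on-demand clusters.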
Quantifying the Drivers of Star Formation on Galactic Scales. I. The Small Magellanic Cloud
We use the star formation history of the Small Magellanic Cloud (SMC) to place quantitative limits on the effect of tidal interactions and gas infall on the star formation and chemical enrichment history of the SMC. The coincident timing of two recent (< 4 Gyr) increases in the star formation rate and SMC/Milky Way (MW) pericenter passages suggests that global star formation in the SMC is driven at least in part by tidal forces due to the MW. The Large Magellanic Cloud (LMC) is the other potential driver of star formation, but is only near the SMC during the most recent burst. The poorly constrained LMC-SMC orbit is our principal uncertainty. To explore the correspondence between bursts and MW pericenter passages further, we model star formation in the SMC using a combination of continuous and tidally triggered star formation. The behavior of the tidally triggered mode is a strong inverse function of the SMC-MW separation (preferred behavior ~ r^-5, resulting in a factor of ~100 difference in the rate of tidally triggered star formation at pericenter and apocenter). Despite the success of these closed-box evolutionary models in reproducing the recent SMC star formation history and current chemical abundance, they have some systematic shortcomings that are remedied by postulating that a sizable infall event (~50% of the total gas mass) occurred about 4 Gyr ago. Regardless of whether this infall event is included, the fraction of stars in the SMC that formed via a tidally triggered mode is > 10% and could be as large as 70%.
Comment: Accepted for publication in Ap
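The factor-of-~100 contrast quoted above follows directly from the preferred r^-5 scaling: the pericenter-to-apocenter rate ratio is (r_apo/r_peri)^5, so an orbit whose apocenter is about 2.5 times its pericenter distance already produces it. The separations below are illustrative placeholders, not the paper's orbit model.

```python
# Quick check of the r^-5 scaling: if the tidally triggered star formation
# rate goes as r^-5, the pericenter/apocenter rate contrast is
# (r_apo / r_peri)^5. The kpc values below are illustrative, not the
# paper's SMC-MW orbit.

def trigger_rate_ratio(r_peri_kpc, r_apo_kpc, exponent=5):
    """Ratio of triggered star formation rate at pericenter vs apocenter."""
    return (r_apo_kpc / r_peri_kpc) ** exponent

# An apocenter ~2.5x the pericenter distance gives a ~100x rate contrast:
ratio = trigger_rate_ratio(r_peri_kpc=60, r_apo_kpc=150)
print(f"rate ratio ~ {ratio:.0f}")  # ~98
```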
Kernel density estimation of CSD distributions - an application to knowledge based molecular optimisation
An interleaved sampling scheme for the characterization of single qubit dynamics
In this paper, we demonstrate that interleaved sampling techniques can be used to characterize the Hamiltonian of a qubit and its environmental decoherence rate. The technique offers a significant advantage in terms of the number of measurements that are required to characterize a qubit. When compared to the standard Nyquist-Shannon sampling rate, the saving in the total measurement time for the interleaved method is approximately proportional to the ratio of the sample rates.
Comment: 9 pages, 4 figures
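The core idea of sampling below the Nyquist rate can be illustrated with equivalent-time (interleaved) sampling of a repetitive signal: sampling at an interval slightly longer than the signal period makes each successive sample probe a slightly later phase, so a slow sampler traces out the fast waveform. This sketch is a generic illustration under assumed parameters, not the paper's specific scheme; the 1 GHz oscillation stands in for a fast qubit signal.

```python
# Equivalent-time (interleaved) sampling sketch: sample a repetitive fast
# oscillation at an interval one small delta LONGER than its period, so each
# sample advances the probed phase by delta. Parameters are illustrative.
import math

f_signal = 1.0e9                 # fast oscillation, 1 GHz (assumed)
period = 1.0 / f_signal
delta = period / 50              # effective time resolution per sample
t_sample = period + delta        # sampling interval, far below Nyquist

# Slow, interleaved samples of the fast signal:
samples = [math.cos(2 * math.pi * f_signal * n * t_sample) for n in range(50)]
# Each sample n effectively probes phase n * delta of one period:
expected = [math.cos(2 * math.pi * f_signal * n * delta) for n in range(50)]

max_err = max(abs(a - b) for a, b in zip(samples, expected))
print(f"max reconstruction error: {max_err:.2e}")
```

Since cos(2*pi*f*n*(T + delta)) = cos(2*pi*n + 2*pi*f*n*delta), the slow samples reproduce the fast waveform exactly; the trade-off is total acquisition time, which is what the paper's measurement-count comparison quantifies.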
Shortwave radiative forcing, rapid adjustment, and feedback to the surface by sulfate geoengineering: analysis of the Geoengineering Model Intercomparison Project G4 scenario
This study evaluates the forcing, rapid adjustment, and feedback of net shortwave radiation at the surface in the G4 experiment of the Geoengineering Model Intercomparison Project by analysing outputs from six participating models. G4 involves injection of 5 Tg yr^-1 of SO2, a sulfate aerosol precursor, into the lower stratosphere from year 2020 to 2069 against a background scenario of RCP4.5. A single-layer atmospheric model for shortwave radiative transfer is used to estimate the direct forcing of solar radiation management (SRM), and the rapid adjustments and feedbacks from changes in the water vapour amount, cloud amount, and surface albedo (compared with RCP4.5). The analysis shows that the globally and temporally averaged SRM forcing ranges from -3.6 to -1.6 W m^-2, depending on the model. The sum of the rapid adjustments and feedback effects due to changes in the water vapour and cloud amounts increases the downwelling shortwave radiation at the surface by approximately 0.4 to 1.5 W m^-2 and hence weakens the effect of SRM by around 50 %. The surface albedo changes decrease the net shortwave radiation at the surface; the effect is locally strong (~ -4 W m^-2) in snow and sea ice melting regions, but minor for the global average. The analyses show that the results of the G4 experiment, which simulates sulfate geoengineering, include large inter-model variability both in the direct SRM forcing and in the shortwave rapid adjustment from changes in cloud amount, and imply high uncertainty in modelled processes of sulfate aerosols and clouds.
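The budget arithmetic behind those numbers is simple to make explicit: the net change in surface shortwave radiation is the direct SRM forcing plus the adjustment and feedback terms. Pairing the strong ends of the two quoted ranges, as below, is an illustrative choice, not a model result.

```python
# Back-of-envelope combination of the ranges quoted above: direct SRM
# forcing is partially offset by water-vapour and cloud adjustments.
# Pairing -3.6 with +1.5 (the strong ends of both ranges) is illustrative.

def net_surface_sw(srm_forcing_wm2, adjustments_wm2):
    """Net change in downwelling shortwave at the surface (W m^-2)."""
    return srm_forcing_wm2 + adjustments_wm2

def offset_fraction(srm_forcing_wm2, adjustments_wm2):
    """Fraction of the SRM forcing cancelled by adjustments/feedbacks."""
    return adjustments_wm2 / abs(srm_forcing_wm2)

print(f"{net_surface_sw(-3.6, 1.5):.1f} W m^-2")      # -2.1 W m^-2
print(f"offset: {offset_fraction(-3.6, 1.5):.0%}")    # ~42%
```

With this pairing roughly 40 % of the forcing is offset; other pairings within the quoted ranges give values on either side of the abstract's "around 50 %".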
Analysis Method for Jet-Like Three-Particle Azimuthal Correlations
Jet-like three-particle azimuthal correlations can discriminate among the various physical scenarios that have been proposed to explain the observed strong modification to two-particle azimuthal correlations. The three-particle correlation analysis is notoriously difficult in heavy-ion collisions due to the large combinatoric backgrounds. We describe the general idea behind the jet-like three-particle azimuthal correlation analysis, with emphasis on the subtraction of the combinatoric backgrounds. We discuss in detail the various systematic effects in such an analysis.
Comment: This is the final published version in NIM
