Iron metabolism in trypanosomatids, and its crucial role in infection.
Iron is almost ubiquitous in living organisms due to the utility of its redox chemistry. It is also dangerous, as it can catalyse the formation of reactive free radicals - a classical double-edged sword. In this review, we examine the uptake and usage of iron by trypanosomatids and discuss how modulation of host iron metabolism plays an important role in the protective response. Trypanosomatids require iron for crucial processes including DNA replication, antioxidant defence, mitochondrial respiration, synthesis of the modified base J and, in African trypanosomes, the alternative oxidase. The source of iron varies between species. Bloodstream-form African trypanosomes acquire iron from their host by uptake of transferrin, whereas Leishmania amazonensis expresses a ZIP-family cation transporter in the plasma membrane. In other trypanosomatids, iron uptake has been poorly characterized. Iron-withholding responses by the host can be a major determinant of disease outcome, and their role in trypanosomatid infections is becoming apparent. For example, the cytosolic sequestration properties of NRAMP1 confer resistance against leishmaniasis. Conversely, cytoplasmic sequestration of iron may be favourable rather than detrimental to Trypanosoma cruzi. The central role of iron in both parasite metabolism and the host response is attracting interest as a possible point of therapeutic intervention.
Group 2 Innate Lymphoid Cells Are Redundant in Experimental Renal Ischemia-Reperfusion Injury.
Acute kidney injury (AKI) can be fatal and is a well-defined risk factor for the development of chronic kidney disease. Group 2 innate lymphoid cells (ILC2s) are innate producers of type-2 cytokines and are critical regulators of homeostasis in peripheral organs. However, our knowledge of their function in the kidney is relatively limited. Recent evidence suggests that increasing ILC2 numbers by systemic administration of recombinant interleukin (IL)-25 or IL-33 protects against renal injury. Whilst ILC2s can be induced to protect against ischemia- or chemically induced AKI, the impact of ILC2 deficiency or depletion on the severity of renal injury is unknown. First, the phenotype and location of ILC2s in the kidney were assessed under homeostatic conditions. Kidney ILC2s constitutively expressed high levels of IL-5 and were located in close proximity to the renal vasculature. To test the functional role of ILC2s in the kidney, an experimental model of renal ischemia-reperfusion injury (IRI) was used and the severity of injury was assessed in wild-type, ILC2-reduced, ILC2-deficient, and ILC2-depleted mice. Surprisingly, there were no differences in histopathology, collagen deposition or mRNA expression of injury-associated (Lcn2), inflammatory (Cxcl1, Cxcl2, and Tnf) or extracellular matrix (Col1a1, Fn1) factors following IRI in the absence of ILC2s. These data indicate that the absence of ILC2s does not alter the severity of renal injury, suggesting possible redundancy. Therefore, other mechanisms of type 2-mediated immune cell activation likely compensate in the absence of ILC2s. Hence, a loss of ILC2s is unlikely to increase susceptibility to, or the severity of, AKI.
Dynamical Autler-Townes control of a phase qubit
Routers, switches, and repeaters are essential components of modern information-processing systems, and similar devices will be needed in future superconducting quantum computers. In this work we investigate experimentally the time evolution of Autler-Townes splitting in a superconducting phase qubit under the application of a control tone resonantly coupled to the second transition. A three-level model that includes independently determined parameters for relaxation and dephasing gives excellent agreement with the experiment. The results demonstrate that the qubit can be used as an ON/OFF switch with a 100 ns operating timescale for the reflection/transmission of photons coming from an applied probe microwave tone. The ON state is realized when the control tone is sufficiently strong to generate an Autler-Townes doublet, suppressing the absorption of the probe-tone photons and resulting in maximal transmission.
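The switching mechanism can be sketched numerically with a steady-state Lindblad model of a driven three-level ladder. The snippet below is a minimal illustration in Python using QuTiP; all rates and Rabi frequencies are placeholder values chosen for clarity, not the independently determined parameters reported in the paper.

```python
import numpy as np
import qutip as qt

# Placeholder rates in rad/s -- illustrative only, not the paper's fitted values.
g1   = 2 * np.pi * 0.5e6   # relaxation rate |1> -> |0>
g2   = 2 * np.pi * 1.0e6   # relaxation rate |2> -> |1>
gphi = 2 * np.pi * 0.3e6   # pure dephasing rate
Op   = 2 * np.pi * 0.1e6   # weak probe Rabi frequency (0-1 transition)
Oc   = 2 * np.pi * 5.0e6   # strong control Rabi frequency (1-2 transition)

s01 = qt.basis(3, 0) * qt.basis(3, 1).dag()   # |0><1|
s12 = qt.basis(3, 1) * qt.basis(3, 2).dag()   # |1><2|
p1  = qt.basis(3, 1) * qt.basis(3, 1).dag()   # |1><1|
p2  = qt.basis(3, 2) * qt.basis(3, 2).dag()   # |2><2|

# Relaxation and pure dephasing as Lindblad collapse operators.
c_ops = [np.sqrt(g1) * s01,
         np.sqrt(g2) * s12,
         np.sqrt(2 * gphi) * p1,
         np.sqrt(2 * gphi) * p2]

# Sweep the probe detuning with the control tone on resonance.
detunings = 2 * np.pi * np.linspace(-15e6, 15e6, 301)
absorption = []
for dp in detunings:
    # Rotating-frame Hamiltonian of the driven three-level ladder.
    H = (-dp * p1 - dp * p2
         + 0.5 * Op * (s01 + s01.dag())
         + 0.5 * Oc * (s12 + s12.dag()))
    rho = qt.steadystate(H, c_ops)
    # Probe absorption is proportional (up to sign convention) to Im(rho_01).
    absorption.append(-np.imag(rho.full()[0, 1]))

# With Oc much larger than the linewidths, `absorption` shows two peaks
# split by ~Oc (the Autler-Townes doublet) and a dip at zero detuning:
# the probe is transmitted (switch ON). With Oc = 0 it collapses to a
# single absorption peak and the probe is absorbed (switch OFF).
```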
Nitrogen uptake and internal recycling in Zostera marina exposed to oyster farming: eelgrass potential as a natural biofilter
Oyster farming in estuaries and coastal lagoons frequently overlaps with the distribution of seagrass meadows, yet there are few studies on how this aquaculture practice affects seagrass physiology. We compared in situ nitrogen uptake and the productivity of Zostera marina shoots growing near off-bottom longlines and at a site not affected by oyster farming in San Quintin Bay, a coastal lagoon in Baja California, Mexico. We used benthic chambers to measure leaf NH₄⁺ uptake capacities by pulse labeling with ¹⁵NH₄⁺, as well as plant photosynthesis and respiration. Internal ¹⁵N resorption/recycling was measured in shoots 2 weeks after the incubations. The natural isotopic composition of eelgrass tissues and vegetative descriptors were also examined. Plants growing at the oyster-farming site showed a higher leaf NH₄⁺ uptake rate (33.1 mmol NH₄⁺ m⁻² day⁻¹) relative to those not exposed to oyster cultures (25.6 mmol NH₄⁺ m⁻² day⁻¹). We calculated that an eelgrass meadow of 15-16 ha (only about 3-4% of the subtidal eelgrass meadow cover in the western arm of the lagoon) can potentially incorporate the total amount of NH₄⁺ excreted by the oysters (~5.2 × 10⁶ mmol NH₄⁺ day⁻¹). This highlights the potential of eelgrass to act as a natural biofilter for the NH₄⁺ produced by oyster farming. Shoots exposed to oysters were more efficient at re-utilizing internal ¹⁵N for the growth of new leaf tissue and at translocating it to belowground tissues. Photosynthetic rates were greater in shoots exposed to oysters, consistent with their higher NH₄⁺ uptake and less negative δ¹³C values. Vegetative production (shoot size, leaf growth) was also higher in these shoots. The aboveground/belowground biomass ratio was lower in eelgrass beds not directly influenced by oyster farms, likely reflecting a higher investment in belowground biomass to incorporate sedimentary nutrients.
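The 15-16 ha figure follows directly from dividing the farm's daily NH₄⁺ output by the measured areal uptake rate. A minimal back-of-the-envelope check, assuming (as a simplification) that the oyster-exposed uptake rate applies uniformly across the meadow:

```python
# Back-of-the-envelope check of the eelgrass biofilter estimate.
oyster_nh4_output = 5.2e6   # mmol NH4+ excreted by the oysters per day
uptake_rate = 33.1          # mmol NH4+ m^-2 day^-1 (oyster-exposed shoots)

required_area_m2 = oyster_nh4_output / uptake_rate
required_area_ha = required_area_m2 / 1e4   # 1 ha = 10,000 m^2
print(f"{required_area_ha:.1f} ha")         # ~15.7 ha, matching the 15-16 ha estimate
```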
Towards the “ultimate earthquake-proof” building: Development of an integrated low-damage system
The 2010–2011 Canterbury earthquake sequence has highlighted the severe mismatch between societal expectations and the reality of the seismic performance of modern buildings. A paradigm shift in performance-based design criteria and objectives towards a damage-control or low-damage design philosophy and technologies is urgently required. The increased awareness by the general public, tenants, building owners, territorial authorities as well as (re)insurers of the severe socio-economic impacts of moderate-strong earthquakes in terms of damage/dollars/downtime has indeed stimulated and facilitated the wider acceptance and implementation of cost-efficient damage-control (or low-damage) technologies. The 'bar' has been raised significantly with the request to fast-track the development of what the wider general public would hope, and somehow expect, to live in, i.e. an "earthquake-proof" building system, capable of sustaining the shaking of a severe earthquake basically unscathed.
The paper provides an overview of recent advances through extensive research, carried out at the University of Canterbury in the past decade, towards the development of a low-damage building system as a whole, within an integrated performance-based framework, including the skeleton of the superstructure, the non-structural components and the interaction with the soil/foundation system. Examples of real on-site applications of such technology in New Zealand, using concrete, timber (engineered wood), steel or a combination of these materials, and featuring some of the latest innovative technical solutions developed in the laboratory, are presented as examples of the successful transfer of performance-based seismic design approaches and advanced technology from theory to practice.
Internal and external cooling methods and their effect on body temperature, thermal perception and dexterity
Objective: The present study aimed to compare a range of cooling methods that might be utilised by occupational workers, focusing on their effect on body temperature, perception and manual dexterity. Methods: Ten male participants completed eight trials, each involving 30 min of seated rest followed by 30 min of cooling, or a no-cooling control (CON), at 34°C and 58% relative humidity. The cooling methods utilised were: ice cooling vest (CV0), phase-change cooling vest melting at 14°C (CV14), evaporative cooling vest (CVEV), arm immersion in 10°C water (AI), portable water-perfused suit (WPS), heliox inhalation (HE) and ice slushy ingestion (SL). Immediately before and after cooling, participants were assessed for fine (Purdue pegboard task) and gross (grip and pinch strength) manual dexterity. Rectal and skin temperature, as well as thermal sensation and comfort, were monitored throughout. Results: Compared with CON, SL was the only method to reduce rectal temperature (P = 0.012). All externally applied cooling methods reduced skin temperature (P < 0.05). Conclusion: The present study observed that ice ingestion or ice applied to the skin produced the greatest effect on rectal and skin temperature, respectively. AI should not be utilised if workers require subsequent fine manual dexterity. These results will help inform future studies investigating appropriate pre-cooling methods for the occupational worker.
Ethanol reversal of tolerance to the respiratory depressant effects of morphine
Opioids are the most common drugs associated with unintentional drug overdose. Death results from respiratory depression. Prolonged use of opioids results in the development of tolerance, but the degree of tolerance is thought to vary between different effects of the drugs. Many opioid addicts regularly consume alcohol (ethanol), and post-mortem analyses of opioid overdose deaths have revealed an inverse correlation between blood morphine and ethanol levels. In the present study, we determined whether ethanol reduced tolerance to the respiratory depressant effects of opioids. Mice were treated with opioids (morphine, methadone, or buprenorphine) for up to 6 days. Respiration was measured in freely moving animals breathing 5% CO₂ in air in plethysmograph chambers. Antinociception (analgesia) was measured as the latency to remove the tail from a thermal stimulus. Opioid tolerance was assessed by measuring the response to a challenge dose of morphine (10 mg/kg i.p.). Tolerance developed to the respiratory depressant effect of morphine, but at a slower rate than tolerance to its antinociceptive effect. A low dose of ethanol (0.3 mg/kg) alone did not depress respiration, but in prolonged morphine-treated animals respiratory depression was observed when ethanol was co-administered with the morphine challenge. Ethanol did not alter the brain levels of morphine. In contrast, in methadone- or buprenorphine-treated animals no respiratory depression was observed when ethanol was co-administered along with the morphine challenge. As heroin is converted to morphine in man, selective reversal of morphine tolerance by ethanol may be a contributory factor in heroin overdose deaths.
