Soil methane sink capacity response to a long-term wildfire chronosequence in Northern Sweden
Boreal forests occupy nearly one fifth of the terrestrial land surface and are recognised as globally important regulators of carbon (C) cycling and greenhouse gas emissions. Carbon sequestration processes in these forests include assimilation of CO2 into biomass and subsequently into soil organic matter, and soil microbial oxidation of methane (CH4). In this study we explored how ecosystem retrogression, which drives vegetation change, regulates the important process of soil CH4 oxidation in boreal forests. We measured soil CH4 oxidation processes on a group of 30 forested islands in northern Sweden differing greatly in fire history, and collectively representing a retrogressive chronosequence spanning 5000 years. Across these islands the build-up of soil organic matter was observed to increase with time since fire disturbance, with a significant correlation between greater humus depth and increased net soil CH4 oxidation rates. We suggest that this increase in net CH4 oxidation rates, in the absence of disturbance, arises as deeper humus stores accumulate and provide niches for methanotrophs to thrive. By using this gradient we have discovered important regulatory controls on the stability of soil CH4 oxidation processes that could not have been explored through shorter-term experiments. Our findings indicate that in the absence of human interventions such as fire suppression, and with increased wildfire frequency, the globally important boreal CH4 sink could be diminished.
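The relationship at the heart of this study, net soil CH4 oxidation increasing with humus depth across the island chronosequence, can be illustrated with a minimal correlation and regression sketch. All data values and variable names below are invented placeholders, not the study's measurements.

```python
# Minimal sketch (not the authors' analysis): relating humus depth to net
# soil CH4 oxidation across a chronosequence of islands. Data are invented
# placeholders purely to illustrate the kind of relationship described.
import numpy as np
from scipy import stats

humus_depth_cm = np.array([5, 8, 12, 18, 25, 33, 41, 52, 60, 75])              # hypothetical
ch4_oxidation = np.array([0.4, 0.6, 0.9, 1.1, 1.5, 1.7, 2.0, 2.3, 2.4, 2.8])   # hypothetical rates

# Pearson correlation and an ordinary least-squares fit
r, p = stats.pearsonr(humus_depth_cm, ch4_oxidation)
slope, intercept, *_ = stats.linregress(humus_depth_cm, ch4_oxidation)
print(f"r = {r:.2f}, p = {p:.3g}, slope = {slope:.3f} per cm of humus")
```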
Detectable Anthropogenic Shift toward Heavy Precipitation over Eastern China
Changes in precipitation characteristics directly affect society through their impacts on drought and floods, hydro-dams, and urban drainage systems. Global warming increases the water-holding capacity of the atmosphere and thus the risk of heavy precipitation. Here, daily precipitation records from over 700 Chinese stations from 1956 to 2005 are analyzed. The results show a significant shift from light to heavy precipitation over eastern China. An optimal fingerprinting analysis of simulations from 11 climate models driven by different combinations of historical anthropogenic (greenhouse gases, aerosols, land use, and ozone) and natural (volcanic and solar) forcings indicates that anthropogenic forcing on climate, including increases in greenhouse gases (GHGs), has made a detectable contribution to the observed shift toward heavy precipitation. Some evidence is found that anthropogenic aerosols (AAs) partially offset the effect of the GHG forcing, resulting in a weaker shift toward heavy precipitation in simulations that include the AA forcing than in simulations with only the GHG forcing. In addition to the thermodynamic mechanism, strengthened water vapor transport from the adjacent oceans and by midlatitude westerlies, resulting mainly from GHG-induced warming, also favors heavy precipitation over eastern China. Further GHG-induced warming is projected to strengthen the shift toward heavy precipitation, increasing urban flooding and posing a significant challenge for mega-cities in China in the coming decades. Future reductions in AA emissions resulting from air pollution controls could exacerbate this tendency toward heavier precipitation.
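Optimal fingerprinting is, at its core, a regression of an observed change pattern onto model-simulated response patterns, with the estimated scaling factors and their uncertainties determining detection. The sketch below is a deliberately simplified ordinary-least-squares version with synthetic arrays; operational analyses use total least squares and a noise covariance estimated from control simulations.

```python
# Schematic fingerprinting regression: y = X @ beta + noise, where y is the
# observed change pattern and the columns of X are model-simulated responses
# to individual forcings (e.g. GHG, anthropogenic aerosol, natural).
# Illustrative OLS only; all arrays below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_regions = 50                                            # hypothetical spatial dimension
X = rng.normal(size=(n_regions, 3))                       # placeholder fingerprints (GHG, AA, NAT)
beta_true = np.array([1.0, 0.6, 0.3])
y = X @ beta_true + 0.5 * rng.normal(size=n_regions)      # synthetic "observations"

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat
sigma2 = resid @ resid / (n_regions - X.shape[1])
cov = sigma2 * np.linalg.inv(X.T @ X)
ci = 1.96 * np.sqrt(np.diag(cov))

for name, b, c in zip(["GHG", "AA", "NAT"], beta_hat, ci):
    detected = (b - c) > 0            # scaling factor inconsistent with zero
    print(f"{name}: beta = {b:.2f} +/- {c:.2f}  detected={detected}")
```

A scaling factor whose confidence interval excludes zero indicates detection; consistency with one indicates that the magnitude of the simulated response matches the observations.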
DADA: data assimilation for the detection and attribution of weather and climate-related events
A new nudging method for data assimilation, delay-coordinate nudging, is presented. Delay-coordinate nudging makes explicit use of present and past observations in the formulation of the forcing driving the model evolution at each time step. Numerical experiments with a low-order chaotic system show that the new method systematically outperforms standard nudging in different model and observational scenarios, even when using an unoptimized formulation of the delay-nudging coefficients. A connection between the optimal delay and the dominant Lyapunov exponent of the dynamics is found based on heuristic arguments and is confirmed by the numerical results, providing a guideline for the practical implementation of the algorithm. Delay-coordinate nudging preserves the ease of implementation, intuitive functioning, and low computational cost of standard nudging, making it a potential alternative, especially for seasonal-to-decadal predictions with large Earth system models whose size limits the use of more sophisticated data assimilation procedures.
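For reference, standard nudging simply adds a relaxation term toward the latest observation to the model tendency; the delay-coordinate variant described above additionally folds past observations into that forcing. The sketch below implements only the standard scheme on the Lorenz-63 system, with illustrative gain and noise values, and does not reproduce the paper's delay-coordinate formulation.

```python
# Minimal sketch of standard nudging on the Lorenz-63 system (not the
# paper's delay-coordinate scheme): the model tendency is relaxed toward
# the current observation with gain K.
import numpy as np

def lorenz63(x, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    return np.array([sigma * (x[1] - x[0]),
                     x[0] * (rho - x[2]) - x[1],
                     x[0] * x[1] - beta * x[2]])

dt, K = 0.01, 2.0                          # time step and nudging gain (illustrative values)
rng = np.random.default_rng(1)

truth = np.array([1.0, 1.0, 1.0])
model = np.array([5.0, -5.0, 20.0])        # poorly initialised model state

for step in range(2000):
    truth = truth + dt * lorenz63(truth)                  # reference trajectory (Euler step)
    obs = truth + rng.normal(scale=0.5, size=3)           # noisy observation of the full state
    # standard nudging: relax the model toward the present observation;
    # a delay-coordinate scheme would also weight in past observations here
    model = model + dt * (lorenz63(model) + K * (obs - model))

print("final error:", np.linalg.norm(model - truth))
```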
Early assembly of the most massive galaxies
The current consensus is that galaxies begin as small density fluctuations in the early Universe and grow by in situ star formation and hierarchical merging(1). Stars begin to form relatively quickly in sub-galactic-sized building blocks called haloes, which are subsequently assembled into galaxies. However, exactly when this assembly takes place is a matter of some debate(2,3). Here we report that the stellar masses of brightest cluster galaxies, which are the most luminous objects emitting stellar light, were not significantly different some 9 billion years ago from their stellar masses today. Brightest cluster galaxies appear to be almost fully assembled 4–5 billion years after the Big Bang, having grown to more than 90 per cent of their final stellar mass by this time. Our data conflict with the most recent galaxy formation models(4,5) based on the largest simulations of dark-matter halo development(1). These models predict protracted formation of brightest cluster galaxies over a Hubble time, with only 22 per cent of the stellar mass assembled at the epoch probed by our sample. Our findings suggest a new picture in which brightest cluster galaxies experience an early period of rapid growth rather than prolonged hierarchical assembly.
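The quoted timescales can be cross-checked with a standard cosmology calculator. The sketch below assumes astropy's bundled Planck 2018 flat Lambda-CDM parameters and a representative redshift of z ≈ 1.4, neither of which is necessarily what the paper adopts; it simply confirms that a lookback time of roughly 9 billion years corresponds to a cosmic age of about 4–5 billion years.

```python
# Quick consistency check of the quoted timescales, assuming the Planck 2018
# flat Lambda-CDM cosmology bundled with astropy (the paper may adopt
# different parameters and redshifts).
from astropy.cosmology import Planck18

z = 1.4                                   # representative high redshift for such samples (assumed)
print(Planck18.lookback_time(z))          # roughly 9 Gyr before the present
print(Planck18.age(z))                    # roughly 4-5 Gyr after the Big Bang
```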
Thermotomaculum hydrothermale gen. nov., sp. nov., a novel heterotrophic thermophile within the phylum Acidobacteria from a deep-sea hydrothermal vent chimney in the Southern Okinawa Trough
http://www.godac.jamstec.go.jp/darwin/cruise/natsushima/nt08-13/
Attribution: how is it relevant for loss and damage policy and practice?
Attribution has become a recurring issue in discussions about Loss and Damage (L&D). In this highly politicised context, attribution is often associated with responsibility and blame, and linked to debates about liability and compensation. The aim of attribution science, however, is not to establish responsibility, but to further scientific understanding of causal links between elements of the Earth System and society. This research into causality could inform the management of climate-related risks through improved understanding of drivers of relevant hazards, or, more widely, vulnerability and exposure, with potential benefits regardless of political positions on L&D. Experience shows that it is nevertheless difficult to have open discussions about the science in the policy sphere. This is not only a missed opportunity, but also problematic in that it could inhibit understanding of scientific results and uncertainties, potentially leading to policy planning that lacks sufficient scientific evidence to support it. In this chapter, we first explore this dilemma for science-policy dialogue, summarising several years of research into stakeholder perspectives on attribution in the context of L&D. We then aim to provide clarity about the scientific research available, through an overview of research which might contribute evidence about the causal connections between anthropogenic climate change and losses and damages, including climate science, but also other fields which examine other drivers of hazard, exposure, and vulnerability. Finally, we explore potential applications of attribution research, suggesting that an integrated and nuanced approach has potential to inform planning to avert, minimise and address losses and damages. The key messages are:
In the political context of climate negotiations, questions about whether losses and damages can be attributed to anthropogenic climate change are often linked to issues of responsibility, blame, and liability.
Attribution science does not aim to establish responsibility or blame, but rather to investigate drivers of change.
Attribution science is advancing rapidly, and has potential to increase understanding of how climate variability and change is influencing slow onset and extreme weather events, and how this interacts with other drivers of risk, including socio-economic drivers, to influence losses and damages.
Over time, some uncertainties in the science will be reduced, as the anthropogenic climate change signal becomes stronger, and understanding of climate variability and change develops.
However, some uncertainties will not be eliminated. Uncertainty is common in science, and does not prevent useful applications in policy, but might determine which applications are appropriate. It is important to highlight that in attribution studies, the strength of evidence varies substantially between different kinds of slow onset and extreme weather events, and between regions. Policy-makers should not expect conclusive evidence about the influence of climate variability and change on specific instances of losses and damages to emerge later; and, in particular, should not expect the strength of evidence to be equal between events, and between countries.
Rather than waiting for further confidence in attribution studies, there is potential to start working now to integrate science into policy and practice, to help understand and tackle drivers of losses and damages, informing prevention, recovery, rehabilitation, and transformation.
Towards a typology for constrained climate model forecasts
In recent years several methodologies have been developed to combine and interpret ensembles of climate models with the aim of quantifying uncertainties in climate projections. Constrained climate model forecasts have been generated by combining various choices of metrics used to weight individual ensemble members with diverse approaches to sampling the ensemble. The forecasts obtained are often significantly different, even when based on the same model output. Therefore, a climate model forecast classification system can serve two roles: to provide a way for forecast producers to self-classify their forecasts; and to provide information on the methodological assumptions underlying the forecast generation and its uncertainty when forecasts are used for impacts studies. In this review we propose a possible classification system based on choices of metrics and sampling strategies. We illustrate the impact of some of the possible choices on the uncertainty quantification of large-scale projections of temperature and precipitation changes, and briefly discuss possible connections between climate forecast uncertainty quantification and decision-making approaches in the climate change context.
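The classification proposed here hinges on two choices: the metric used to weight ensemble members and the strategy for sampling the ensemble. The sketch below shows one generic weighting pattern, with skill-based weights normalised to sum to one and applied to member projections; the metric, the numbers and the exponential weighting form are illustrative assumptions, not a recommended scheme.

```python
# Illustrative ensemble weighting: each member gets a weight from a skill
# metric (here, an RMSE against some observed reference), and the constrained
# projection is the weighted mean. All numbers are placeholders.
import numpy as np

rmse = np.array([0.8, 1.2, 0.5, 2.0, 1.0])          # hypothetical metric per model
projection = np.array([2.1, 3.0, 2.4, 4.2, 2.8])    # hypothetical projected change per model (K)

weights = np.exp(-0.5 * (rmse / rmse.min()) ** 2)   # one possible metric-to-weight mapping (assumed)
weights /= weights.sum()                            # normalise so the weights sum to one

weighted_mean = weights @ projection
unweighted_mean = projection.mean()
print(f"unweighted: {unweighted_mean:.2f} K, weighted: {weighted_mean:.2f} K")
```

Different metrics or weighting forms can pull the constrained estimate in different directions even for identical model output, which is precisely why the review argues that forecasts should carry an explicit classification of these choices.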
Detection and attribution of human influence on regional precipitation
Understanding how human influence on climate is affecting precipitation around the world is immensely important for defining mitigation policies, and for adaptation planning. Yet despite increasing evidence for the influence of climate change on global patterns of precipitation, and expectations that significant changes in regional precipitation should have already occurred as a result of human influence on climate, compelling evidence of anthropogenic fingerprints on regional precipitation is obscured by observational and modelling uncertainties and is likely to remain so using current methods for years to come. This is in spite of substantial ongoing improvements in models, new reanalyses and a satellite record that spans over thirty years. If we are to quantify how human-induced climate change is affecting the regional water cycle, we need to consider novel ways of identifying the effects of natural and anthropogenic influences on precipitation that take full advantage of our physical expectations.
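A simple way to see why regional precipitation signals remain obscured is to compare an assumed forced trend with the spread of trends that internal variability alone can generate over the observational period. The sketch below does this with entirely synthetic numbers.

```python
# Toy signal-to-noise illustration (synthetic numbers, not real data):
# a regional precipitation trend is hard to detect when internal variability
# produces spurious trends of comparable or larger size over the record length.
import numpy as np

rng = np.random.default_rng(2)
years = 40
forced_trend = 0.3                    # assumed forced change, % per decade
noise_std = 8.0                       # assumed interannual variability, %

# distribution of trends arising from internal variability alone
trends = []
t = np.arange(years)
for _ in range(1000):
    series = rng.normal(scale=noise_std, size=years)
    trends.append(np.polyfit(t, series, 1)[0] * 10)       # slope in % per decade

print(f"signal {forced_trend:.2f} vs noise-trend spread {np.std(trends):.2f} (% per decade)")
```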
The XMM Cluster Survey: evolution of the velocity dispersion–temperature relation over half a Hubble time
We measure the evolution of the velocity dispersion–temperature (σ_v–T_X) relation up to z = 1 using a sample of 38 galaxy clusters drawn from the XMM Cluster Survey. This work improves upon previous studies through the use of a homogeneous cluster sample and in terms of the number of high-redshift clusters included. We present here new redshift and velocity dispersion measurements for 12 z > 0.5 clusters observed with the Gemini Multi-Object Spectrograph instruments on the Gemini telescopes. Using an orthogonal regression method, we find that the slope of the relation is steeper than that expected if clusters were self-similar, and that the evolution of the normalization is slightly negative, but not significantly different from zero (σ_v ∝ T^(0.86±0.14) E(z)^(−0.37±0.33)). We verify our results by applying our methods to cosmological hydrodynamical simulations. The lack of evolution seen in our data is consistent with simulations that include both feedback and radiative cooling.
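The orthogonal regression referred to above can be sketched as a total-least-squares fit in log space. The example below uses scipy's ODR package with synthetic data; the toy slope and scatter, and the omission of the E(z) term and of per-point measurement errors, are simplifying assumptions.

```python
# Sketch of an orthogonal (total least squares) fit of log10(sigma_v) against
# log10(T_X), broadly the kind of regression described above. Data are
# synthetic placeholders; a real fit would also include the E(z) evolution
# term and measurement errors on both axes.
import numpy as np
from scipy import odr

rng = np.random.default_rng(3)
logT = rng.uniform(0.3, 1.0, size=38)                            # hypothetical log10(T / keV)
logsig = 2.5 + 0.86 * logT + rng.normal(scale=0.05, size=38)     # hypothetical log10(sigma_v)

model = odr.Model(lambda beta, x: beta[0] + beta[1] * x)         # straight line in log space
fit = odr.ODR(odr.RealData(logT, logsig), model, beta0=[2.0, 1.0]).run()
intercept, slope = fit.beta
print(f"slope = {slope:.2f} +/- {fit.sd_beta[1]:.2f}")
```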
A common framework for approaches to extreme event attribution
The extent to which a given extreme weather or climate event is attributable to anthropogenic climate change is a question of considerable public interest. From a scientific perspective, the question can be framed in various ways, and the answer depends very much on the framing. One such framing is a risk-based approach, which answers the question probabilistically, in terms of a change in likelihood of a class of event similar to the one in question, and natural variability is treated as noise. A rather different framing is a storyline approach, which examines the role of the various factors contributing to the event as it unfolded, including the anomalous aspects of natural variability, and answers the question deterministically. It is argued that these two apparently irreconcilable approaches can be viewed within a common framework, where the most useful level of conditioning will depend on the question being asked and the uncertainties involved.
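In the risk-based framing, the headline quantity is usually a probability ratio (or the related fraction of attributable risk) comparing event probabilities in the factual and counterfactual climates. The sketch below computes both from exceedance counts in two synthetic ensembles; the distributions and threshold are invented for illustration.

```python
# Risk-based framing in miniature: compare the probability of exceeding an
# event threshold in ensembles with and without anthropogenic forcing.
# Ensemble values are synthetic; only the bookkeeping is illustrated.
import numpy as np

rng = np.random.default_rng(4)
threshold = 2.0                                               # hypothetical event magnitude
factual = rng.normal(loc=0.5, scale=1.0, size=10000)          # "world as it is"
counterfactual = rng.normal(loc=0.0, scale=1.0, size=10000)   # "world that might have been"

p1 = np.mean(factual >= threshold)        # exceedance probability with forcing
p0 = np.mean(counterfactual >= threshold) # exceedance probability without forcing
print(f"probability ratio PR = {p1 / p0:.2f}")
print(f"fraction of attributable risk FAR = {1 - p0 / p1:.2f}")
```

A probability ratio above one indicates that events of this class have become more likely in the factual climate; the storyline framing instead asks how the particular event, conditioned on its observed dynamical setting, was modified.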
