Calibration of a conceptual rainfall-runoff model for flood frequency estimation by continuous simulation
An approach is described to the calibration of a conceptual rainfall-runoff model, the Probability Distributed Model (PDM), for estimating flood frequencies at gauged sites by continuous flow simulation. A first step was the estimation of routing store parameters by recession curve analysis. Uniform random sampling was then used to search for parameter sets that produced simulations achieving the best fit to observed hourly flow data over a 2-year period. Goodness of fit was expressed in terms of four objective functions designed to give different degrees of weight to peaks in flow. Flood frequency results were improved, if necessary, by manual adjustment of parameters, with reference to peaks extracted from the entire hourly flow record. Although the primary aim was to reproduce observed peaks, consideration was also given to finding parameter sets capable of generating a realistic overall characterization of the flow regime. Examples are shown where the calibrated model generated simulations that reproduced well the magnitude and frequency distribution of peak flows. Factors affecting the acceptability of these simulations are discussed. For an example catchment, a sensitivity analysis shows that there may be more than one set of parameter values well suited to the simulation of peak flows.
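The uniform random-sampling calibration described above can be sketched as follows. This is a minimal illustration, not the paper's method: the PDM is replaced by a hypothetical single linear store (`toy_model`), and `peak_weighted_nse` is an assumed example of an objective function that gives extra weight to peaks in flow, not one of the four functions used in the study.

```python
import numpy as np

rng = np.random.default_rng(42)

def peak_weighted_nse(observed, simulated, exponent=2.0):
    """Nash-Sutcliffe-style efficiency with weights that emphasise high flows
    (hypothetical weighting scheme for illustration; 1.0 is a perfect fit)."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    w = (observed / observed.max()) ** exponent
    return 1.0 - np.sum(w * (observed - simulated) ** 2) / np.sum(
        w * (observed - observed.mean()) ** 2
    )

def toy_model(rain, k, s_max):
    """Stand-in for the PDM: one linear store with capacity s_max and
    residence-time parameter k (outflow = store / k each step)."""
    store, flow = 0.0, []
    for r in rain:
        store = min(store + r, s_max)  # spill any volume above capacity
        q = store / k
        store -= q
        flow.append(q)
    return np.array(flow)

# Synthetic "observed" record generated from known parameters, so the
# search has a recoverable target.
rain = rng.exponential(2.0, size=500)
observed = toy_model(rain, k=5.0, s_max=40.0)

# Uniform random sampling over the feasible parameter space, keeping the
# parameter set with the best peak-weighted fit.
best_score, best_params = -np.inf, None
for _ in range(2000):
    k = rng.uniform(1.0, 20.0)
    s_max = rng.uniform(10.0, 100.0)
    score = peak_weighted_nse(observed, toy_model(rain, k, s_max))
    if score > best_score:
        best_score, best_params = score, (k, s_max)

print(best_score, best_params)
```

With enough samples the search typically finds a near-optimal set, and comparing the survivors under several objective functions is one way to expose the equifinality (multiple well-performing parameter sets) noted in the abstract.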
Remote Sensing-based precision agriculture tool for the sugar industry
This project aimed to develop remote sensing applications that were both relevant and of commercial benefit to the Australian sugar industry and therefore adoptable. Such applications included the in-season mapping of crop vigour so as to guide future management strategies, the identification of specific abiotic and biotic cropping constraints, and the conversion of GNDVI variability maps into yield at the block, farm and regional level. In order to achieve these applications the project team reviewed an array of remote sensing platforms, timing of imagery capture, software and analysis protocols, as well as distribution formats of derived imagery products, to a range of end users. The project developed strong collaborative linkages with all levels of the industry including mills, productivity services, agronomists, growers and researchers and increased its initial coverage from three individual farms in Bundaberg, Burdekin and the Herbert, coinciding with project CSE022, to include over 33,000 crops grown across 6 growing regions (Mulgrave, Herbert, Burdekin, Bundaberg, ISIS and Condong) during the 2011/2012 season.
Have applications of continuous rainfall-runoff simulation realized the vision for process-based flood frequency analysis?
Keith Beven was amongst the first to propose and demonstrate a combination of conceptual rainfall–runoff modelling and stochastically generated rainfall data in what is known as the ‘continuous simulation’ approach for flood frequency analysis. The motivations included the potential to establish better links with physical processes and to avoid restrictive assumptions inherent in existing methods applied in design flood studies. Subsequently, attempts have been made to establish continuous simulation as a routine method for flood frequency analysis, particularly in the UK. The approach has not been adopted universally, but numerous studies have benefitted from applications of continuous simulation methods. This paper asks whether industry has yet realized the vision of the pioneering research by Beven and others. It reviews the generic methodology and illustrates applications of the original vision for a more physically realistic approach to flood frequency analysis through a set of practical case studies, highlighting why continuous simulation was useful and appropriate in each case. The case studies illustrate how continuous simulation has helped to offer users of flood frequency analysis more confidence about model results by avoiding (or exposing) bad assumptions relating to catchment heterogeneity, inappropriateness of assumptions made in (UK) industry-standard design event flood estimation methods, and the representation of engineered or natural dynamic controls on flood flows. By implementing the vision for physically realistic analysis of flood frequency through continuous simulation, each of these examples illustrates how more relevant and improved information was provided for flood risk decision-making than would have been possible using standard methods.
They further demonstrate that integrating engineered infrastructure into flood frequency analysis and assessment of environmental change are also significant motivations for adopting the continuous simulation approach in practice.
Improving statistical models for flood risk assessment
Widespread flooding, such as the events in the winter of 2013/2014 in the UK and early summer 2013 in Central Europe, demonstrate clearly how important it is to understand the characteristics of floods in which multiple locations experience extreme river flows. Recent developments in multivariate statistical modelling help to place such events in a probabilistic framework. It is now possible to perform joint probability analysis of events defined in terms of physical variables at hundreds of locations simultaneously, over multiple variables (including river flows, rainfall and sea levels), combined with analysis of temporal dependence to capture the evolution of events over a large domain. Critical constraints on such data-driven methods are the problems of missing data, especially where records over a network are not all concurrent, the joint analysis of several different physical variables, and the choice of suitable time scales when combining information from those variables. This paper presents new developments of a high-dimensional conditional probability model for extreme river flow events conditioned on flow and rainfall observations. These are: a new computationally efficient parametric approach to account for missing data in the joint analysis of extremes over a large hydrometric network; a robust approach for the spatial interpolation of extreme events throughout a large river network; generation of realistic estimates of extremes at ungauged locations; and exploiting rainfall information rationally within the statistical model to help improve efficiency. These methodological advances will be illustrated with data from the UK river network and recent events to show how they contribute to a flexible and effective framework for flood risk assessment, with applications in the insurance sector and for national-scale emergency planning.
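The conditional-extremes idea underlying such models (in the spirit of Heffernan and Tawn) can be sketched on synthetic two-site data as below. Everything here is a simplification assumed for illustration: the data are artificial, the crude moment-style grid search stands in for a proper pseudo-likelihood fit, and the residual-resampling step is only the skeleton of conditional simulation, not the paper's actual estimation procedure.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic flood-peak series at two gauges with upper-tail dependence,
# built from a shared shock plus site-specific noise (illustrative only).
n = 20000
shock = rng.gumbel(size=n)
site_a = shock + 0.3 * rng.gumbel(size=n)
site_b = 0.8 * shock + 0.5 * rng.gumbel(size=n)

# Conditional model on Gumbel-type margins:
#   Y | X = x  ~  a*x + x**b * Z   for x above a high threshold,
# where Z is a residual distribution estimated from the data.
u = np.quantile(site_a, 0.95)      # conditioning threshold at site A
exceed = site_a > u
x, y = site_a[exceed], site_b[exceed]

# Crude grid search for (a, b): prefer residuals with small spread that
# are uncorrelated with x (a real fit would maximise a likelihood).
best = None
for a in np.linspace(0.0, 1.0, 21):
    for b in np.linspace(0.0, 1.0, 21):
        z = (y - a * x) / x**b
        cost = z.std() + abs(np.corrcoef(x, z)[0, 1])
        if best is None or cost < best[0]:
            best = (cost, a, b, z)

_, a_hat, b_hat, z_resid = best

# Simulate new extreme events at site A and propagate them to site B by
# resampling the fitted residuals -- the conditional simulation step.
x_new = u + rng.exponential(scale=x.mean() - u, size=1000)
y_new = a_hat * x_new + x_new**b_hat * rng.choice(z_resid, size=1000)
print(a_hat, b_hat, y_new.mean())
```

Scaling this from two sites to hundreds, with missing-data handling and multiple variables, is precisely where the computational and methodological developments described in the abstract come in.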
Developing sustainability pathways for social simulation tools and services
The use of cloud technologies to teach agent-based modelling and simulation (ABMS) is an interesting application of a nascent technological paradigm that has received very little attention in the literature. This report fills that gap and aims to help instructors, teachers and demonstrators to understand why and how cloud services are appropriate solutions to common problems they face delivering their study programmes, as well as outlining the many cloud options available. The report first introduces social simulation and considers how social simulation is taught. Following this, factors affecting the implementation of agent-based models are explored, with attention focused primarily on the modelling and execution platforms currently available, the challenges associated with implementing agent-based models, and the technical architectures that can be used to support the modelling, simulation and teaching process. This sets the context for an extended discussion on cloud computing including service and deployment models, accessing cloud resources, the financial implications of adopting the cloud, and an introduction to the evaluation of cloud services within the context of developing, executing and teaching agent-based models.
Development of large scale inland flood scenarios for disaster response planning based on spatial/temporal conditional probability analysis
Extreme event scenarios are useful for civil emergency services to help in developing contingency plans for responding effectively to major flooding incidents. In the UK, the official national risk register includes a scenario for inland flooding (from rivers and other sources), which is described in terms of a probability of occurrence over a five year period of between 1 in 200 and 1 in 20. This scenario was previously based on recent extreme floods, in conjunction with maps produced to aid in development planning on floodplains. At the time it was constructed, it was not feasible to assess scientifically the combined probability of a nationally-significant flood event of this type, therefore the scenario probability assessment was ambiguous.
Recent developments in multivariate extreme value statistics now allow the probability of large scale flood events to be assessed with reference to hydrological summary statistics or impact metrics. Building on theory and pilot studies by Heffernan and Tawn [1], Lamb et al. [2] and Keef et al. [3], we describe the development of a set of national-scale scenarios based on a high-dimensional (ca. 1,100 locations) conditional probability analysis of extreme river flows and rainfall. The methodology provides a theoretically justified basis for extrapolation into the joint tail of the distribution of these variables, which is then used to simulate extreme events with associated probabilities. The probabilistic events are compared with current understanding of meteorological scenarios associated with significant, large-scale flooding in the UK, and with historical flooding, in order to identify plausible events that can inform national risk scenarios. Additionally, scenarios combining inland and coastal extremes have been considered by linking the analysis discussed in this paper with methods presented in a companion paper by Wyncoll et al.
Fallow deer (Dama dama dama) management in Roman South-East Britain
This paper presents new carbon, nitrogen and sulphur isotope data for European fallow deer (Dama dama dama) in Roman Britain and discusses results in light of evidence from classical texts, landscape archaeology, zooarchaeology and the limited available samples of metric data. The new isotope data presented here are from Fishbourne Roman Palace (Sussex), two sites on the Isle of Thanet (Kent) and a further two sites in London. In spite of small sample sizes, the data make an important contribution to the very limited corpus of scientific research on the species and provide new resolution to the nature of fallow deer movement and management in Roman Britain.
Myocardial scar surface area identified by LGE MRI is an independent predictor of mortality in post-infarction patients
Online Information Resource Mediation of Interorganizational Relationships: A Work-In-Progress Technical Research Synopsis
As we design and construct new technologies to expand and transform global information infrastructures, we will undoubtedly face new challenges, but we can also expect to encounter some of the same problems that have shaped and constrained existing information technologies. Many of these challenges are technological. Others are environmental or organizational in nature. The study that we describe in this paper is designed to examine one particular information technology--online information resources. We focus on the ways in which people within organizations use these services in their day-to-day work, and we examine the roles that these resources play in the routine activities which mediate interorganizational relationships. Online information (OI) resources have been part of the information infrastructure since the early 1970s. They are curated collections of indexed electronic databases with supporting distribution services. Online service vendors have traditionally provided fee-for-service modem access to mainframes containing these databases of strategic business, scientific, legal and financial information. Initially, the services were entirely text-based. Now, most OI service providers supplement their mainframe offerings with GUI-enhanced CD-ROM products. They have also begun to provide additional access points via consumer utilities like CompuServe and America OnLine, and the World Wide Web. Systematic studies of commercial uses of OI resources show that particular institutions, such as the legal, financial and biotechnology communities, use OI resources much more than other institutions. And particular ways of using OI resources, such as via information centers and intermediaries, seem to be more common than others.
However, we have found in our prior research that conceptions of OI resource usability and usage patterns, which characterize OI resource use as intensive, direct and non-intermediated, do not match observed use; and pressures to conform to these expectations rather than the actualities are perceived, by OI resource providers and information specialist intermediaries, to be increasing. Without an adequate understanding of successful current use, significant modification of those patterns is likely to prove difficult, unnecessary, counter-productive, or all of the above. In order to develop a better understanding of successful current use, we have begun an interpretive study of the ways in which OI resources are used, when they are used, how intensively they are used and by whom. This study will help to identify how organizations make use of OI resources. It will extend our understanding of information resource usage within the domain of networked technologies, as we describe and examine the interorganizational relationships which involve the use of these resources.
