
    On the Economics of Ramping Rate Restrictions at Hydro Power Plants: Balancing Profitability and Environmental Costs

    This paper examines the impact of ramping rate restrictions imposed on hydro operations to protect aquatic ecosystems. A dynamic optimization model of the profit-maximizing decisions of a hydro operator is solved for various restrictions on water flow, using data for a representative hydro operation in Ontario. Profits are negatively affected, but for a range of restrictions the impact is not large. Ramping restrictions cause a redistribution of hydro production over a given day, which can result in an increase in total hydro power produced. This affects the need for power from other sources, with consequent environmental impacts.
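The trade-off the abstract describes can be illustrated with a toy version of the operator's problem: choose discharge levels to maximize daily revenue subject to a water budget and a ramping limit. All prices and limits below are hypothetical, and the brute-force search stands in for the paper's dynamic optimization model.

```python
from itertools import product

# Hypothetical hourly prices and plant limits (not the paper's Ontario data)
prices = [1.0, 3.0, 2.0, 1.0]   # $/unit of output in each of 4 periods
q_levels = range(4)             # discrete discharge levels: 0..3 units/period
water_budget = 6                # total water units available for the day

def best_schedule(ramp_limit):
    """Exhaustively search feasible schedules; return (revenue, schedule)."""
    best_val, best_sched = float("-inf"), None
    for sched in product(q_levels, repeat=len(prices)):
        if sum(sched) > water_budget:
            continue  # water budget violated
        if any(abs(a - b) > ramp_limit for a, b in zip(sched, sched[1:])):
            continue  # ramping restriction violated
        val = sum(p * q for p, q in zip(prices, sched))
        if val > best_val:
            best_val, best_sched = val, sched
    return best_val, best_sched

unrestricted = best_schedule(ramp_limit=3)  # limit never binds at these levels
restricted = best_schedule(ramp_limit=1)    # tight ramping restriction
```

In this toy instance the tight ramping limit forces the operator to spread production into shoulder periods around the price peak, lowering revenue; this mirrors the redistribution effect the abstract describes, though the welfare comparison in the paper is far richer.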

    Contrasting two approaches in real options valuation: contingent claims versus dynamic programming

    This paper compares two well-known approaches for valuing a risky investment using real options theory: contingent claims (CC) with risk-neutral valuation and dynamic programming (DP) using a constant risk-adjusted discount rate. Both approaches have been used in valuing forest assets. A proof is presented which shows that, except under certain restrictive assumptions, DP using a constant discount rate and CC will not yield the same answers for investment value. A few special cases are considered for which CC and DP with a constant discount rate are consistent with each other. An optimal tree harvesting example is presented to illustrate that the values obtained using the two approaches can differ when we depart from these special cases to a more realistic scenario. Further, the implied risk-adjusted discount rate calculated from CC is found to vary with the stochastic state variable and stand age. We conclude that for real options problems the CC approach should be used.
    Keywords: optimal tree harvesting, real options, contingent claims, dynamic programming
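The core discrepancy between the two approaches can be seen in a one-period binomial sketch: CC discounts the risk-neutral expectation at the risk-free rate, while DP with a constant rate discounts the physical expectation at a fixed risk-adjusted rate. All numbers below are hypothetical and chosen only to make the gap visible; this is not the paper's harvesting example.

```python
# One-period binomial sketch of the CC vs DP discrepancy (hypothetical numbers)
S0, K = 100.0, 100.0   # current asset price and strike of a call-like claim
u, d = 1.2, 0.8        # up/down price factors
r_f = 0.02             # one-period risk-free rate
p_phys = 0.6           # physical (real-world) up probability
rho = 0.08             # constant risk-adjusted discount rate used by DP

payoff_up = max(S0 * u - K, 0.0)   # 20.0
payoff_dn = max(S0 * d - K, 0.0)   # 0.0

# Contingent claims: discount the risk-neutral expectation at the risk-free rate
q = ((1 + r_f) - d) / (u - d)      # risk-neutral up probability
v_cc = (q * payoff_up + (1 - q) * payoff_dn) / (1 + r_f)

# DP with a constant discount rate: discount the physical expectation at rho,
# regardless of how the claim's riskiness varies with the state
v_dp = (p_phys * payoff_up + (1 - p_phys) * payoff_dn) / (1 + rho)
```

Because the claim's risk is not proportional to the underlying asset's risk in every state, no single constant rho reproduces the CC value across all states and dates, which is the intuition behind the paper's result that the implied risk-adjusted rate varies with the state variable and stand age.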

    The Impact of Stochastic Convenience Yield on Long-term Forestry Investment Decisions

    This paper investigates whether convenience yield is an important factor in determining optimal decisions for a forestry investment. The Kalman filter method is used to estimate three different models of lumber prices: a mean reverting model, a simple geometric Brownian motion and the two-factor price model due to Schwartz (1997). In the latter model there are two correlated stochastic factors: spot price and convenience yield. The two-factor model is shown to provide a reasonable fit of the term structure of lumber futures prices. The impact of convenience yield on a forestry investment decision is examined using the Schwartz (1997) long-term model which transforms the two-factor price model into a single factor model with a composite price. Using the long-term model an optimal harvesting problem is analyzed, which requires the numerical solution of an impulse control problem formulated as a Hamilton-Jacobi-Bellman Variational Inequality. We compare the results for the long-term model to those from single-factor mean reverting and geometric Brownian motion models. The inclusion of convenience yield through the long-term model is found to have a significant impact on land value and optimal harvesting decisions.
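A mean-reverting price model of the kind estimated here can be sketched with a discretized Ornstein-Uhlenbeck process. The snippet below simulates such a path and recovers the reversion speed by ordinary least squares on the implied AR(1) relation; this is a much simpler stand-in for the paper's Kalman filter estimation, and all parameter values are hypothetical.

```python
import random

random.seed(42)

# Simulate a discretized mean-reverting (Ornstein-Uhlenbeck) log-price:
#   X[t+1] = X[t] + kappa * (mu - X[t]) * dt + sigma * sqrt(dt) * eps
kappa_true, mu, sigma, dt = 2.0, 1.0, 0.3, 0.01
n = 20000
x = [mu]
for _ in range(n):
    eps = random.gauss(0.0, 1.0)
    x.append(x[-1] + kappa_true * (mu - x[-1]) * dt + sigma * dt ** 0.5 * eps)

# The discretization is an AR(1): X[t+1] = a + b * X[t] + noise, with
# b = 1 - kappa * dt, so OLS on (X[t], X[t+1]) pairs recovers
# kappa = (1 - b) / dt.
xs, ys = x[:-1], x[1:]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
b = sum((xi - mx) * (yi - my) for xi, yi in zip(xs, ys)) / \
    sum((xi - mx) ** 2 for xi in xs)
kappa_hat = (1.0 - b) / dt
```

The Kalman filter plays the same estimation role in the paper but handles latent state variables, such as the unobserved convenience yield in the two-factor model, which simple OLS on observed prices cannot.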

    Regime switching in stochastic models of commodity prices: An application to an optimal tree harvesting problem

    This paper investigates a regime switching model of stochastic lumber prices in the context of an optimal tree harvesting problem. Using lumber derivatives prices, two lumber price models are calibrated: a regime switching model and a single regime model. In the regime switching model, the lumber price can be in one of two regimes in which different mean reverting price processes prevail. An optimal tree harvesting problem is specified in terms of a linear complementarity problem which is solved using a fully implicit, fully coupled finite difference numerical approach. The land value and critical harvesting prices are found to be significantly different depending on which price model is used. The regime switching model shows promise as a parsimonious model of timber prices that can be incorporated into forestry investment problems.
    Keywords: optimal tree harvesting, regime switching, calibration, lumber derivatives prices, fully implicit finite difference approach
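The price dynamics in such a model can be sketched as two mean-reverting processes tied together by a two-state Markov chain. The snippet below simulates a path under illustrative, uncalibrated parameters; it shows only the price model, not the linear complementarity formulation or the finite difference solver used for the harvesting problem.

```python
import random

random.seed(7)

# Hypothetical parameters for a two-regime mean-reverting log-price model:
# each regime has its own long-run level, reversion speed, and volatility,
# and the active regime follows a two-state Markov chain (values are
# illustrative, not calibrated to lumber derivatives prices).
params = {
    0: {"kappa": 1.5, "mu": 4.0, "sigma": 0.2},   # "low-price" regime
    1: {"kappa": 0.5, "mu": 5.0, "sigma": 0.4},   # "high-price" regime
}
switch_prob = {0: 0.02, 1: 0.05}  # per-step probability of leaving each regime
dt = 0.01

def simulate(n_steps, x0=4.0, regime0=0):
    """Simulate (log-price path, regime path) under the switching model."""
    x, regime = [x0], [regime0]
    for _ in range(n_steps):
        s = regime[-1]
        if random.random() < switch_prob[s]:
            s = 1 - s                     # jump to the other regime
        p = params[s]
        drift = p["kappa"] * (p["mu"] - x[-1]) * dt
        shock = p["sigma"] * dt ** 0.5 * random.gauss(0.0, 1.0)
        x.append(x[-1] + drift + shock)
        regime.append(s)
    return x, regime

path, reg = simulate(1000)
```

In the harvesting problem, the land value must be computed jointly for both regimes, since the option to delay harvest in one regime depends on the value attainable after a switch; this coupling is why the paper's numerical scheme solves both regimes' equations simultaneously.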

    The Universe at Extreme Scale: Multi-Petaflop Sky Simulation on the BG/Q

    Remarkable observational advances have established a compelling cross-validated model of the Universe. Yet, two key pillars of this model -- dark matter and dark energy -- remain mysterious. Sky surveys that map billions of galaxies to explore the 'Dark Universe' demand a corresponding extreme-scale simulation capability; the HACC (Hybrid/Hardware Accelerated Cosmology Code) framework has been designed to deliver this level of performance now, and into the future. With its novel algorithmic structure, HACC allows flexible tuning across diverse architectures, including accelerated and multi-core systems. On the IBM BG/Q, HACC attains unprecedented scalable performance -- currently 13.94 PFlops at 69.2% of peak and 90% parallel efficiency on 1,572,864 cores with an equal number of MPI ranks, and a concurrency of 6.3 million. This level of performance was achieved at extreme problem sizes, including a benchmark run with more than 3.6 trillion particles, significantly larger than any cosmological simulation yet performed.
    Comment: 11 pages, 11 figures, final version of paper for talk presented at SC1

    NASA Accident Precursor Analysis Handbook, Version 1.0

    Catastrophic accidents are usually preceded by precursory events that, although observable, are not recognized as harbingers of a tragedy until after the fact. In the nuclear industry, the Three Mile Island accident was preceded by at least two events portending the potential for severe consequences from an underappreciated causal mechanism. Anomalies whose failure mechanisms were integral to the losses of Space Transportation Systems (STS) Challenger and Columbia had been occurring within the STS fleet prior to those accidents. Both the Rogers Commission Report and the Columbia Accident Investigation Board report found that processes in place at the time did not respond to the prior anomalies in a way that shed light on their true risk implications. This includes the concern that, in the words of the NASA Aerospace Safety Advisory Panel (ASAP), "no process addresses the need to update a hazard analysis when anomalies occur". At a broader level, the ASAP noted in 2007 that NASA "could better gauge the likelihood of losses by developing leading indicators, rather than continue to depend on lagging indicators". These observations suggest a need to revalidate prior assumptions and conclusions of existing safety (and reliability) analyses, as well as to consider the potential for previously unrecognized accident scenarios, when unexpected or otherwise undesired behaviors of the system are observed. This need is also discussed in NASA's system safety handbook, which advocates a view of safety assurance as driving a program to take steps that are necessary to establish and maintain a valid and credible argument for the safety of its missions. It is the premise of this handbook that making cases for safety more experience-based allows NASA to be better informed about the safety performance of its systems, and will ultimately help it to manage safety in a more effective manner.
The APA process described in this handbook provides a systematic means of analyzing candidate accident precursors by evaluating anomaly occurrences for their system safety implications and, through both analytical and deliberative methods used to project to other circumstances, identifying those that portend more serious consequences to come if effective corrective action is not taken. APA builds upon existing safety analysis processes currently in practice within NASA, leveraging their results to provide an improved understanding of overall system risk. As such, APA represents an important dimension of safety evaluation; as operational experience is acquired, precursor information is generated such that it can be fed back into system safety analyses to risk-inform safety improvements. Importantly, APA utilizes anomaly data to predict risk, whereas standard reliability and PRA approaches utilize failure data, which is often limited and rare.