Policy Gradients for CVaR-Constrained MDPs
We study a risk-constrained version of the stochastic shortest path (SSP)
problem, where the risk measure considered is Conditional Value-at-Risk (CVaR).
We propose two algorithms that obtain a locally risk-optimal policy by
employing four tools: stochastic approximation, mini-batches, policy gradients,
and importance sampling. Both algorithms incorporate a CVaR estimation
procedure along the lines of Bardou et al. [2009], which in turn is based on
the Rockafellar-Uryasev representation of CVaR, and use the likelihood-ratio
principle to estimate the gradient of the sum of one cost function (the
objective of the SSP) and the gradient of the CVaR of the sum of another cost
function (the constraint of the SSP). The algorithms differ in how they
approximate the CVaR estimates and the necessary gradients: the first
algorithm uses stochastic approximation, while the second employs mini-batches
in the spirit of Monte Carlo methods. We establish asymptotic convergence of
both algorithms. Further, since estimating CVaR is related to rare-event
simulation, we incorporate an importance-sampling-based variance reduction
scheme into our proposed algorithms.
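To make the estimation step concrete, the following is a minimal stochastic-approximation sketch of the Rockafellar-Uryasev representation the abstract invokes; the step sizes, names, and stand-alone setting are illustrative assumptions of ours, not the authors' algorithm.

```python
import numpy as np

def cvar_sa_estimate(sample_cost, alpha=0.95, n_iter=10_000, seed=0):
    """Stochastic-approximation estimate of VaR/CVaR via the
    Rockafellar-Uryasev representation
        CVaR_a(X) = min_v  v + E[(X - v)^+] / (1 - a).
    `sample_cost` draws one i.i.d. cost sample per call.
    (Illustrative sketch only; not the paper's constrained-MDP algorithm.)
    """
    rng = np.random.default_rng(seed)
    v, cvar = 0.0, 0.0
    for n in range(1, n_iter + 1):
        x = sample_cost(rng)
        step = 1.0 / n                      # Robbins-Monro step size
        # stochastic gradient of v + E[(X - v)^+]/(1-a) with respect to v
        v -= step * (1.0 - (x >= v) / (1.0 - alpha))
        # running average of the RU objective at the current v
        cvar += step * (v + max(x - v, 0.0) / (1.0 - alpha) - cvar)
    return v, cvar  # (VaR estimate, CVaR estimate)

# usage: CVaR of a standard normal cost at level 0.95
var95, cvar95 = cvar_sa_estimate(lambda rng: rng.normal(), alpha=0.95)
```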
Optimal consumption and investment with bounded downside risk for power utility functions
We investigate optimal consumption and investment problems for a
Black-Scholes market under uniform restrictions on Value-at-Risk and Expected
Shortfall. We formulate various utility maximization problems, which can be
solved explicitly. We compare the optimal solutions, in the form of the
optimal value, optimal control, and optimal wealth, to those of analogous
problems under additional uniform risk bounds. Our proofs are partly based on
solutions to Hamilton-Jacobi-Bellman equations, and we prove a corresponding
verification theorem. This work was supported by the European Science
Foundation through the AMaMeF programme.
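For orientation, the unconstrained benchmark that such problems modify is Merton's classical solution: for power utility U(x) = x^γ/γ with γ < 1, γ ≠ 0, in a Black-Scholes market, the optimal fraction of wealth held in the risky asset is constant. The notation μ, r, σ (drift, riskless rate, volatility) is ours; this is the standard textbook result, not a formula from the paper.

```latex
\pi^{*} \;=\; \frac{\mu - r}{(1-\gamma)\,\sigma^{2}}
```

Here 1-γ is the relative risk aversion; loosely speaking, uniform Value-at-Risk or Expected Shortfall bounds act by shrinking the risky exposure whenever this unconstrained fraction would violate the risk budget.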
CVaR minimization by the SRA algorithm
Using the risk measure CVaR in financial analysis has become more and more
popular recently. In this paper we apply CVaR to portfolio optimization. The
problem is formulated as a two-stage stochastic programming model, and the SRA
algorithm, a recently developed heuristic algorithm, is applied to minimize
CVaR.
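For context, with S equally likely return scenarios r_s and portfolio weights w, single-period CVaR minimization at level α classically reduces to the Rockafellar-Uryasev linear program below; the two-stage model in the paper is more general, so this is background, not the paper's formulation, and the notation is ours.

```latex
\min_{w,\,\nu,\,z}\;\; \nu + \frac{1}{(1-\alpha)S}\sum_{s=1}^{S} z_s
\qquad \text{s.t.}\qquad
z_s \ge -r_s^{\top} w - \nu,\quad z_s \ge 0,\quad \mathbf{1}^{\top} w = 1
```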
Inf-convolution of G-expectations
In this paper we discuss optimal risk transfer problems when risk measures
are generated by G-expectations, and we present the relationship between the
inf-convolution of G-expectations and the inf-convolution of the drivers G.
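For readers unfamiliar with the operation, the inf-convolution of two risk measures ρ₁, ρ₂ is standardly defined as follows (the paper studies this operation with G-expectations in the roles of the ρᵢ):

```latex
(\rho_1 \,\square\, \rho_2)(X) \;=\; \inf_{Y}\,\bigl\{ \rho_1(X - Y) + \rho_2(Y) \bigr\}
```

This formalizes the optimal sharing of a total risk X between two agents, each assessing its share with its own risk measure.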
Sliding Phases in XY-Models, Crystals, and Cationic Lipid-DNA Complexes
We predict the existence of a totally new class of phases in weakly coupled,
three-dimensional stacks of two-dimensional (2D) XY-models. These "sliding
phases" behave essentially like decoupled, independent 2D XY-models with
precisely zero free energy cost associated with rotating spins in one layer
relative to those in neighboring layers. As a result, the two-point spin
correlation function decays algebraically with in-plane separation. Our
results, which contradict past studies because we include higher-gradient
couplings between layers, also apply to crystals and may explain recently
observed behavior in cationic lipid-DNA complexes.
Evolutionary multi-stage financial scenario tree generation
Multi-stage financial decision optimization under uncertainty depends on a
careful numerical approximation of the underlying stochastic process, which
describes the future returns of the selected assets or asset categories.
Various approaches towards an optimal generation of discrete-time,
discrete-state approximations (represented as scenario trees) have been
suggested in the literature. In this paper, a new evolutionary algorithm to
create scenario trees for multi-stage financial optimization models will be
presented. Numerical results and implementation details conclude the paper.
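As a rough illustration of the general approach, here is a toy evolutionary loop that breeds scenario sets toward target per-stage moments; the tree representation, fitness measure, and operators are simplified placeholders of ours and differ from the algorithm in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_tree(branching=(3, 3), mu=0.05, sigma=0.2):
    """A scenario tree flattened to an array of root-to-leaf return paths
    with equal path probabilities (toy representation, not the paper's)."""
    n_paths = int(np.prod(branching))
    return rng.normal(mu, sigma, size=(n_paths, len(branching)))

def fitness(tree, target_mu=0.05, target_sigma=0.2):
    """Negative squared error between per-stage sample moments and the
    moments of the target return process; higher is better."""
    err = (tree.mean(axis=0) - target_mu) ** 2
    err += (tree.std(axis=0) - target_sigma) ** 2
    return -err.sum()

def evolve(pop_size=50, generations=200):
    pop = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]              # truncation selection
        children = [t + rng.normal(0, 0.01, t.shape)  # Gaussian mutation
                    for t in survivors]
        pop = survivors + children
    return max(pop, key=fitness)

best_tree = evolve()
```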
EIT: Solar corona synoptic observations from SOHO with an Extreme-ultraviolet Imaging Telescope
The Extreme-ultraviolet Imaging Telescope (EIT) of SOHO (Solar and Heliospheric Observatory) will provide full-disk images in emission lines formed at temperatures that map solar structures ranging from the chromospheric network to the hot magnetically confined plasma in the corona. Images in four narrow bandpasses will be obtained using normal-incidence multilayered optics deposited on quadrants of a Ritchey-Chretien telescope. The EIT is capable of providing a uniform one arc second resolution over its entire 50 by 50 arc min field of view. Data from the EIT will be extremely valuable for identifying and interpreting the spatial and temperature fine structures of the solar atmosphere. Temporal analysis will provide information on the stability of these structures and identify dynamical processes. EIT images, issued daily, will provide the global coronal context to aid in unifying the investigations and in forming the observing plans for the SOHO coronal instruments.
Multivariate risks and depth-trimmed regions
We describe a general framework for measuring risks, where the risk measure
takes values in an abstract cone. It is shown that this approach naturally
includes the classical risk measures and set-valued risk measures and yields a
natural definition of vector-valued risk measures. Several main constructions
of risk measures are described in this abstract axiomatic framework.
It is shown that the concept of depth-trimmed (or central) regions from
multivariate statistics is closely related to the definition of risk measures.
In particular, halfspace trimming corresponds to Value-at-Risk, while zonoid
trimming yields the expected shortfall. In the abstract framework, it is shown
how to establish a two-way correspondence between risk measures and
depth-trimmed regions. It is also demonstrated how the lattice structure of
the space of risk values influences this relationship.
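The two classical measures that the halfspace and zonoid trimmings correspond to can be computed empirically as follows; sign and interpolation conventions vary across the literature, and this sketch adopts one common choice.

```python
import numpy as np

def var_es(losses, alpha=0.95):
    """Empirical Value-at-Risk and expected shortfall of a loss sample.
    VaR_a is the a-quantile of the loss; ES_a averages losses beyond it.
    (One common convention; others differ in sign and interpolation.)
    """
    losses = np.asarray(losses)
    var = np.quantile(losses, alpha)
    es = losses[losses >= var].mean()
    return var, es

rng = np.random.default_rng(1)
var95, es95 = var_es(rng.standard_normal(100_000), alpha=0.95)
# for a standard normal loss: VaR_0.95 ~ 1.645, ES_0.95 ~ 2.06
```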
HMM based scenario generation for an investment optimisation problem
The geometric Brownian motion (GBM) is a standard method for modelling financial time series. An important criticism of this method is that the parameters of the GBM are assumed to be constant; because of this, important features of the time series, such as extreme behaviour or volatility clustering, cannot be captured. We propose an approach in which the parameters of the GBM are able to switch between regimes; more precisely, they are governed by a hidden Markov chain. Thus, we model the financial time series via a hidden Markov model (HMM) with a GBM in each state. Using this approach, we generate scenarios for a financial portfolio optimisation problem in which the portfolio CVaR is minimised. Numerical results are presented. This study was funded by NET ACE at OptiRisk Systems.
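A minimal sketch of the generative side of such a model is given below: one price path from a GBM whose drift and volatility are governed by a two-state Markov chain. Parameter values are illustrative assumptions of ours, not the paper's calibrated ones, and a true HMM would additionally infer the hidden states from data (e.g. via Baum-Welch), which this sketch omits.

```python
import numpy as np

def simulate_regime_gbm(n_steps=250, dt=1 / 250, s0=100.0, seed=0,
                        mu=(0.10, -0.05), sigma=(0.15, 0.35),
                        trans=((0.98, 0.02), (0.05, 0.95))):
    """One scenario path from a GBM with Markov-switching parameters:
    state 0 is a calm regime, state 1 a turbulent one.  All parameter
    values are illustrative placeholders."""
    rng = np.random.default_rng(seed)
    trans = np.asarray(trans)
    state, s, path = 0, s0, [s0]
    for _ in range(n_steps):
        state = rng.choice(2, p=trans[state])   # regime transition
        z = rng.standard_normal()
        # exact GBM step under the current regime's drift/volatility
        s *= np.exp((mu[state] - 0.5 * sigma[state] ** 2) * dt
                    + sigma[state] * np.sqrt(dt) * z)
        path.append(s)
    return np.array(path)

scenario = simulate_regime_gbm()
```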
Modelling stochastic bivariate mortality
Stochastic mortality, i.e. modelling death arrival via a jump process with stochastic intensity, is gaining increasing acceptance as a way to represent mortality risk. This paper represents a first attempt to model the mortality risk of couples of individuals according to the stochastic intensity approach.
On the theoretical side, we extend the Cox process setup to couples, i.e. the idea that mortality is driven by a jump process whose intensity is itself a stochastic process, specific to a particular generation within each gender. Dependence between the survival times of the members of a couple is captured by an Archimedean copula.
On the calibration side, we fit the joint survival function by calibrating the (analytical) copula and the (analytical) margins separately. First, we select the best-fit copula according to the methodology of Wang and Wells (2000) for censored data. Then, we provide a sample-based calibration for the intensity, using a time-homogeneous, non-mean-reverting, affine process: this gives the analytical marginal survival functions. Coupling the best-fit copula with the calibrated margins, we obtain, for a sample generation, a joint survival function which incorporates the stochastic nature of mortality improvements and is far from representing independence. On the contrary, since the best-fit copula turns out to be a Nelsen one, dependence increases with age and long-term dependence exists.
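The coupling step has the standard survival-copula form below, where C is the fitted Archimedean copula and S_x, S_y the calibrated marginal survival functions of the two members (notation is ours):

```latex
S(t_1, t_2) \;=\; \Pr(T_x > t_1,\; T_y > t_2) \;=\; C\bigl(S_x(t_1),\, S_y(t_2)\bigr)
```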
