331 research outputs found

    Stochastic Differential Games and Viscosity Solutions of Hamilton-Jacobi-Bellman-Isaacs Equations

    In this paper we study zero-sum two-player stochastic differential games with the help of the theory of Backward Stochastic Differential Equations (BSDEs). On the one hand, we generalize the results of the pioneering work of Fleming and Souganidis by considering cost functionals defined by controlled BSDEs and by allowing the admissible control processes to depend on events occurring before the beginning of the game (which implies that the cost functionals become random variables). On the other hand, the application of BSDE methods, in particular the notion of stochastic "backward semigroups" introduced by Peng, allows us to prove a dynamic programming principle for the upper and the lower value functions of the game in a straightforward way, without passing through additional approximations. The upper and the lower value functions are proved to be the unique viscosity solutions of the upper and the lower Hamilton-Jacobi-Bellman-Isaacs equations, respectively. For this, Peng's BSDE method is translated from the framework of stochastic control theory into that of stochastic differential games.
    Comment: The results were presented by Rainer Buckdahn at the "12th International Symposium on Dynamic Games and Applications" in Sophia-Antipolis (France) in June 2006; they were also reported by Juan Li at the 2nd Workshop on "Stochastic Equations and Related Topics" in Jena (Germany) in July 2006 and at a seminar at the ETH of Zurich in November 200
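    For orientation, the lower Hamilton-Jacobi-Bellman-Isaacs equation referred to in this abstract takes, in a standard formulation, the following form (the notation here is illustrative and not taken from the paper itself):

    ```latex
    % Lower HJB-Isaacs equation for the lower value function W (illustrative notation):
    %   b, \sigma are the controlled drift and diffusion coefficients,
    %   f is the driver of the controlled BSDE, \Phi is the terminal cost,
    %   u \in U and v \in V are the controls of the two players.
    \begin{aligned}
    &\partial_t W(t,x) + H^-\!\big(t, x, W(t,x), DW(t,x), D^2 W(t,x)\big) = 0,
    \qquad W(T,x) = \Phi(x),\\[4pt]
    &H^-(t,x,y,p,A) = \sup_{u \in U}\, \inf_{v \in V}\,
    \Big[ \tfrac{1}{2}\,\mathrm{tr}\big(\sigma\sigma^{\top}(t,x,u,v)\,A\big)
    + p \cdot b(t,x,u,v) + f\big(t,x,y,\, p\,\sigma(t,x,u,v),\, u, v\big) \Big].
    \end{aligned}
    ```

    The upper Isaacs equation is obtained by exchanging the order of the supremum and the infimum in the Hamiltonian.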

    Stochastic control problems for systems driven by normal martingales

    In this paper we study a class of stochastic control problems in which the control of the jump size is essential. Such a model generalizes various applied problems, ranging from optimal reinsurance selection for general insurance models to queueing theory. The main novel point of such a control problem is that by changing the jump size of the system, one essentially changes the type of the driving martingale. Such a feature does not seem to have been investigated in any existing stochastic control literature. We shall first provide a rigorous theoretical foundation for the control problem by establishing an existence result for the multidimensional structure equation on a Wiener--Poisson space, given an arbitrary bounded jump size control process, and by providing an auxiliary counterexample showing the nonuniqueness of such solutions. Based on these theoretical results, we then formulate the control problem, prove the Bellman principle, and derive the corresponding Hamilton--Jacobi--Bellman (HJB) equation, which in this case is a mixed second-order partial differential/difference equation. Finally, we prove a uniqueness result for the viscosity solution of such an HJB equation.
    Comment: Published in the Annals of Applied Probability (http://www.imstat.org/aap/) at http://dx.doi.org/10.1214/07-AAP467 by the Institute of Mathematical Statistics (http://www.imstat.org)
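    The structure equation mentioned above is, in the one-dimensional case, Émery's equation characterizing a normal martingale M; with a jump-size control process it reads as follows (a sketch in standard notation, with the control process named here only for illustration):

    ```latex
    % Structure equation for a normal martingale M (one-dimensional sketch):
    %   [M] denotes the quadratic variation of M, and \varphi_t is the
    %   (here, controlled) jump-size process.
    d[M]_t = dt + \varphi_t \, dM_t .
    % Special cases: \varphi \equiv 0 gives Brownian motion, while
    % \varphi \equiv c \neq 0 gives a compensated Poisson-type martingale,
    % illustrating how the jump-size control changes the driving martingale.
    ```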

    Integral-Partial Differential Equations of Isaacs' Type Related to Stochastic Differential Games with Jumps

    In this paper we study zero-sum two-player stochastic differential games with jumps with the help of the theory of Backward Stochastic Differential Equations (BSDEs). We generalize the results of Fleming and Souganidis [10] and those of Biswas [3] by considering a controlled stochastic system driven by a d-dimensional Brownian motion and a Poisson random measure, and by associating nonlinear cost functionals defined by controlled BSDEs. Moreover, unlike both papers cited above, we allow the admissible control processes of both players to depend on all events occurring before the beginning of the game. This quite natural extension allows the players to take such earlier events into account, and it makes it even easier to derive the dynamic programming principle. The price to pay is that the cost functionals become random variables, so the upper and the lower value functions of the game are also a priori random fields. The use of a new method allows us to prove that, in fact, the upper and the lower value functions are deterministic. On the other hand, the application of BSDE methods [18] allows us to prove a dynamic programming principle for the upper and the lower value functions in a very straightforward way, as well as the fact that they are the unique viscosity solutions of the upper and the lower integral-partial differential equations of Hamilton-Jacobi-Bellman-Isaacs' type, respectively. Finally, the existence of the value of the game is obtained in this more general setting if Isaacs' condition holds.
    Comment: 30 pages
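    Isaacs' condition, under which the abstract states the game has a value, is the requirement that the upper and lower Hamiltonians coincide; in generic notation (H standing for the relevant controlled Hamiltonian, not a formula from the paper):

    ```latex
    % Isaacs' condition: the order of optimization over the two players'
    % controls does not matter, for all arguments (t,x,y,p,A):
    \sup_{u \in U}\, \inf_{v \in V} H(t,x,y,p,A,u,v)
    \;=\; \inf_{v \in V}\, \sup_{u \in U} H(t,x,y,p,A,u,v).
    ```

    When this holds, the upper and lower Isaacs equations coincide, so the upper and lower value functions solve the same equation and are equal by uniqueness of the viscosity solution.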

    Stochastic Verification Theorem of Forward-Backward Controlled Systems for Viscosity Solutions

    In this paper, we investigate the controlled system described by forward-backward stochastic differential equations with the control contained in the drift, the diffusion, and the generator of the BSDE. A new verification theorem is derived within the framework of viscosity solutions, without involving any derivatives of the value functions. It is worth pointing out that this theorem has wider applicability than the restrictive classical verification theorems. As a related problem, optimal stochastic feedback controls for the forward-backward system are discussed as well.
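    A forward-backward controlled system of the kind described in this abstract can be sketched as follows (illustrative notation only; the coefficients b, σ, f and terminal function Φ are generic placeholders, not the paper's):

    ```latex
    % Controlled forward-backward stochastic differential equation (sketch):
    % the control u_t enters the drift b, the diffusion \sigma,
    % and the generator f of the backward equation.
    \begin{aligned}
    dX_t &= b(t, X_t, u_t)\,dt + \sigma(t, X_t, u_t)\,dW_t, & X_0 &= x,\\
    dY_t &= -f(t, X_t, Y_t, Z_t, u_t)\,dt + Z_t\,dW_t, & Y_T &= \Phi(X_T),
    \end{aligned}
    % with cost functional J(u) = Y_0 and value function V(x) = \inf_u J(u);
    % a verification theorem certifies that a candidate control attains this infimum.
    ```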