On the Modelling of Impulse Control with Random Effects for Continuous Markov Processes

22 Aug 2019  ·  K. L. Helmes, R. H. Stockbridge, C. Zhu

The use of coordinate processes to model impulse control for general Markov processes typically involves the construction of a probability measure on a countable product of copies of the path space. In addition, admissibility of an impulse control policy requires that the random intervention times be stopping times with respect to the different filtrations arising from the component coordinate processes. When the underlying Markov process has continuous paths, however, a simpler model can be developed which takes the single path space as its probability space and uses the natural filtration, with respect to which the intervention times must be stopping times. Moreover, this construction allows for impulse control with random effects, whereby the decision maker selects a distribution for the new state. This paper gives the construction of the probability measure on the path space for an admissible intervention policy subject to a randomized impulse mechanism. In addition, a class of policies is defined for which the paths between interventions are independent, along with a further subclass for which the cycles following the initial cycle are identically distributed. A benefit of restricting to this smaller subclass is that classical renewal arguments can be used to analyze long-term average control problems. Further, the paper defines a class of stationary impulse policies for which the family of models forms a Markov family. The decision to use an $(s,S)$ ordering policy in inventory management provides an example of an impulse policy under which the process has i.i.d. cycles and the family of models forms a Markov family.
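
To make the renewal argument concrete, here is a minimal simulation sketch, not taken from the paper: it runs an illustrative $(s,S)$ impulse policy for an inventory level driven by Brownian motion with negative drift, randomizes the post-order level to mimic an impulse with random effects, and estimates the long-term average cost as the ratio of expected cost per cycle to expected cycle length. All names and parameter values (mu, sigma, s_level, S_level, K, h, and the Gaussian perturbation of the post-order state) are assumptions chosen purely for illustration.

```python
import numpy as np

# Minimal sketch (not from the paper): simulate an (s, S) impulse policy for an
# inventory process driven by Brownian motion with negative drift, and estimate
# the long-term average cost via the renewal-reward ratio
#   average cost = E[cost per cycle] / E[cycle length].
# All parameters below are illustrative assumptions, not values from the paper.

rng = np.random.default_rng(0)

mu, sigma = -1.0, 0.5        # drift and volatility of the uncontrolled inventory
s_level, S_level = 1.0, 5.0  # order when inventory falls to s; reorder up to about S
K, h = 2.0, 0.3              # fixed ordering cost, holding cost per unit per time
dt = 1e-2                    # Euler time step

def run_cycle():
    """One renewal cycle: start just after an order, run until the next order.

    Returns (cycle_length, cycle_cost). The post-order level is randomized
    (S plus a small Gaussian perturbation) to mimic an impulse with random
    effects, i.e. the controller selects a distribution for the new state.
    """
    x = S_level + 0.2 * rng.standard_normal()   # randomized post-intervention state
    t, cost = 0.0, 0.0
    while x > s_level:
        x += mu * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        t += dt
        cost += h * max(x, 0.0) * dt             # accumulate holding cost
    return t, cost + K                           # add the fixed intervention cost

lengths, costs = zip(*(run_cycle() for _ in range(500)))
print("estimated long-term average cost:", np.sum(costs) / np.sum(lengths))
```

Because each cycle starts from the same post-order distribution, the cycles are i.i.d. and the ratio estimator above converges to the long-run average cost by the renewal-reward theorem, which is the role the i.i.d.-cycle subclass of policies plays in the paper.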

Categories

Optimization and Control; Probability; MSC 93E20, 60H30