Z. Wahrscheinlichkeitstheorie und verwandte Gebiete 41, 39-58 (1977)
© by Springer-Verlag 1977
Markov Solutions of Stochastic Differential Equations
Philip Protter
Mathematics Department, Duke University, Durham, N.C. 27706, USA
1. Introduction
Let (W_t) be the Wiener process. Then it is well known that for Lipschitz continuous f, g, the stochastic differential equation of K. Itô

(1.1)    X_t = X_0 + \int_0^t f(s, X_s)\,dW_s + \int_0^t g(s, X_s)\,ds
has a unique solution which is a Markov process with continuous paths. Moreover, if f and g in (1.1) satisfy f(t, x) = f(x), g(t, x) = g(x) (i.e., they are autonomous), then X is a time-homogeneous strong Markov process (cf., e.g., [10]). Kunita and Watanabe [14], Doléans-Dade [7], and Meyer [7, 16] have developed a martingale integral which includes Itô's integral for the Wiener process. C. Doléans-Dade [9] and the author [17, 18] have shown that unique solutions exist for equations of the form

(1.2)    X_t = X_0 + \int_0^t f(s, X_{s-})\,dZ_s + \int_0^t g(s, X_{s-})\,dA_s

where f and g are (say) jointly continuous and Lipschitz in the space variable, and Z and A are semimartingales. In this paper we will be interested in the cases where Z is a Markov process and a semimartingale, and A_t is an additive functional of Z. This allows one to consider models in which the (random) driving term is not white noise but, for example, only has stationary, independent increments (and so may have jumps), or is simply a Markov (e.g., Hunt) process. In this paper we determine the nature of the Markov properties which solutions of (1.2) have.
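To fix ideas, the following minimal numerical sketch (purely illustrative, not part of the argument) discretizes an equation of the form (1.2) with the driving semimartingale Z taken to be Brownian motion plus compound Poisson jumps and A_t = t; the coefficients f, g and every numerical parameter are hypothetical choices, and the point of the sketch is only that the integrands are evaluated at the left limit X_{s-}:

import numpy as np

# Euler-type discretization of X_t = X_0 + int_0^t f(X_{s-}) dZ_s + int_0^t g(X_{s-}) ds,
# where Z is Brownian motion plus compound Poisson jumps (a Levy process) and A_t = t.
# All coefficients and parameters below are illustrative placeholders.

rng = np.random.default_rng(0)

def f(x):                      # Lipschitz coefficient multiplying dZ
    return 1.0 + 0.5 * np.sin(x)

def g(x):                      # Lipschitz coefficient multiplying ds
    return -0.2 * x

def simulate(x0, T=1.0, n=2000, sigma=1.0, jump_rate=3.0, jump_scale=0.5):
    dt = T / n
    x = x0
    path = [x0]
    for _ in range(n):
        dW = sigma * rng.normal(0.0, np.sqrt(dt))
        k = rng.poisson(jump_rate * dt)                    # number of jumps in the cell
        dJ = rng.normal(0.0, jump_scale, k).sum() if k else 0.0
        dZ = dW + dJ
        x = x + f(x) * dZ + g(x) * dt                      # coefficients use the pre-jump value X_{s-}
        path.append(x)
    return np.array(path)

print(simulate(1.0)[-1])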
If Z = (\Omega, \mathscr{F}, \mathscr{F}_t, \theta_t, P^z, Z_t) is a Hunt process [2, p. 45], in order for Equation (1.2) to be meaningful, Z must be a P^z semimartingale. If Z is a P^z semimartingale for every z, then a priori it may have a different decomposition for every z. In Section 3 we give a sufficient condition for Z to have a decomposition which holds for all P^z simultaneously. We call such a process a universal semimartingale. We show that if a universal semimartingale satisfies an additional condition then it remains a universal semimartingale after a change of time.
In Section 4 we prove several technical lemmas involving properties of solutions of equations of the form (1.2), as well as lemmas that arise from the Markov framework: the uncountable family of measures (P^z), z \in \mathbb{R}. We show that the solution X of (1.2) is independent of z.
In Section 5 we show that if A_t = t and Z has independent increments and is a semimartingale, then the solution X of (1.2) is strong Markov. Theorem (5.3) treats the cases in which Z either has independent increments alone or is a Lévy process. This result extends a classical result of Itô. Theorem (5.8) shows that if Z is merely assumed to be a Markov process then the vector process (X, Z) will be Markov, though X in (1.2) need not be Markov. Perhaps the most interesting situation is that treated in Theorem (5.9), which considers the case where Z is a Hunt process and f and g are autonomous. Then the vector process (X, Z) is a time-homogeneous strong Markov process with transition semigroup

P_t h(x, z) = E^{x,z}[h(X_t, Z_t)].

One can define a shift operator for the vector process (X, Z) of Theorem (5.9), and if one assumes the additive functional (A_s) in (1.2) is quasi-left-continuous then the process (X, Z) is a Hunt process. In Section 6 we calculate explicitly a Lévy system of (X, Z) in terms of the coefficients f, g, the jumps of (A_t), and a Lévy system of Z.
I wish to thank R.K. Getoor for suggesting this investigation, and J. Jacod, M.J. Sharpe and R.L. Wolpert for several helpful suggestions.
2. Preliminaries
Although we assume that the reader is familiar with the stochastic integral for local martingales given in Kunita and Watanabe [14], Doléans-Dade [7], and Meyer [7, 16], we restate here some of the important definitions in this development. Let (\Omega, \mathscr{F}, \mathscr{F}_t, P) be a probability space where the filtration (\mathscr{F}_t) is complete and right continuous. A right continuous adapted process M_t with left limits and M_0 = 0 is a local martingale if there exist stopping times T^n increasing to \infty such that M^{T^n} is a uniformly integrable martingale for each n. The stopping times (T^n) are said to reduce M. We let \mathscr{V} be the class of processes V which are adapted, V_0 = 0, and which have right continuous paths of bounded variation on compact intervals. We denote the total variation of a path between 0 and t by |V(\omega)|_t = \int_0^t |dV_s(\omega)|. A process Y is said to be a semimartingale (for a measure P) if it can be decomposed Y_t = Y_0 + M_t + V_t where M is a local martingale and V \in \mathscr{V}. The above decomposition is not unique. However, there exists at most one decomposition in which the process V \in \mathscr{V} is previsible [5] (also denoted "predictable"). Such a decomposition is called canonical, and semimartingales which have a canonical decomposition are called special (see [16, p. 310] for their properties). If a local martingale M is locally square integrable, there exists a unique increasing previsible process \langle M, M\rangle_t \in \mathscr{V} such that M_t^2 - \langle M, M\rangle_t is a local martingale and has value 0 at t = 0. The process \langle M, N\rangle is defined by polarization.
Equality of processes will mean indistinguishability relative to a measure. The letter Z will be reserved for Markov processes, and generally our notation is that of Blumenthal and Getoor [2]. If Z = (\Omega, \mathscr{F}, \mathscr{F}_t, \theta_t, P^z, Z_t) is a Hunt process, equality of processes will mean indistinguishability for every measure P^z. All of our Markov processes have state space \mathbb{R}, and \mathscr{B}, \mathscr{B}^* denote, respectively, the Borel sets and the universally measurable sets on \mathbb{R}. We use the notation of Dellacherie for stochastic intervals: [[T, S)) = \{(t, \omega): T(\omega) \le t < S(\omega)\}. For a process Y we let Y_{t-}(\omega) = \lim_{s \uparrow t} Y_s(\omega), and we denote \Delta Y_t(\omega) = Y_t(\omega) - Y_{t-}(\omega).
3. Universal Semimartingales
Throughout this section we will assume Z = (\Omega, \mathscr{F}, \mathscr{F}_t, Z_t, \theta_t, P^z) is a Hunt process with state space \mathbb{R} [2, p. 45]. We let \mathscr{F}_t^0 = \sigma(Z_s; s \le t), and \mathscr{F}_t^\mu is the completion of \mathscr{F}_t^0 under P^\mu, where \mu is a finite measure on \mathscr{B}, the Borel sets of \mathbb{R}. We let \mathscr{F}_t = \bigcap_{\mu\ \mathrm{finite}} \mathscr{F}_t^\mu.
Definition. Let X be a process on \Omega. Then

(i) X is a complete semimartingale if X is a P^z semimartingale for every z.

(ii) X is a universal semimartingale if it is a complete semimartingale, and if there is a decomposition

(3.1)    X_t = X_0 + M_t + V_t

where M is a local martingale for every P^z, and V \in \mathscr{V}.

(iii) X is a universally reducible semimartingale if it is a universal semimartingale and if there exist stopping times (T^n) tending to \infty which in a universal decomposition reduce the local martingale term for every P^z.
Suppose that X is a complete semimartingale. Define

(3.2)    J_t = \sum_{s \le t} \Delta X_s 1_{\{|\Delta X_s| > 1\}},

(3.3)    Y_t = X_t - J_t.

Then Y_t has bounded jumps, so the process (\sum_{0 < s \le t} \Delta Y_s^2)^{1/2} is locally integrable, and Y is hence a special semimartingale for every P^z [16, p. 310]. For a given P^z let

(3.4)    Y_t = Y_0 + M_t^z + B_t^z

be its canonical decomposition, where M_t^z is a P^z local martingale, and B^z \in \mathscr{V} and is previsible.
(3.5) Definition. Let X be a process on \Omega which is a complete semimartingale. Let J_t be as in (3.2), and Y_t as in (3.3). X is said to satisfy the stopping condition if there exists a sequence of stopping times (T^n) tending to \infty which does not depend on z, and is such that for every z, E^z(|B^z|_{T^n}) < \infty, where B^z is given in (3.4).
We now establish a useful lemma, which is an adaptation to our situation of one due to Doléans-Dade [9]. We will need a notion of previsibility on the space (\Omega, \mathscr{F}, \mathscr{F}_t, P^z), and results due to Sharpe [19]. We let \mathscr{N}^\mu denote the P^\mu-evanescent processes (i.e., Y \in \mathscr{N}^\mu if P^\mu\{\exists t: Y_t \ne 0\} = 0). Let \mathscr{N} = \bigcap_\mu \mathscr{N}^\mu be the evanescent processes. Let \mathscr{P}, the previsible \sigma-algebra, be the \sigma-algebra on \mathbb{R}_+ \times \Omega generated by \mathscr{N} and the family of processes Y adapted to (\mathscr{F}_t) and such that t \to Y_t(\omega) is left continuous with right limits. If \mathscr{P}^\mu denotes the previsible \sigma-algebra on \mathbb{R}_+ \times \Omega relative to (\Omega, \mathscr{F}^\mu, \mathscr{F}_t^\mu, P^\mu) for some finite \mu, then clearly \mathscr{P} \subset \mathscr{P}^\mu.
(3.6) Lemma. Let M be a universally reducible local martingale for all P^z, and \beta > 0. Then M can be decomposed into M = N + B, where N is a local martingale for all P^z and B \in \mathscr{V}. There exists a null set \Lambda (i.e., P^z(\Lambda) = 0 for all z) such that |\Delta N_s| \le \beta for all s \in \mathbb{R}_+, off \Lambda.

Proof. Following the technique of Doléans-Dade, we define stopping times (S_n) by:

S_1 = \inf\{t: |\Delta M_t| > \beta\},
S_n = \inf\{t: t > S_{n-1}, |\Delta M_t| > \beta\}.

Let C_t = \sum_n \Delta M_{S_n} 1_{\{t \ge S_n\}}, and let T^k be stopping times increasing to \infty which universally reduce M. Let R^k = T^k \wedge S_k. Then C_{t \wedge R^k} is of integrable variation for all P^z. By the results of Sharpe [19], we know there exists a process \tilde{C}^k \in \mathscr{P} such that \tilde{C}^k is the dual previsible projection (also known as the "compensator") for each system (\Omega, \mathscr{F}^\mu, \mathscr{F}_t^\mu, P^\mu). Thus C_{t \wedge R^k} - \tilde{C}_t^k is a local martingale for each P^z. By the uniqueness of the dual previsible projection, one can define \tilde{C} on [[0, \infty)) by setting \tilde{C} = \tilde{C}^k on [[0, R^k]].

Moreover, since the filtration (\mathscr{F}_t) comes from a Hunt process, \tilde{C} has continuous paths a.s. P^\mu, each \mu. Then the local martingale N = M - C + \tilde{C} satisfies

|\Delta N_s| \le |\Delta M_s - \Delta C_s| + |\Delta \tilde{C}_s| = |\Delta M_s - \Delta C_s| \le \beta.

Taking B = C - \tilde{C} completes the proof.
(3.7) Theorem. A complete semimartingale X satisfies the stopping condition if and only if it is universally reducible.

Proof. Necessity. Suppose X satisfies the stopping condition. Let J_t be as in (3.2) and Y_t be as in (3.3). Since X is a complete semimartingale it has "càdlàg" paths (continue à droite, limites à gauche), and so J_t \in \mathscr{V}. We wish to show Y_t is universal. Choose K' large, let R = \inf\{t: |Y_t| > K'\}, and let K = K' + 1. Then |Y_t^R| \le K. Let (T^n) be the stopping times tending to \infty such that E^z(B_{T^n}^z) < \infty, where B_t^z = \int_0^t |dA_s^z|, and Y_t = Y_0 + M_t^z + A_t^z is the canonical decomposition. Let S = S^n = R \wedge T^n, so that S^n tends to \infty, and M_{t \wedge S}^z is a local martingale such that E^z(|M_{t \wedge S}^z|) < \infty. Let Y be implicitly stopped at S. Then for G \in b\mathscr{F}_s, we have

E^z[G(Y_{t \wedge U_k} - Y_{s \wedge U_k})] = E^z[G(A_{t \wedge U_k}^z - A_{s \wedge U_k}^z)],

where the stopping times U_k tending to \infty reduce M_{t \wedge S}^z. Taking limits as k \to \infty yields

E^z[G(Y_t - Y_s)] = E^z[G(A_t^z - A_s^z)].

We conclude that z \to E^z[G(A_t^z - A_s^z)] is \mathscr{B}^* measurable, for G \in b\mathscr{F}_s^0. A monotone class argument shows that z \to E^z \int_0^\infty H_s\,dA_s^z is \mathscr{B}^* measurable for any bounded previsible integrand H.
We now use again the results of Sharpe [19]. We will show the existence of a process A \in \mathscr{P} \cap \mathscr{V} which is the dual previsible projection of A^z for each system (\Omega, \mathscr{F}^z, \mathscr{F}_t^z, P^z). Since the proofs of these results are omitted in [19], we will provide those details which are needed here. Let G \in b\mathscr{F}^0 and H' = G \otimes 1_{[0,t]}, that is, H_s'(\omega) = G(\omega) 1_{\{s \le t\}}. Then by Sharpe [19] there exists a process H which is previsible and P^\mu-indistinguishable from H', for all \mu. Let \mathscr{F}^{00} = \sigma\{f(Z_s); s \ge 0, f \in \mathscr{B}\} and \mathscr{F}^{0*} = \sigma\{f(Z_s); s \ge 0, f \in \mathscr{B}^*\}. We define a kernel Q_t on \mathscr{F}^{00} by Q_t^z(G) = \mu^z(G \otimes 1_{[0,t]}), where \mu^z(C) = E^z \int_0^\infty C_s\,dA_s^z. Then Q_t^z \ll P^z, and Q_t^z is a finite kernel, since Y (and hence A^z) is stopped at S. Since \mathscr{F}^{00} is separable, we may use Doob's lemma [15, p. 154] and conclude that there exists A_t'(z, \omega) \in \mathscr{B}^* \otimes \mathscr{F}^{00} such that Q_t^z(d\omega) = A_t'(z, \omega) P^z(d\omega) as (signed) measures on \mathscr{F}^{00}. Since Z_0 is \mathscr{F}^{00}/\mathscr{B}^* measurable, we have A_t''(\omega) = A_t'(Z_0(\omega), \omega) \in \mathscr{F}^{0*}, and A_t''(\omega) = A_t'(z, \omega) a.s. P^z, for all z.

If G \in b\mathscr{F}, then given z we can find G_1 \le G \le G_2 such that G_i \in b\mathscr{F}^{00} and E^z(G_2 - G_1) = 0. This allows us to show that E^z(A_t'' G) = \mu^z(G \otimes 1_{[0,t]}) for G \in b\mathscr{F}. Define

A_t(\omega) = \lim A_s''(\omega)

as s decreases to t through \mathbb{Q}. By the right continuity of each A^z, we have A_t(\omega) = A_t^z(\omega) a.s. P^z, each z. For each z, A is P^z indistinguishable from A^z; hence

Y_t = M_t^z + A_t^z = M_t^z + A_t,

and Y_t - A_t = M_t^z implies M_t^z does not depend on z. Because the canonical decomposition is unique, we can define A_t on [[0, \infty)) by defining it on each stochastic interval [[0, S^n]]. We let A_t be equal on [[0, S^n]] to the process we obtain for [[0, S^n]].
Sufficiency. Suppose X is universally reducible. Let

X_t = X_0 + M_t + V_t

be a universally reducible decomposition. Then by Lemma (3.6) we can write X_t = X_0 + N_t + C_t + V_t, where N is a complete local martingale such that |\Delta N_t| \le 1 and C \in \mathscr{V}. Let J_t = \sum_{s \le t} \Delta X_s 1_{\{|\Delta X_s| \ge 1\}} and

Y_t = X_0 + N_t + (C_t - J_t).

Then Y_t has bounded jumps and so is special for every P^z, and |\Delta(C - J)_t| \le |\Delta Y_t| + |\Delta N_t| \le 2. Thus C - J is universally locally of integrable variation, which completes the proof.
We call a change of time a family (\tau_t)_{t \ge 0} of stopping times of the filtration (\mathscr{F}_t), such that for each \omega \in \Omega the function \tau_\cdot(\omega) is right continuous, non-decreasing, and finite. Kazamaki [13] has shown that semimartingales are preserved under changes of time. If X_t is a universal semimartingale and (\tau_t) is a change of time, then X_{\tau_t} is a complete semimartingale, but it is not clear that it is still universal. We do have, however, the following:

(3.8) Theorem. If X is a universally reducible semimartingale, and (\tau_t) is a change of time, then (X_{\tau_t}) is a universally reducible semimartingale for the system (\Omega, \mathscr{F}, \mathscr{G}_t), where \mathscr{G}_t = \mathscr{F}_{\tau_t}.

Proof. Let X_t = X_0 + M_t + B_t be a universal decomposition and let (T^n) be stopping times increasing to \infty which universally reduce the local martingale M. Let \hat{X}_t = X_{\tau_t}, \hat{M}_t = M_{\tau_t}, etc. Then

\hat{X}_t = \hat{X}_0 + \hat{M}_t + \hat{B}_t.

It is easy to see that since B \in \mathscr{V}, so also \hat{B} \in \mathscr{V}. Thus it suffices to show that \hat{M}_t, which is a complete semimartingale, is also a universally reducible semimartingale.
Let

\hat{J}_t = \sum_{s \le t} \Delta \hat{M}_s 1_{\{|\Delta \hat{M}_s| \ge 1\}}.

Then \hat{J} \in \mathscr{V}, since \hat{M} is a complete semimartingale. Denote

M_t^n = M_{t \wedge T^n}, \quad \hat{M}_t^n = M_{\tau_t}^n, \quad J_t^n = \sum_{s \le t} \Delta \hat{M}_s^n 1_{\{|\Delta \hat{M}_s^n| \ge 1\}}.

It is easy to see that \hat{M}_t^n is a uniformly integrable \mathscr{G}_t martingale. Let

S_1 = \inf\{s: |\Delta \hat{M}_s^n| > 1\},
S_{k+1} = \inf\{s: s > S_k, |\Delta \hat{M}_s^n| > 1\}.

Then the S_k are \mathscr{G} stopping times which increase to \infty and satisfy E^z(|J^n|_{S_k}) < \infty for every z. Thus J^n is universally of locally integrable variation. Let

\hat{T}^n = \inf\{u: \tau_u > T^n\}.

It is easy to check that \hat{T}^n is a \mathscr{G} stopping time. Let

\hat{N}_t = \hat{M}_t^n - J_t^n.

Then \hat{N} has bounded jumps and so is a special semimartingale for every P^z.
Note that \hat{M}_t = \hat{M}_t^n on \{t < \hat{T}^n\}, so J_t^n = \hat{J}_t on \{t < \hat{T}^n\}. Moreover, since

\hat{M}_{t \wedge \hat{T}^n} = \hat{M}_t^n + (M_{\tau_{\hat{T}^n}} - M_{T^n}) 1_{\{t \ge \hat{T}^n\}},

we have

(3.9)    \hat{N}_{t \wedge \hat{T}^n} = (\hat{M}_t^n - J_t^n) 1_{\{t < \hat{T}^n\}} + [(\hat{M}_{\hat{T}^n}^n - M_{T^n}) - (\hat{M}_{\hat{T}^n}^n - \hat{M}_{\hat{T}^n-}^n) 1_A] 1_{\{t \ge \hat{T}^n\}},

where A = \{|\hat{M}_{\hat{T}^n}^n - M_{T^n}| \ge 1\}. Then

(3.10)    |(\hat{M}_{\hat{T}^n}^n - M_{T^n}) - (\hat{M}_{\hat{T}^n}^n - \hat{M}_{\hat{T}^n-}^n) 1_A| = |(\hat{M}_{\hat{T}^n-}^n - M_{T^n}) 1_A| \le |\Delta \hat{M}_{\hat{T}^n}^n| + 1.

Rewrite (3.9) as

\hat{N}_{t \wedge \hat{T}^n} = \hat{M}_t^n - J_t^n + F_t^n.

Then \Delta F^n is universally locally integrable, so (3.10) implies that F_t^n is universally locally integrable. Hence, using Sharpe's results [19], we know \tilde{F}_t^n \in \mathscr{P} \cap \mathscr{V} exists and F_t^n - \tilde{F}_t^n is a local martingale for every P^\mu. Hence

(3.11)    \hat{M}_{t \wedge \hat{T}^n} = (\hat{M}_t^n + F_t^n - \tilde{F}_t^n) + \tilde{F}_t^n

is a semimartingale and the decomposition (3.11) is the canonical one. By the uniqueness of the canonical decomposition, the decomposition (3.11) on [[0, \hat{T}^n]] agrees with the analogous one on [[0, \hat{T}^{n+m}]] for any m > 0 when the latter is restricted to [[0, \hat{T}^n]]. We thus achieve a decomposition on [[0, \infty)). Hence \hat{N} is a universal semimartingale, so \hat{M} = \hat{N} + \hat{J} is one also, and so \hat{X} is a universal semimartingale, and the result is established.
(3.12) Example. Suppose the Hunt process Z of this section is actually a Lévy process; that is, a process with stationary, independent increments. Let J_t and Y_t be as given in (3.2) and (3.3) respectively. Then Y_t is again a Lévy process and Y_t has finite mean. We have

Z_t = Z_0 + [Y_t - E^0(Y_t)] + [J_t + E^0(Y_t)],

where for each P^z the first term in brackets is a martingale and the second term is in \mathscr{V}, since the function t \to E^0(Y_t) is an affine function. Thus a Lévy process is a universally reducible semimartingale, a fact pointed out, for example, in Doléans-Dade and Meyer [7].

Let C_t be a continuous additive functional [2, p. 148] of the Lévy process Z such that if

\tau_t = \inf\{s: C_s > t\},

then \tau_t < \infty for each t. The process (\tau_t) is a change of time. We denote X_t = Z_{\tau_t}. Then X_t is a strong Markov process [2, p. 212], where all of the obvious objects are time-changed. Since Z is universally reducible, by Theorem (3.8) X is a universally reducible semimartingale.
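A hedged numerical sketch of this time change (an illustration only): take C_t = \int_0^t c(Z_s)\,ds for a strictly positive function c, so that \tau_t = \inf\{s: C_s > t\} is finite, and read the time-changed path X_t = Z_{\tau_t} off a discretized Lévy path. The function c and all parameters below are hypothetical choices.

import numpy as np

# Change of time tau_t = inf{s : C_s > t} for the continuous additive functional
# C_t = int_0^t c(Z_s) ds of a simulated Levy path Z; X_t = Z_{tau_t} is the
# time-changed process.  The choice of c and all parameters are illustrative.

rng = np.random.default_rng(1)

def c(z):
    return 1.0 + z * z                      # strictly positive, so C is strictly increasing

T, n = 5.0, 5000
dt = T / n
dZ = 0.3 * rng.normal(0.0, np.sqrt(dt), n) + 0.5 * rng.poisson(1.0 * dt, n)
Z = np.concatenate(([0.0], np.cumsum(dZ)))              # Levy path on the time grid
C = np.concatenate(([0.0], np.cumsum(c(Z[:-1]) * dt)))  # additive functional on the grid

def tau_index(t):
    """Index of the first grid time s with C_s > t (valid for t < C_T)."""
    return int(np.searchsorted(C, t, side="right"))

for t in (0.5, 1.0, 2.0):
    i = tau_index(t)
    print(t, i * dt, Z[i])                  # t, tau_t, and X_t = Z_{tau_t}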
4. The Markov Framework
Let Z = (\Omega_1, \mathscr{F}, \mathscr{F}_t, Z_t, \theta_t, P^z) be a Hunt process with state space \mathbb{R}. We let \mathscr{F}_t^0 = \sigma(Z_s; s \le t) and \mathscr{F}_t^\mu is the completion of \mathscr{F}_t^0 under P^\mu. Let \mathscr{F}_t = \bigcap_\mu \mathscr{F}_t^\mu, where the intersection is over all finite \mu on \mathscr{B}, the Borel sets of \mathbb{R}. In order to allow more general initial conditions, we will need to consider a larger space. Define

(4.1)    \Omega = \mathbb{R} \times \Omega_1,    P^{x,z} = \varepsilon_x \times P^z,

where \varepsilon_x is point mass at x. Let \omega = (x, \omega_1) denote a point in \Omega and define

(4.2)    X_0(\omega) = x, when \omega = (x, \omega_1).

We let \mathscr{G}_t = \bigcap_\nu \mathscr{G}_t^\nu, where the intersection is over all finite \nu on \mathscr{B} \otimes \mathscr{B}. We assume \mathscr{G} = \bigvee_t \mathscr{G}_t. The process Z is defined on \Omega_1, and we extend it to \Omega by Z_t(\omega) = Z_t(\omega_1) 1_{\mathbb{R}}(x). By an additive functional of Z we will mean an adapted process A on \Omega_1 which has right continuous paths of bounded variation such that A_0 = 0, and such that A satisfies for all s, t

A_{t+s} = A_t + A_s \circ \theta_t

with equality meaning P^z indistinguishability. We extend A to \Omega by A_t(\omega) = A_t(\omega_1) when \omega = (x, \omega_1).

The process Z extended to \Omega is still a Hunt process, and the \sigma-fields \mathscr{G}_t are right continuous. Z is a semimartingale on \Omega_1 for P^z if and only if it is a semimartingale on \Omega for P^{x,z} for all x. Throughout this section Z will be a Hunt process and X_0 will be the random variable described in (4.2).
Suppose the Hunt process Z is a complete semimartingale (i.e., Z is a semimartingale for every P^{x,z}). Let f and g map \mathbb{R}_+ \times \mathbb{R} to \mathbb{R} and satisfy for all t \in \mathbb{R}_+ and x, y \in \mathbb{R}:

(4.3)    (i) |f(t, x) - f(t, y)| + |g(t, x) - g(t, y)| \le K|x - y|,
         (ii) f and g are left continuous in t with finite right limits.

Then it is known [9, 17, 18] that a unique solution exists for each P^{x,z} of

(4.4)    X_t^{x,z} = X_0 + \int_0^t f(s, X_{s-}^{x,z})\,dZ_s + \int_0^t g(s, X_{s-}^{x,z})\,dA_s

which, a priori, depends on P^{x,z}. The process (A_t) in (4.4) can be taken to be any complete semimartingale, but we will only be interested here in the case where A_t is an additive functional of Z.
Doléans-Dade [8] has shown that if Y is a complete semimartingale (for the family P^{x,z}) and H is a locally bounded previsible integrand, then there exists a version of \int_0^t H_s\,dY_s which does not depend on P^{x,z}. We will use a similar technique to show that X_t^{x,z} in (4.4) does not depend on P^{x,z}. We first establish a useful result.
(4.5) Lemma. Let \eta_t^0 = W_0, and

(4.6)    \eta_t^{n+1} = W_0 + \sum_{i=1}^m \int_0^t f_i(s, \eta_{s-}^n)\,dY_s^i

where the Y^i are P-semimartingales, and each f_i satisfies (4.3). Let W_t be a solution of

W_t = W_0 + \sum_{i=1}^m \int_0^t f_i(s, W_{s-})\,dY_s^i.

Then \eta_t^n \to W_t in probability.

Proof. We only treat the case m = 1. Suppose first that Y is a VS semimartingale; that is, Y_t = Y_0 + M_t + B_t, with M locally square integrable, and B previsible and locally of square integrable total variation (see [17, 18]). Then it follows from the proof of Theorem (3.1) in [18] that \eta_t^n \to W_t in probability.

If Y has bounded jumps it is special (cf. [16]). Let Y_t = Y_0 + M_t + B_t be its canonical decomposition. Select a t and \varepsilon > 0, and choose \mu so large that P\{S \le t\} < \varepsilon, where

S = \inf\{s: |B|_s \ge \mu\}.

Then S is previsible. Let (S_n) announce S, and choose \nu so large that P\{T < t\} < \varepsilon, where

T = S_n \wedge \inf\{s: |M_s| \ge \nu\}.

Then Y^T is very special. We have

P\{|\eta_t^n - W_t| > \delta\} \le P(\{|\eta_{t \wedge T}^n - W_{t \wedge T}| > \delta\} \cap \{T \ge t\}) + P(T < t)
\le P\{|\eta_{t \wedge T}^n - W_{t \wedge T}| > \delta\} + \varepsilon.

Thus \eta_t^n \to W_t in probability when Y has bounded jumps.

For arbitrary Y, let R = \inf\{s: |\Delta Y_s| \ge \lambda\}, where \lambda is chosen so large that P\{R < t\} < \varepsilon. Let V_s = Y_s^{R-} = Y_s 1_{\{s < R\}} + Y_{R-} 1_{\{s \ge R\}}, so that V is a semimartingale with bounded jumps. Let \gamma_t^0 = W_0, and

\gamma_t^{n+1} = W_0 + \int_0^t f(s, \gamma_{s-}^n)\,dV_s,

U_t = W_0 + \int_0^t f(s, U_{s-})\,dV_s.

It is easy to check that \gamma_t^n = \eta_t^n and U_t = W_t on [[0, R)). We have

P\{|\eta_t^n - W_t| > \delta\} \le P(\{|\gamma_t^n - U_t| > \delta\} \cap \{R \ge t\}) + P(R < t)

and \lim_n P\{|\gamma_t^n - U_t| > \delta\} = 0, since V = Y^{R-} has bounded jumps. This completes the proof of the lemma.
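As an aside, the convergence \eta_t^n \to W_t asserted in Lemma (4.5) can be watched numerically: fix one discretized driving path Y and iterate \eta^{n+1}_t = W_0 + \int_0^t f(\eta^n_{s-})\,dY_s. The coefficient and the parameters in the sketch below are hypothetical, and the discrete scheme is only a caricature of the stochastic integral.

import numpy as np

# Picard-type iteration eta^{n+1}_t = W_0 + int_0^t f(eta^n_{s-}) dY_s on one fixed
# discretized semimartingale path Y, illustrating Lemma (4.5): successive iterates
# approach the (discrete) solution.  All concrete choices are illustrative.

rng = np.random.default_rng(2)

def f(x):
    return np.cos(x)                        # Lipschitz with constant 1

T, n = 1.0, 1000
dt = T / n
dY = rng.normal(0.0, np.sqrt(dt), n) + 0.1 * dt   # increments of one driving path

w0 = 0.5
eta = np.full(n + 1, w0)                    # eta^0 is identically W_0
for it in range(8):
    new = np.empty(n + 1)
    new[0] = w0
    new[1:] = w0 + np.cumsum(f(eta[:-1]) * dY)     # integrate f(eta^n_{s-}) against dY
    print(it, np.abs(new - eta).max())             # sup distance between consecutive iterates
    eta = new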
(4.7) Theorem. Let f, g satisfy condition (4.3). Then there is a version of the solution X_t^{x,z} in (4.4) which does not depend on P^{x,z}.

Proof. Doléans-Dade [8] has shown that if previsible H is locally bounded and Y is a complete semimartingale, then there exists a version of the stochastic integral \int_0^t H_s\,dY_s which does not depend on P^{x,z}. Induction shows that the (\eta_t^n)_{t \ge 0} do not depend on (x, z). Moreover \eta_t^n \to X_t^\mu in P^\mu-probability for every P^\mu. We now use the technique of Mokobodzki ("rapid" filters [5, p. 45]). Assuming the continuum hypothesis, Mokobodzki has shown the existence of a rapid filter on \mathbb{N} such that if one denotes

\tilde{X}_t = \liminf \eta_t^n

(the lim inf being taken along the filter), then for each P^\mu, \tilde{X}_t \in \mathscr{G}_t^\mu and \tilde{X}_t = X_t^\mu a.s. (P^\mu). Thus \tilde{X}_t \in \mathscr{G}_t. We define \tilde{X}_s as above for rational s and let

X_t = \lim \tilde{X}_s

for s \in \mathbb{Q}, s \downarrow t. Since X_t^\mu has right continuous paths, X_t and X_t^\mu are P^\mu indistinguishable for each \mu. Since for each n, \eta_t^n does not depend on (x, z), we deduce X_t does not depend on (x, z). This completes the proof.
The next theorem makes use of Meyer's result on the local character of stochastic integrals [16, p. 308] to reveal an intuitively pleasing dependence of the solution on the (random) initial value.

(4.8) Theorem. Let (\Omega, \mathscr{F}, P) be a complete probability space with a filtration (\mathscr{F}_t) satisfying the "usual conditions" [5, p. 183]. Let Y^i, 1 \le i \le k, be semimartingales, Y_0^i = 0, and H, K be finite \mathscr{F}_0-measurable random variables. Let V, W respectively solve

V_t = H + \sum_{i=1}^k \int_0^t f_i(s, V_{s-})\,dY_s^i,    W_t = K + \sum_{i=1}^k \int_0^t f_i(s, W_{s-})\,dY_s^i,

where the f_i satisfy conditions (4.3), 1 \le i \le k. Let \Lambda = \{H = K\}. Then a.s. on \Lambda, t \to V_t(\omega) and t \to W_t(\omega) agree.

Proof. Let \eta_t^0 = H, \mu_t^0 = K, and

\eta_t^{n+1} = H + \sum_{i=1}^k \int_0^t f_i(s, \eta_{s-}^n)\,dY_s^i,    \mu_t^{n+1} = K + \sum_{i=1}^k \int_0^t f_i(s, \mu_{s-}^n)\,dY_s^i.

By the local character of the stochastic integral [16, p. 308] and a standard induction argument, t \to \eta_t^n agrees with t \to \mu_t^n a.s. on \Lambda. Since \eta_t^n and \mu_t^n tend respectively in probability to V_t and W_t by Lemma (4.5), we have V_t = W_t a.s. on \Lambda. Since V_t and W_t have right continuous paths, the result follows.
We are now in a position to record some trivial but useful relationships among the measures P^z and P^{x,z}. Let Z be the Hunt process and X_0 the random variable described at the beginning of this section. Let X_t^x and X_t respectively solve

(4.9)    X_t^x = x + \int_0^t f(s, X_{s-}^x)\,dZ_s + \int_0^t g(s, X_{s-}^x)\,dA_s,

(4.10)   X_t = X_0 + \int_0^t f(s, X_{s-})\,dZ_s + \int_0^t g(s, X_{s-})\,dA_s,

where f and g satisfy (4.3), Z is a Hunt process and a complete semimartingale, and A is an additive functional of Z.

(4.11) Proposition. Let (X_t^x) and (X_t) be as in (4.9) and (4.10). Let H \in b\mathscr{F}, and \tilde{H}(\omega) = H(\omega_1) 1_{\mathbb{R}}(x), where \omega = (x, \omega_1). Then

(i) X_t and X_t^x are P^{x,z} indistinguishable for all z.

(ii) For any f \in b(\mathscr{B} \otimes \mathscr{F}), E^{x,z}[f(X_t, \tilde{H})] = E^z[f(X_t^x, H)].

(iii) E^{x,z}[\tilde{H} \mid \mathscr{G}_t] = E^z[H \mid \mathscr{F}_t]\, 1_{\mathbb{R}}(x).

Proof. Part (i) is an application of Theorem (4.8). (No problems are caused by the lack of completeness of each \mathscr{G}_t; results are to be interpreted as "up to evanescence".) Part (ii) follows from part (i) and a monotone class argument, and part (iii) is clear, given part (ii).
5. Markov Solutions
A diffusion D_t can be defined as a strong Markov process with continuous paths. If one requires conditions on the conditioned increments so that they are approximately Gaussian, then one can express D_t as the solution of an Itô type stochastic differential equation

(5.1)    D_t = D_0 + \int_0^t f(s, D_s)\,dW_s + \int_0^t g(s, D_s)\,ds

where W_t is the Wiener process. (See Gihman and Skorohod [10, p. 70].)

If one considers a model in which the continuity of the paths is not essential, one can consider Markov processes other than Brownian motion, and random measures, as differentials. Let f, g satisfy conditions (4.3), Z be (say) a Hunt process which is a complete semimartingale relative to P^{x,z} (see Sections 3 and 4 for definitions), and A_t an additive functional of Z. Let X_0 be as in (4.2), and let X_t solve

(5.2)    X_t = X_0 + \int_0^t f(s, X_{s-})\,dZ_s + \int_0^t g(s, X_{s-})\,dA_s.

Then one might hope that X_t would be a Markov process. This is not true in general, as simple examples show. (Use a Markov chain, so that X_t becomes the solution of a difference equation, and extend to continuous time.)
Processes with independent increments need not be semimartingales. Indeed, as is pointed out in [16, p. 298], if Z_t has independent increments and

Y_t = Z_t - \sum_{s \le t} \Delta Z_s 1_{\{|\Delta Z_s| \ge 1\}},

then Z is a semimartingale if and only if the function t \to E[Y_t] is of bounded variation. Using the notation established at the beginning of Section 4, we obtain the following extension of Itô's classical result:

(5.3) Theorem. Let Z have independent increments, Z_0 = 0, and be a semimartingale. Let f and g satisfy conditions (4.3). Let X_0 be as in (4.2) and let X_t be a solution of

(5.4)    X_t = X_0 + \int_0^t f(s, X_{s-})\,dZ_s + \int_0^t g(s, X_s)\,ds.

Then X_t is a strong Markov process.

If Z is a Lévy process and f and g are autonomous (i.e., f(t, x) = f(x), g(t, x) = g(x)), then X_t is a (time-homogeneous) strong Markov process, with its transition semigroup given by

P_t h(x) = E^{x,0}[h(X_t)].
Proof. Let T be a stopping time, and \mathscr{H}^T = \sigma\{Z_{T+u} - Z_T; u \ge 0\}. Then \mathscr{H}^T is a \sigma-algebra on \Omega_1 and \mathscr{H}^T is independent of \mathscr{F}_T under P^0 (cf. [3]). Let \eta^0(x, s) = x; define X(x, t, s) and for s > t inductively define \eta^n(x, s) by

(5.5)    X(x, t, s) = x + \int_t^s f(u, X(x, t, u-))\,dZ_u + \int_t^s g(u, X(x, t, u-))\,du,

         \eta^{n+1}(x, s) = x + \int_t^s f(u, \eta^n(x, u-))\,dZ_u + \int_t^s g(u, \eta^n(x, u-))\,du.

Since \eta^n is a semimartingale it has càdlàg paths; hence (as is easy to check)

\eta^{n+1}(x, s) = \lim \Big[\sum_{u_i} f(u_i, \eta^n(x, u_i))(Z_{u_{i+1}} - Z_{u_i}) + \sum_{u_i} g(u_i, \eta^n(x, u_i))(u_{i+1} - u_i)\Big],

where the convergence is in P^0-probability and the limit is taken as mesh(\pi^m) \to 0, where the \pi^m = \{u_i\} are partitions of (t, s]. An inductive argument shows \eta^n \in \mathscr{H}^t, and Lemma (4.5) shows X(x, t, s) \in \mathscr{H}^t. By the uniqueness of the solutions (see [9, 18]), one can show X_S = X(X(x, 0, T), T, S) for stopping times S, T with S \ge T. If X_t is the solution of (5.4), we write X_t = X(X_0, 0, t) and also X_t^x = X(x, 0, t). By the independence of \mathscr{F}_T and \mathscr{H}^T and using Proposition (4.11), we have for any h \in b\mathscr{B} and stopping times S \ge T,

(5.6)    E^{x,0}[h(X_S) \mid \mathscr{G}_T] = E^0[h(X(X_T^x, T, S)) \mid \mathscr{F}_T]\, 1_{\mathbb{R}}(x) = j(X_T^x)\, 1_{\mathbb{R}}(x),

where j(y) = E^0[h(X(y, T, S))] = E^{y,0}[h(X(X_0, T, S))], and the last equality above is a consequence of an elementary lemma in Gihman and Skorohod [10, p. 67]. We finally observe that under P^{x,0} we have

(5.7)    j(X_T^x)\, 1_{\mathbb{R}}(x) = j(X_T).

Suppose now that f and g are autonomous, Z is a Lévy process, and X_t is a solution of (5.4). It is well known that for a Lévy process Z, the process Z_{T+s} - Z_T is identical in law to Z_s (cf. [3]). It is then easy to check that X(x, T, T+u), u \ge 0, is independent of \mathscr{F}_T and is identical in law (under P^0) to X_u^x, u \ge 0. By (5.6) and (5.7) we have

E^{x,0}[h(X_S) \mid \mathscr{G}_T] = j(X_T),

but in this case we have

j(y) = E^0[h(X(y, T, S))] = E^0[h(X_{S-T}^y)] = E^{y,0}[h(X_{S-T})]

where the second equality above is due to the identification in law of X_u^x and X(x, T, T+u). This completes the proof of Theorem (5.3).
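For illustration only: when f and g are autonomous and Z is a Lévy process, the transition operator P_t h(x) = E^{x,0}[h(X_t)] of Theorem (5.3) can be estimated by Monte Carlo, resimulating the equation from the starting point x. The coefficients, the test function h, and the driving Lévy process in the sketch below are all hypothetical choices.

import numpy as np

# Monte Carlo estimate of P_t h(x) = E^{x,0}[h(X_t)] for the autonomous equation
# X_t = x + int_0^t f(X_{s-}) dZ_s + int_0^t g(X_s) ds with a Levy driver Z
# (Brownian motion plus compound Poisson).  Everything concrete is a placeholder.

rng = np.random.default_rng(3)

def f(x):  return 1.0 / (1.0 + x * x)
def g(x):  return -0.5 * x
def h(x):  return np.exp(-x * x)

def sample_X(x, t, n=200):
    dt = t / n
    for _ in range(n):
        dZ = rng.normal(0.0, np.sqrt(dt)) + \
             rng.normal(0.0, 0.3, rng.poisson(2.0 * dt)).sum()
        x = x + f(x) * dZ + g(x) * dt
    return x

def P(t, x, m=1000):
    return np.mean([h(sample_X(x, t)) for _ in range(m)])

print(P(0.5, 1.0))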
In Theorem (5.3) we assumed the differential Z had independent increments
and were able to conclude the solution X of (5.4) was a strong Markov process.
If we weaken the conditions on Z so that it is merely a strong Markov process,
the solution need not be Markov. However, the vector process (X, Z) is a strong
Markov process.
(5.8) Theorem. Let Z = (\Omega, \mathscr{M}, \mathscr{M}_t, Z_t, P) be a (strong) Markov process and a semimartingale. Let f and g satisfy conditions (4.3) and let X_t be a solution of

X_t = X_0 + \int_0^t f(s, X_{s-})\,dZ_s + \int_0^t g(s, X_{s-})\,ds,

where X_0 \in \mathscr{M}_0. Then the vector process (X, Z) is (strong) Markov for (\Omega, \mathscr{M}, \mathscr{M}_t, P).

Proof. Let X(x, t, s) and \eta^{n+1}(x, s) be as given in (5.5). Then the results of Doléans [6] and an inductive argument show that (x, t, \omega) \to \eta^n(x, t, \omega) is jointly measurable for each n. Since \eta^n(x, t, \omega) \to X(x, t, \omega) in P-probability for each x by Lemma (4.5), -K \vee (\eta^n \wedge K) converges in \sigma(L^1, L^\infty) to -K \vee (X \wedge K) for each K. An application of Doob's lemma [15, p. 154] yields that (x, t, \omega) \to X(x, t, \omega) is jointly measurable. Indeed, this yields X(x, t, \omega) \in \mathscr{B} \otimes \mathscr{H}^t, where \mathscr{B} is the Borel sets on \mathbb{R} and \mathscr{H}^t = \sigma\{Z_{t+u} - Z_t; u \ge 0\}. By the uniqueness of the solutions, one easily checks that for stopping times S \ge T, X_S = X(X_T, T, S). Let h \in b\mathscr{B} and K \in b\mathscr{H}^t. Then

E\{h(X_t) K \mid \mathscr{M}_t\} = h(X_t) E\{K \mid \mathscr{M}_t\} = h(X_t) E\{K \mid Z_t\} = j(X_t, Z_t).

Therefore E\{h(X_t) K \mid \mathscr{M}_t\} = E\{h(X_t) K \mid X_t, Z_t\}. If Z is assumed to be strong Markov, the preceding holds for stopping times S, T. The theorem now follows by an application of the monotone class theorem.
We now state our main result. Observe that time changed Lévy processes such as those described in Example (3.12) satisfy the conditions imposed on the process Z in the following theorem.

(5.9) Theorem. Let Z be a Hunt process and a universally reducible semimartingale. Let A be an additive functional of Z. Let autonomous f and g satisfy conditions (4.3), X_0 be as given in (4.2), and let X_t be the solution of

(5.10)    X_t = X_0 + \int_0^t f(X_{s-})\,dZ_s + \int_0^t g(X_{s-})\,dA_s.

Then the vector process (X, Z) is strong Markov, with transition semigroup P_t h(x, z) = E^{x,z}[h(X_t, Z_t)].
Before proving this result, we establish some notation and a lemma. For fixed u, let \hat{M}_t = M_t \circ \theta_u for a process M. Let \hat{\mathscr{F}}_t = \theta_u^{-1}(\mathscr{F}_t). Following Meyer, we let C \cdot Y denote the stochastic integral \int_0^t C_s\,dY_s for a semimartingale Y. The following lemma is used in the proof of Theorem (5.9).
(5.11) Lemma. Let Y be a universally reducible semimartingale. Let C be a previsible integrand which is universally locally bounded. Then \hat{C} \cdot \hat{Y} = \widehat{C \cdot Y}, for any fixed u.

Proof. Let Y_t = Y_0 + M_t + B_t be a universal decomposition and (T^n) stopping times tending to \infty such that M^{T^n} is a P^z martingale for each n. Implicitly stopping Y at T^n for some fixed n, by Lemma (3.6) we can write

(5.12)    M = N + B

where N is a (universally) locally bounded martingale, B \in \mathscr{V}, and N_0 = B_0 = 0. Let G \in \hat{\mathscr{F}}_s, where G = H \circ \theta_u, H \in \mathscr{F}_s. By stopping N if necessary we assume without loss of generality that N is bounded. Then

E^z[(\hat{N}_t - \hat{N}_s) G] = E^z[E^{Z_u}[(N_t - N_s) H]] = 0,

consequently \hat{N} is an \hat{\mathscr{F}}_t martingale. If M is a square integrable \mathscr{F}_t martingale we have

E^z[(\hat{M}_t \hat{N}_t - \widehat{\langle M, N\rangle}_t) G] = E^z[E^{Z_u}[(M_t N_t - \langle M, N\rangle_t) H]] = E^z[E^{Z_u}[(M_s N_s - \langle M, N\rangle_s) H]] = E^z[(\hat{M}_s \hat{N}_s - \widehat{\langle M, N\rangle}_s) G],

and if \widehat{\langle M, N\rangle}_t is \hat{\mathscr{F}}_t-previsible, by the uniqueness of \langle\cdot,\cdot\rangle we can conclude

(5.13)    \langle \hat{M}, \hat{N}\rangle_t = \widehat{\langle M, N\rangle}_t.
Let \mathscr{P}(\mathscr{J}_t) denote the previsible \sigma-algebra for a filtration (\mathscr{J}_t). Let \mathscr{H} = \{Y \in b\mathscr{P}(\mathscr{F}_t): \hat{Y} \in b\mathscr{P}(\hat{\mathscr{F}}_t)\}. Then \mathscr{H} clearly contains the left-continuous and \mathscr{F}_t-adapted processes, and therefore a monotone class argument shows that shifting preserves previsibility. For a process B \in \mathscr{V}, the statement \hat{C} \cdot \hat{B} = \widehat{C \cdot B} is merely notation. For N locally bounded, using (5.13) we have

(5.14)    \langle \hat{C} \cdot \hat{N} - \widehat{C \cdot N},\ \hat{C} \cdot \hat{N} - \widehat{C \cdot N}\rangle
        = (\hat{C})^2 \cdot \langle \hat{N}, \hat{N}\rangle - 2\hat{C} \cdot \langle \hat{N}, \widehat{C \cdot N}\rangle + \langle \widehat{C \cdot N}, \widehat{C \cdot N}\rangle
        = (\hat{C})^2 \cdot \widehat{\langle N, N\rangle} - 2\hat{C} \cdot \hat{C} \cdot \widehat{\langle N, N\rangle} + \widehat{(C^2) \cdot \langle N, N\rangle}
        = 0.

Since \hat{C} \cdot \hat{N}_0 - \widehat{C \cdot N}_0 = 0, (5.14) implies that \hat{C} \cdot \hat{N} = \widehat{C \cdot N}. Using the decomposition (5.12) we have

\widehat{C \cdot M} = \widehat{C \cdot N} + \widehat{C \cdot B} = \hat{C} \cdot \hat{N} + \hat{C} \cdot \hat{B} = \hat{C} \cdot \hat{M},

and the lemma is proved.
Proof of Theorem (5.9). We define X(x, t, s) and inductively define \mu^n(x, t, s) by \mu^0(x, t, s) = x and, for s > t,

(5.15)    \mu^{n+1}(x, t, s) = x + \int_t^s f(\mu^n(x, t, u-))\,dZ_u + \int_t^s g(\mu^n(x, t, u-))\,dA_u,

          X(x, t, s) = x + \int_t^s f(X(x, t, u-))\,dZ_u + \int_t^s g(X(x, t, u-))\,dA_u.

We also write X(x, t) for X(x, 0, t) and \mu^n(x, t) for \mu^n(x, 0, t). Observe that

\mu^1(x, t, s) = x + f(x)(Z_s - Z_t) + g(x)(A_s - A_t)
             = (x + f(x)(Z_{s-t} - Z_0) + g(x)(A_{s-t} - A_0)) \circ \theta_t
             = \mu^1(x, s - t) \circ \theta_t.

Assume \mu^n(x, t, s) = \mu^n(x, s - t) \circ \theta_t. Then

\mu^{n+1}(x, t, s) = x + \int_t^s f(\hat{\mu}^n(x, (u - t)-))\,d\hat{Z}_{u-t} + \int_t^s g(\hat{\mu}^n(x, (u - t)-))\,d\hat{A}_{u-t}
                  = x + \int_0^{s-t} f(\hat{\mu}^n(x, u-))\,d\hat{Z}_u + \int_0^{s-t} g(\hat{\mu}^n(x, u-))\,d\hat{A}_u
                  = x + \Big[\int_0^{s-t} f(\mu^n(x, u-))\,dZ_u + \int_0^{s-t} g(\mu^n(x, u-))\,dA_u\Big] \circ \theta_t,

where the last equality uses Lemma (5.11). Induction shows then that for all n

(5.16)    \mu^n(x, t, s) = \mu^n(x, s - t) \circ \theta_t.
We next establish the equality

(5.17)    E^z\{h(X(x, t, s), Z_s) \mid \mathscr{F}_t\} = E^{Z_t}\{h(X(x, s - t), Z_{s-t})\}

for h \in b(\mathscr{B} \otimes \mathscr{B}). First, consider h of the form h(x, y) = h_1(x) h_2(y), with h_i continuous with compact support. By Lemma (4.5) and the uniform continuity of h_1, h_1(\mu^n(x, t, s)) \to h_1(X(x, t, s)) in the mean. Using (5.16), we have

E^z\{h_1(X(x, t, s)) h_2(Z_s) \mid \mathscr{F}_t\} = \lim_{n \to \infty} E^z\{h_1(\mu^n(x, t, s)) h_2(Z_s) \mid \mathscr{F}_t\}
    = \lim_{n \to \infty} E^z\{[h_1(\mu^n(x, s - t)) h_2(Z_{s-t})] \circ \theta_t \mid \mathscr{F}_t\}
    = \lim_{n \to \infty} E^{Z_t}\{h_1(\mu^n(x, s - t)) h_2(Z_{s-t})\}
    = E^{Z_t}\{h_1(X(x, s - t)) h_2(Z_{s-t})\}.

A monotone class argument now yields (5.17). Note that (5.17) also holds for stopping times S \ge T.

Let (X_t) be as given in (5.10), and fix a measure P^{x,z}. Let X_t^x denote the solution of

(5.18)    X_t^x = x + \int_0^t f(X_{s-}^x)\,dZ_s + \int_0^t g(X_{s-}^x)\,dA_s
for the law P^z on \Omega_1. Let h \in b(\mathscr{B} \otimes \mathscr{B}), F \in b\mathscr{F}_t, and k \in b\mathscr{B}. Using Proposition (4.11) we have

(5.19)    E^{x,z}[h(X_s, Z_s) F k(X_0)] = E^z[h(X_s^x, Z_s) F] k(x) = E^z[h(X(X_t^x, t, s), Z_s) F] k(x)

by the uniqueness of the solutions. As was shown in the proof of Theorem (5.8), X_t^x is jointly measurable in (x, t, \omega). A monotone class argument then yields

(5.20)    E^z[h(X(X_t^x, t, s), Z_s) F] k(x)
        = E^z[E^z[h(X(X_t^x, t, s), Z_s) \mid \mathscr{F}_t] F] k(x)
        = E^z[E^z[h(X(y, t, s), Z_s) \mid \mathscr{F}_t]\big|_{y = X_t^x} F] k(x)
        = E^z[E^{X_t^x, z}[h(X(X_0, t, s), Z_s) \mid \mathscr{G}_t] F] k(x)
        = E^{x,z}[E^{X_t, z}[h(X(X_0, t, s), Z_s) \mid \mathscr{G}_t] F k(X_0)].

Together (5.19) and (5.20) establish that

(5.21)    E^{x,z}[h(X_s, Z_s) \mid \mathscr{G}_t] = E^{X_t, z}[h(X(X_0, t, s), Z_s) \mid \mathscr{G}_t].

Let

(5.22)    j(y) = E^{y,z}[h(X(X_0, t, s), Z_s) \mid \mathscr{G}_t].

Then j(y) is also a version of E^z[h(X(y, t, s), Z_s) \mid \mathscr{F}_t], and so

(5.23)    j(y) = E^{Z_t}[h(X(y, s - t), Z_{s-t})] = E^{y, Z_t}[h(X_{s-t}, Z_{s-t})],
where we have used (5.17) and (4.11). Combining (5.21), (5.22), and (5.23) yields

E^{x,z}[h(X_s, Z_s) \mid \mathscr{G}_t] = E^{X_t, Z_t}[h(X_{s-t}, Z_{s-t})].

To show that (X, Z) is strong Markov it suffices to show E^{x,z}[h(X_{T+s}, Z_{T+s}) \mid \mathscr{G}_T] = E^{x,z}[h(X_{T+s}, Z_{T+s}) \mid X_T, Z_T] for any stopping time T, and s > 0. The proof of (5.17) is valid for stopping times. For h \in b(\mathscr{B} \otimes \mathscr{B}) we have

(5.24)    E^{x,z}[h(X_{T+s}, Z_{T+s}) \mid \mathscr{G}_T] = E^z\{h(X_{T+s}^x, Z_{T+s}) \mid \mathscr{F}_T\} = E^z\{h(X(X_T^x, T, T+s), Z_{T+s}) \mid \mathscr{F}_T\}.

For a fixed P^z we know that X(x, T, T+s, \omega), the solution relative to (\Omega_1, \mathscr{F}, \mathscr{F}_t, P^z), is jointly measurable; it suffices to observe that for h_1 \in b\mathscr{B} and h_2 \in b(\mathscr{B} \otimes \mathscr{B}) we have

E^z\{h_1(X_T^x) h_2(X(y, T, T+s), Z_{T+s}) \mid \mathscr{F}_T\} = h_1(X_T^x) E^{Z_T}\{h_2(X(y, s), Z_s)\} = j(s, X_T, Z_T)

a.s. P^{x,z}. This completes the proof of Theorem (5.9).
6. The Lévy System
In this section we assume Z = (\Omega_1, \mathscr{F}, \mathscr{F}_t, Z_t, \theta_t, P^z) is a Hunt process and a universally reducible semimartingale (see Section 3 for definitions). We will reserve A_t to denote a quasi-left-continuous additive functional of Z. (That is, for any sequence (T_n) of stopping times increasing to T, A_{T_n} \to A_T a.s.)

Let f, g: \mathbb{R} \to \mathbb{R} satisfy conditions (4.3), let X_0 be as given in (4.2), and let X_t be the solution of

(6.1)    X_t = X_0 + \int_0^t f(X_{s-})\,dZ_s + \int_0^t g(X_{s-})\,dA_s.

By Theorem (5.9), the process (X, Z) is a time-homogeneous strong Markov process with semigroup P_t h(x, z) = E^{x,z}[h(X_t, Z_t)], where the measures P^{x,z} are as given in (4.1). Since (A_t) is quasi-left-continuous, X_t and hence (X, Z) are also quasi-left-continuous.
As we have seen in Section 4, we can define \Omega = \mathbb{R} \times \Omega_1, \mathscr{G}_t = \bigcap_\nu \mathscr{G}_t^\nu, etc. Let \omega \in \Omega, where \omega = (x, \omega_1), with x \in \mathbb{R} and \omega_1 \in \Omega_1. For each t \in \mathbb{R}_+ we define

(6.2)    \tilde{\theta}_t(\omega) = (X_t(\omega), \theta_t(\omega_1)).

One easily checks that for H \in b\mathscr{G},

(6.3)    E^{x,z}[H \circ \tilde{\theta}_u] = E^{X_u, Z_u}[H].

Equation (6.3) allows the generalization of Lemma (5.11), which in turn gives that for each fixed s the processes

X_{t+s}    and    X_s \circ \tilde{\theta}_t

are indistinguishable as processes in t (the null sets depend on s). A "perfection argument" in the style of Walsh [20] shows that there exists a process \tilde{X} which is indistinguishable from X and, for \omega not in \Gamma (with P^\mu(\Gamma) = 0 for every probability \mu on \Omega),

\tilde{X}_{t+s}(\omega) = \tilde{X}_s \circ \tilde{\theta}_t(\omega).

We define \mathscr{H}_t^0 = \sigma((\tilde{X}_s, Z_s): s \le t), \mathscr{H}^0 = \bigvee_t \mathscr{H}_t^0, and \mathscr{H}_t = \bigcap_{\mu\ \mathrm{finite}} \mathscr{H}_t^\mu, where \mathscr{H}_t^\mu denotes the completion of \mathscr{H}_t^0 under P^\mu, with \mu a probability on \mathscr{B}^2, the Borel sets of \mathbb{R}^2. We conclude that (X, Z) = (\Omega, \mathscr{H}, \mathscr{H}_t, (\tilde{X}_t, Z_t), \tilde{\theta}_t, P^{x,z}) is a Hunt process. For the rest of this section we will write (X_t) for (\tilde{X}_t).
Cinlar [4] and Jacod [11, 12] have considered processes which are similar in structure to the process (X, Z). Indeed, the process (X, Z) is a Markov Additive Process in the sense of Cinlar [4, p. 86]. To see this, let F, G \in \mathscr{B}, the Borel sets of \mathbb{R}. Let Q_t(z, F \times G) = E^{0,z}[1_F(X_t) 1_G(Z_t)]. Let X(x, t) = X(x, 0, t) be the solution of (6.1) starting at x (which is defined rigorously by (5.15)); one easily checks that X(0, t) is P^z indistinguishable from X(x, t) - x. Thus Q_t(z; (F - x) \times G) = P_t(x, z; F \times G). One can also easily check that the process (X, Z) is a Semi-direct Markov Process Product in the sense of Jacod [11, 12].

A pair (K, H) is said to be a Lévy system of the Hunt process Z if K(z; dz') is a kernel on \mathbb{R} \times \mathbb{R} such that K(z, \{z\}) = 0 for every z \in \mathbb{R}, and H is a continuous additive functional of Z such that for any nonnegative Borel function F on \mathbb{R} \times \mathbb{R} we have

E^z\Big[\sum_{s \le t} F(Z_{s-}, Z_s) 1_{\{Z_{s-} \ne Z_s\}}\Big] = E^z\Big[\int_0^t dH_s \int K(Z_s, dz')\,F(Z_s, z')\Big].

Every Hunt process has a Lévy system [1]. Jacod [12] has related a Lévy system of a semi-direct Markov process product (Y, Z) to a Lévy system of the (say) Hunt process Z. In our situation we can obtain a more explicit relationship by expressing a Lévy system of (X, Z) in terms of the coefficients f, g; the jumps of (A_t); and a Lévy system of Z.
Let B_t = \sum_{s \le t} \Delta A_s, where A_t is the quasi-left-continuous additive functional of (6.1). Then Motoo's theorem (see, e.g., [1]) states that there exists a Borel function h on \mathbb{R} \times \mathbb{R} such that B_t is equivalent to the AF

(6.4)    B_t = B_t' = \sum_{s \le t,\ s \in J} h(Z_{s-}, Z_s)

where equality means up to indistinguishability and J = \{(s, \omega): Z_{s-}(\omega) \ne Z_s(\omega)\}.

(6.5) Theorem. Let Z, X, and A be as given at the beginning of this section, and let h be as given in (6.4). Let (K, H) be a Lévy system for Z. Then (N, H) is a Lévy system for (X, Z), where N(x, z; dx' \times dz') = K(z; dz')\,\varepsilon_{k(x,z,z')}(dx') and k(x, z, z') = x + f(x)(z' - z) + g(x) h(z, z'), and \varepsilon_a denotes point mass at a.
Proof. Recall that \tilde{\theta}_t as given in (6.2) is the shift for (X, Z). Extend H, the AF of Z, to \Omega by H_t(x, \omega_1) = H_t(\omega_1). Then H is also an AF of (X, Z).

Let Y_s be a nonnegative previsible process and let F be nonnegative Borel on \mathbb{R}^2. It is well known that \int_0^t dH_s \int K(Z_s, dz')\,F(Z_s, z') is the dual previsible projection of \sum_{s \le t,\ s \in J} F(Z_{s-}, Z_s). Thus

E^{x,z}\Big\{\sum_{s \le t,\ s \in J} Y_s F(Z_{s-}, Z_s)\Big\} = E^{x,z}\Big\{\int_0^t Y_s \int K(Z_s, dz')\,F(Z_s, z')\,dH_s\Big\}.

Let Y_s = W_s G(X_{s-}) where W_s is nonnegative previsible, and G \in \mathscr{B}_+. A monotone class argument then yields

(6.6)    E^{x,z}\Big\{\sum_{s \le t,\ s \in J} W_s F(X_{s-}, Z_{s-}, Z_s)\Big\} = E^{x,z}\Big\{\int_0^t W_s \int K(Z_s, dz')\,F(X_s, Z_s, z')\,dH_s\Big\}

for nonnegative Borel F on \mathbb{R}^3. From properties of the stochastic integral [16, p. 300] we have (assuming h vanishes on the diagonal of \mathbb{R}^2) that on J

(6.7)    X_t = X_{t-} + f(X_{t-}) \Delta Z_t + g(X_{t-}) \Delta A_t
            = X_{t-} + f(X_{t-}) \Delta Z_t + g(X_{t-}) h(Z_{t-}, Z_t)
            = k(X_{t-}, Z_{t-}, Z_t).

Let nonnegative Borel F be defined on \mathbb{R}^4. Equations (6.6) and (6.7) imply

E^{x,z}\Big\{\sum_{s \le t,\ s \in J} W_s F(X_{s-}, X_s, Z_{s-}, Z_s)\Big\}
    = E^{x,z}\Big\{\int_0^t W_s \int K(Z_s, dz')\,F(X_s, k(X_{s-}, Z_s, z'), Z_s, z')\,dH_s\Big\}
    = E^{x,z}\Big\{\int_0^t W_s \int N(X_s, Z_s; dx'\,dz')\,F(X_s, x', Z_s, z')\,dH_s\Big\}.

This completes the proof.
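As a closing illustration outside the argument, the kernel N of Theorem (6.5) acts through the deterministic jump map k(x, z, z') = x + f(x)(z' - z) + g(x) h(z, z'): given a jump of Z from z to z', the pair (X, Z) jumps to (k(x, z, z'), z'). The sketch below simply treats f, g, and h as placeholder functions supplied by the caller.

# Jump map of Theorem (6.5): if Z jumps from z to z', then (X, Z) jumps from
# (x, z) to (k(x, z, z'), z').  f, g stand for the coefficients of (6.1) and h
# for the Motoo density of (6.4); here they are just arguments.

def k(x, z, zp, f, g, h):
    return x + f(x) * (zp - z) + g(x) * h(z, zp)

# Integrating a function F against N(x, z; dx' x dz') = K(z; dz') eps_{k(x,z,z')}(dx')
# when K(z; .) is approximated by a finite discrete kernel [(z'_1, w_1), ...]:
def integrate_N(F, x, z, K_list, f, g, h):
    return sum(w * F(k(x, z, zp, f, g, h), zp) for zp, w in K_list)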
References

1. Benveniste, A., Jacod, J.: Systèmes de Lévy des processus de Markov. Inventiones math. 21, 183-198 (1973)
2. Blumenthal, R.M., Getoor, R.K.: Markov Processes and Potential Theory. New York: Academic Press 1968
3. Bretagnolle, J.L.: Processus à Accroissements Indépendants. Lecture Notes in Math. 307. Berlin-Heidelberg-New York: Springer 1973
4. Cinlar, E.: Markov Additive Processes I. Z. Wahrscheinlichkeitstheorie verw. Gebiete 24, 85-93 (1972)
5. Dellacherie, C., Meyer, P.A.: Probabilités et Potentiel, édition entièrement refondue. Paris: Hermann 1975
6. Doléans, C.: Intégrales Stochastiques Dépendant d'un Paramètre. Publ. Inst. Statist. Univ. Paris 16, 23-35 (1967)
7. Doléans-Dade, C., Meyer, P.A.: Intégrales Stochastiques par rapport aux martingales locales. Lecture Notes in Math. 124, 77-107. Berlin-Heidelberg-New York: Springer 1970
8. Doléans-Dade, C.: Intégrales Stochastiques par rapport à une famille de probabilités. Lecture Notes in Math. 191, 141-146. Berlin-Heidelberg-New York: Springer 1971
9. Doléans-Dade, C.: On the Existence and Unicity of Solutions of Stochastic Integral Equations. Z. Wahrscheinlichkeitstheorie verw. Gebiete 36, 93-101 (1976)
10. Gihman, I.I., Skorohod, A.V.: Stochastic Differential Equations. Berlin-Heidelberg-New York: Springer 1972
11. Jacod, J.: Noyaux Multiplicatifs d'un Processus de Markov. Bull. Soc. Math. France, Mémoire 35, 81-117 (1973)
12. Jacod, J.: Fonctionnelles Additives et Systèmes de Lévy des Produits Semi-directs de Processus de Markov. Bull. Soc. Math. France, Mémoire 35, 119-144 (1973)
13. Kazamaki, N.: Changes of Time, Stochastic Integrals, and Weak Martingales. Z. Wahrscheinlichkeitstheorie verw. Gebiete 22, 25-32 (1972)
14. Kunita, H., Watanabe, S.: On Square Integrable Martingales. Nagoya Math. J. 30, 209-245 (1967)
15. Meyer, P.A.: Probability and Potentials. Toronto: Blaisdell 1966
16. Meyer, P.A.: Un Cours sur les Intégrales Stochastiques. Lecture Notes in Math. 511, 245-400. Berlin-Heidelberg-New York: Springer 1976
17. Protter, P.E.: On the Existence, Uniqueness, Convergence, and Explosions of Solutions of Systems of Stochastic Integral Equations. Ann. Probability 5, 243-261 (1977)
18. Protter, P.E.: Right Continuous Solutions of Systems of Stochastic Integral Equations. J. of Multivariate Anal. 7, 204-214 (1977)
19. Sharpe, M.J.: Homogeneous Extensions of Random Measures. Lecture Notes in Math. 465, 496-514. Berlin-Heidelberg-New York: Springer 1975
20. Walsh, J.B.: The Perfection of Multiplicative Functionals. Lecture Notes in Math. 258, 233-242. Berlin-Heidelberg-New York: Springer 1972

Received February 12, 1977; in revised form May 2, 1977