Stochastic processes Lecture notes

Stochastic Calculus

Alan Bain

1. Introduction

The following notes aim to provide a very informal introduction to Stochastic Calculus, and especially to the Itô integral and some of its applications. They owe a great deal to Dan Crisan's Stochastic Calculus and Applications lectures of 1998; and also much to various books, especially those of L. C. G. Rogers and D. Williams, and Dellacherie and Meyer's multi-volume series 'Probabilités et Potentiel'. They have also benefited from insights gained by attending lectures given by T. Kurtz.

The present notes grew out of a set of typed notes which I produced when revising for the Cambridge Part III course, combining the printed notes and my own handwritten notes into a consistent text. I've subsequently expanded them, inserting some extra proofs from a great variety of sources. The notes principally concentrate on the parts of the course which I found hard; thus there is often little or no comment on more standard matters. As a secondary goal they aim to present the results in a form which can be readily extended. Due to their evolution, they have taken a very informal style; in some ways I hope this may make them easier to read.

The addition of coverage of discontinuous processes was motivated by my interest in the subject, and much insight gained from reading the excellent book of J. Jacod and A. N. Shiryaev.

The goal of the notes in their current form is to present a fairly clear approach to the Itô integral with respect to continuous semimartingales, but without any attempt at maximal detail. The various alternative approaches to this subject which can be found in books tend to divide into those presenting the integral directed entirely at Brownian Motion, and those which wish to prove results in complete generality for a semimartingale. Here clarity has hopefully been the main goal, rather than completeness; although secretly the approach aims to be readily extended to the discontinuous theory.
I make no apology for proofs which spell out every minute detail, since on a first look at the subject the purpose of some of the steps in a proof often seems elusive. I'd especially like to convince the reader that the Itô integral isn't that much harder in concept than the Lebesgue integral with which we are all familiar. The motivating principle is to try and explain every detail, no matter how trivial it may seem once the subject has been understood.

Passages enclosed in boxes are intended to be viewed as digressions from the main text, usually describing an alternative approach or giving an informal description of what is going on; feel free to skip these sections if you find them unhelpful.

In revising these notes I have resisted the temptation to alter the original structure of the development of the Itô integral (although I have corrected unintentional mistakes), since I suspect the more concise proofs which I would favour today would not be helpful on a first approach to the subject.

These notes contain errors with probability one. I always welcome people telling me about the errors, because then I can fix them; I can be readily contacted by email as alanb@chiark.greenend.org.uk. Also suggestions for improvements or other additions are welcome.

Alan Bain

2. Contents

 1. Introduction
 2. Contents
 3. Stochastic Processes
    3.1. Probability Space
    3.2. Stochastic Process
 4. Martingales
    4.1. Stopping Times
 5. Basics
    5.1. Local Martingales
    5.2. Local Martingales which are not Martingales
 6. Total Variation and the Stieltjes Integral
    6.1. Why we need a Stochastic Integral
    6.2. Previsibility
    6.3. Lebesgue-Stieltjes Integral
 7. The Integral
    7.1. Elementary Processes
    7.2. Strictly Simple and Simple Processes
 8. The Stochastic Integral
    8.1. Integral for H ∈ L and M ∈ M_2
    8.2. Quadratic Variation
    8.3. Covariation
    8.4. Extension of the Integral to L^2(M)
    8.5. Localisation
    8.6. Some Important Results
 9. Semimartingales
 10. Relations to Sums
    10.1. The UCP topology
    10.2. Approximation via Riemann Sums
 11. Itô's Formula
    11.1. Applications of Itô's Formula
    11.2. Exponential Martingales
 12. Lévy Characterisation of Brownian Motion
 13. Time Change of Brownian Motion
    13.1. Gaussian Martingales
 14. Girsanov's Theorem
    14.1. Change of Measure
 15. Brownian Martingale Representation Theorem
 16. Stochastic Differential Equations
 17. Relations to Second Order PDEs
    17.1. Infinitesimal Generator
    17.2. The Dirichlet Problem
    17.3. The Cauchy Problem
    17.4. Feynman-Kac Representation
 18. Stochastic Filtering
    18.1. Signal Process
    18.2. Observation Process
    18.3. The Filtering Problem
    18.4. Change of Measure
    18.5. The Unnormalised Conditional Distribution
    18.6. The Zakai Equation
    18.7. Kushner-Stratonovich Equation
 19. Gronwall's Inequality
 20. Kalman Filter
    20.1. Conditional Mean
    20.2. Conditional Covariance
 21. Discontinuous Stochastic Calculus
    21.1. Compensators
    21.2. RCLL processes revisited
 22. References

3. Stochastic Processes

The following notes are a summary of important definitions and results from the theory of stochastic processes; proofs may be found in the usual books, for example Durrett (1996).

3.1. Probability Space

Let (Ω, F, P) be a probability space. The set of P-null subsets of Ω is defined by

  N := {N ⊆ Ω : N ⊆ A for some A ∈ F with P(A) = 0}.

The space (Ω, F, P) is said to be complete if A ⊆ B with B ∈ F and P(B) = 0 implies that A ∈ F.
In addition to the probability space (Ω, F, P), let (E, ℰ) be a measurable space, called the state space, which in many of the cases considered here will be (ℝ, B) or (ℝ^n, B^n). A random variable is an F/ℰ-measurable function X : Ω → E.

3.2. Stochastic Process

Given a probability space (Ω, F, P) and a measurable state space (E, ℰ), a stochastic process is a family (X_t)_{t≥0} such that X_t is an E-valued random variable for each time t ≥ 0. More formally, it is a map X : (ℝ⁺ × Ω, B⁺ ⊗ F) → (ℝ, B), where B⁺ are the Borel sets of the time space ℝ⁺.

Definition 1. Measurable Process
The process (X_t)_{t≥0} is said to be measurable if the mapping (ℝ⁺ × Ω, B⁺ ⊗ F) → (ℝ, B) : (t, ω) ↦ X_t(ω) is measurable on ℝ⁺ × Ω with respect to the product σ-field B(ℝ⁺) ⊗ F.

Associated with a process is a filtration, an increasing chain of σ-algebras, i.e.

  F_s ⊆ F_t  if  0 ≤ s ≤ t < ∞.

Define F_∞ by

  F_∞ := ⋁_{t≥0} F_t = σ( ⋃_{t≥0} F_t ).

If (X_t)_{t≥0} is a stochastic process, then the natural filtration of (X_t)_{t≥0} is given by

  F_t^X := σ(X_s : s ≤ t).

The process (X_t)_{t≥0} is said to be (F_t)_{t≥0} adapted if X_t is F_t-measurable for each t ≥ 0. The process (X_t)_{t≥0} is obviously adapted with respect to its natural filtration.

Definition 2. Progressively Measurable Process
A process is progressively measurable if for each t its restriction to the time interval [0, t] is measurable with respect to B_{[0,t]} ⊗ F_t, where B_{[0,t]} is the Borel σ-algebra of subsets of [0, t].

Why on earth is this useful? Consider a non-continuous stochastic process X_t. From the definition of a stochastic process, for each t we have that X_t ∈ F_t. Now define Y_t = sup_{s∈[0,t]} X_s. Is Y_t a stochastic process? The answer is: not necessarily. σ-fields are only guaranteed closed under countable set operations, and an event such as

  {Y_t > 1} = ⋃_{0≤s≤t} {X_s > 1}

is an uncountable union. If X were progressively measurable then this would be sufficient to imply that Y_t is F_t-measurable.
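As a computational aside (my own illustration, not part of the original notes; all names are arbitrary), the following sketch discretises a Brownian-type path on a finite grid and computes its running supremum Y_t = sup_{s≤t} X_s. On a finite or countable grid the supremum involves only countably many operations, so no measurability problem arises there; the issue in the text is precisely how to reduce the genuine supremum over [0, t] to such a countable one.

```python
import numpy as np

# Sketch: approximate a Brownian path X on a finite dyadic grid and
# compute its running supremum Y.  The grid is countable, so taking the
# supremum is unproblematic; continuity is what justifies using such a
# grid to compute the true supremum over [0, t].
rng = np.random.default_rng(0)
n, T = 1024, 1.0
dt = T / n
increments = rng.normal(0.0, np.sqrt(dt), size=n)
X = np.concatenate([[0.0], np.cumsum(increments)])  # X_0 = 0
Y = np.maximum.accumulate(X)                        # Y_t = sup_{s <= t} X_s

assert np.all(Y >= X)           # Y dominates the path
assert np.all(np.diff(Y) >= 0)  # Y is non-decreasing
```

For a right (or left) continuous process the supremum over [0, t] agrees with the supremum over a countable dense grid, which is exactly the reduction the notes invoke next.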
If X has suitable continuity properties, we can restrict the unions which cause problems to be over some countable dense subset (say the rationals), and this solves the problem. Hence the next theorem.

Theorem 3.3.
Every right (or left) continuous, adapted process is progressively measurable.

Proof
We consider the process X restricted to the time interval [0, s]. On this interval, for each n ∈ ℕ we define

  X^n_1(t) := Σ_{k=0}^{2^n - 1} 1_{(ks/2^n, (k+1)s/2^n]}(t) X_{ks/2^n}(ω),

  X^n_2(t) := 1_{[0, s/2^n)}(t) X_0(ω) + Σ_{k=1}^{2^n} 1_{[ks/2^n, (k+1)s/2^n)}(t) X_{(k+1)s/2^n ∧ s}(ω).

Note that X^n_1 is a left continuous process, so if X is left continuous then, working pointwise (that is, fixing ω), the sequence X^n_1 converges to X.

But the individual summands in the definition of X^n_1 are, by the adaptedness of X, clearly B_{[0,s]} ⊗ F_s measurable, hence X^n_1 is also. The convergence then implies X is also; hence X is progressively measurable.

Consideration of the sequence X^n_2 yields the same result for right continuous, adapted processes. □

The following extra information about filtrations should probably be skipped on a first reading, since it is likely to appear as excess baggage. Define

  for all t ∈ (0, ∞):  F_{t-} := ⋁_{0≤s<t} F_s,
  for all t ∈ [0, ∞):  F_{t+} := ⋂_{s>t} F_s,

whence it is clear that for each t, F_{t-} ⊆ F_t ⊆ F_{t+}.

Definition 3.2.
The family {F_t} is called right continuous if

  for all t ∈ [0, ∞):  F_t = F_{t+}.

Definition 3.3.
A process (X_t)_{t≥0} is said to be bounded if there exists a universal constant K such that for all ω and t ≥ 0, |X_t(ω)| ≤ K.

Definition 3.4.
Let X = (X_t)_{t≥0} be a stochastic process defined on (Ω, F, P), and let X' = (X'_t)_{t≥0} be a stochastic process defined on (Ω', F', P'). Then X and X' have the same finite dimensional distributions if for all n, 0 ≤ t_1 < t_2 < ... < t_n < ∞, and A_1, A_2, ..., A_n ∈ ℰ,

  P(X_{t_1} ∈ A_1, X_{t_2} ∈ A_2, ..., X_{t_n} ∈ A_n) = P'(X'_{t_1} ∈ A_1, X'_{t_2} ∈ A_2, ..., X'_{t_n} ∈ A_n).

Definition 3.5.
Let X and X' be defined on (Ω, F, P).
Then X and X' are modifications of each other if and only if

  P({ω ∈ Ω : X_t(ω) = X'_t(ω)}) = 1  for all t ≥ 0.

Definition 3.6.
Let X and X' be defined on (Ω, F, P). Then X and X' are indistinguishable if and only if

  P({ω ∈ Ω : X_t(ω) = X'_t(ω) for all t ≥ 0}) = 1.

There is a chain of implications:

  indistinguishable ⇒ modifications ⇒ same finite dimensional distributions.

The following definition provides us with a special name for a process which is indistinguishable from the zero process. It will turn out to be important because many definitions can only be made up to evanescence.

Definition 3.7.
A process X is evanescent if P(X_t = 0 for all t) = 1.

4. Martingales

Definition 4.1.
Let X = {X_t, F_t, t ≥ 0} be an integrable process. Then X is a
(i) Martingale if and only if E(X_t | F_s) = X_s a.s. for 0 ≤ s ≤ t < ∞;
(ii) Supermartingale if and only if E(X_t | F_s) ≤ X_s a.s. for 0 ≤ s ≤ t < ∞;
(iii) Submartingale if and only if E(X_t | F_s) ≥ X_s a.s. for 0 ≤ s ≤ t < ∞.

Theorem (Kolmogorov) 4.2.
Let X = {X_t, F_t, t ≥ 0} be an integrable process. Define F_{t+} := ⋂_{s>t} F_s, and the partial augmentation of F by F⁰_t = σ(F_{t+}, N). Then if t ↦ E(X_t) is continuous, there exists an F⁰_t-adapted stochastic process X' = {X'_t, F⁰_t, t ≥ 0} with sample paths which are right continuous with left limits (CADLAG) such that X and X' are modifications of each other.

Definition 4.3.
A martingale X = {X_t, F_t, t ≥ 0} is said to be an L²-martingale or a square integrable martingale if E(X_t²) < ∞ for every t ≥ 0.

Definition 4.4.
A process X = {X_t, F_t, t ≥ 0} is said to be L^p bounded if and only if sup_{t≥0} E(|X_t|^p) < ∞. The space of L² bounded martingales is denoted by M_2, and the subspace of continuous L² bounded martingales is denoted by M_2^c.

Definition 4.5.
A process X = {X_t, F_t, t ≥ 0} is said to be uniformly integrable if and only if

  sup_{t≥0} E( |X_t| 1_{|X_t|≥N} ) → 0  as N → ∞.

Orthogonality of Martingale Increments

A frequently used property of a martingale M is the orthogonality of increments property, which states that for a square integrable martingale M, and Y ∈ F_s with E(Y²) < ∞,

  E[ Y (M_t - M_s) ] = 0  for t ≥ s.

Proof
Via the Cauchy-Schwarz inequality E|Y (M_t - M_s)| < ∞, and so

  E( Y (M_t - M_s) ) = E( E( Y (M_t - M_s) | F_s ) ) = E( Y E( M_t - M_s | F_s ) ) = 0.

A typical example is Y = M_s, whence E( M_s (M_t - M_s) ) = 0 is obtained. A common application is to the difference of two squares; let t ≥ s, then

  E( (M_t - M_s)² | F_s ) = E( M_t² | F_s ) - 2 M_s E( M_t | F_s ) + M_s²
                          = E( M_t² - M_s² | F_s ) = E( M_t² | F_s ) - M_s².

4.1. Stopping Times

A random variable T : Ω → [0, ∞) is a stopping (optional) time if and only if {ω : T(ω) ≤ t} ∈ F_t.

The following theorem is included as a demonstration of checking for stopping times, and may be skipped if desired.

Theorem 4.6.
T is a stopping time with respect to F_{t+} if and only if for all t ∈ [0, ∞), the event {T < t} is F_t measurable.

Proof
If T is an F_{t+} stopping time then for all t ∈ (0, ∞) the event {T ≤ t} is F_{t+} measurable. Thus for 1/n < t we have

  { T ≤ t - 1/n } ∈ F_{(t-1/n)+} ⊆ F_t,

so

  { T < t } = ⋃_{n=1}^∞ { T ≤ t - 1/n } ∈ F_t.

To prove the converse, note that if for each t ∈ [0, ∞) we have that {T < t} ∈ F_t, then for each such t

  { T < t + 1/n } ∈ F_{t+1/n},

as a consequence of which

  { T ≤ t } = ⋂_{n=1}^∞ { T < t + 1/n } ∈ ⋂_{n=1}^∞ F_{t+1/n} = F_{t+}.

Given a stochastic process X = (X_t)_{t≥0}, a stopped process X^T may be defined by

  X^T_t(ω) := X_{T(ω)∧t}(ω),
  F_T := { A ∈ F : A ∩ {T ≤ t} ∈ F_t }.

Theorem (Optional Stopping).
Let X be a right continuous, integrable, F_t adapted process. Then the following are equivalent:
(i) X is a martingale.
(ii) X^T is a martingale for all stopping times T.
(iii) E(X_T) = E(X_0) for all bounded stopping times T.
(iv) E(X_T | F_S) = X_S for all bounded stopping times S and T such that S ≤ T.
If in addition X is uniformly integrable, then (iv) holds for all stopping times (not necessarily bounded).

The condition which is most often forgotten is that in (iii) the stopping time T be bounded. To see why it is necessary, consider B_t a Brownian Motion starting from zero. Let T = inf{t ≥ 0 : B_t = 1}, clearly a stopping time. Equally B_t is a martingale with respect to the filtration generated by B itself, but it is also clear that E(B_T) = 1 ≠ E(B_0) = 0. Obviously in this case T is not bounded.

Theorem (Doob's Martingale Inequalities).
Let M = {M_t, F_t, t ≥ 0} be a uniformly integrable martingale, and let M* := sup_{t≥0} |M_t|. Then
(i) Maximal inequality. For λ ≥ 0,

  λ P(M* ≥ λ) ≤ E( |M_∞| 1_{M*≥λ} ).

(ii) L^p maximal inequality. For 1 < p ≤ ∞,

  ‖M*‖_p ≤ (p/(p-1)) ‖M_∞‖_p.

Note that the norm used in stating Doob's L^p inequality is defined by ‖M‖_p = ( E(|M|^p) )^{1/p}.

Theorem (Martingale Convergence).
Let M = {M_t, F_t, t ≥ 0} be a martingale.
(i) If M is L¹ bounded then M_∞(ω) := lim_{t→∞} M_t(ω) exists P-a.s.
(ii) If p = 1 and M is uniformly integrable then lim_{t→∞} M_t(ω) = M_∞(ω) in L¹. Then for all A ∈ L¹(F_∞), there exists a martingale A_t such that lim_{t→∞} A_t = A, and A_t = E(A | F_t). Here F_∞ := lim_{t→∞} F_t.
(iii) If p > 1, i.e. M is L^p bounded, then lim_{t→∞} M_t = M_∞ in L^p.

Definition 4.7.
Let M_2 denote the set of L²-bounded CADLAG martingales, i.e. martingales M such that

  sup_{t≥0} E(M_t²) < ∞.

Let M_2^c denote the set of L²-bounded CADLAG martingales which are continuous. A norm may be defined on the space M_2 by ‖M‖² = ‖M_∞‖_2² = E(M_∞²).

From the conditional Jensen's inequality, since f(x) = x² is convex,

  E( M_∞² | F_t ) ≥ ( E( M_∞ | F_t ) )² = M_t².

Hence taking expectations,

  E(M_t²) ≤ E(M_∞²),

and since by martingale convergence in L² we get E(M_t²) → E(M_∞²), it is clear that

  E(M_∞²) = sup_{t≥0} E(M_t²).

Theorem 4.8.
The space (M_2, ‖·‖) (up to equivalence classes defined by modifications) is a Hilbert space, with M_2^c a closed subspace.
Proof
We prove this by showing a one to one correspondence between M_2 (the space of square integrable martingales) and L²(F_∞). The bijection is obtained via

  f : M_2 → L²(F_∞),  f : (M_t)_{t≥0} ↦ M_∞ = lim_{t→∞} M_t,
  g : L²(F_∞) → M_2,  g : M_∞ ↦ M_t = E(M_∞ | F_t).

Notice that

  sup_t E(M_t²) = ‖M_∞‖_2² = E(M_∞²) < ∞,

as M is a square integrable martingale. As L²(F_∞) is a Hilbert space, M_2 inherits this structure.

To see that M_2^c is a closed subspace of M_2, consider a Cauchy sequence {M^(n)} in M_2; equivalently {M^(n)_∞} is Cauchy in L²(F_∞). Hence M^(n)_∞ converges to a limit, M_∞ say, in L²(F_∞). Let M_t := E(M_∞ | F_t); then

  sup_{t≥0} |M^(n)_t - M_t| → 0  in L²,

that is, M^(n) → M uniformly in L². Hence there exists a subsequence n(k) such that M^(n(k)) → M uniformly; as a uniform limit of continuous functions is continuous, M ∈ M_2^c. Thus M_2^c is a closed subspace of M_2. □

5. Basics

5.1. Local Martingales

A martingale has already been defined, but a weaker definition will prove useful for stochastic calculus. Note that I'll often drop references to the filtration F_t, but this nevertheless forms an essential part of the (local) martingale.

Just before we dive in and define a local martingale, maybe we should pause and consider the reason for considering them. The important property of local martingales will only be seen later in the notes; as we frequently see in this subject, it is one of stability: they are a class of objects which are closed under an operation, in this case under the stochastic integral. An integral of a previsible process with a local martingale integrator is again a local martingale.

Definition 5.1.
M = {M_t, F_t, 0 ≤ t < ∞} is a local martingale if and only if there exists a sequence of stopping times T_n tending to infinity such that M^{T_n} is a martingale for each n. The space of local martingales is denoted M_loc, and the subspace of continuous local martingales is denoted M_loc^c.
Recall that a martingale (X_t)_{t≥0} is said to be bounded if there exists a universal constant K such that for all ω and t ≥ 0, |X_t(ω)| ≤ K.

Theorem 5.2.
Every bounded local martingale is a martingale.

Proof
Let T_n be a sequence of stopping times as in the definition of a local martingale. This sequence tends to infinity, so pointwise X^{T_n}_t(ω) → X_t(ω). Using the conditional form of the dominated convergence theorem (with the constant bound as the dominating function), for t ≥ s ≥ 0,

  lim_{n→∞} E( X^{T_n}_t | F_s ) = E( X_t | F_s ).

But as X^{T_n} is a (genuine) martingale, E( X^{T_n}_t | F_s ) = X^{T_n}_s; so

  E( X_t | F_s ) = lim_{n→∞} E( X^{T_n}_t | F_s ) = lim_{n→∞} X^{T_n}_s = X_s.

Hence X_t is a genuine martingale. □

Proposition 5.3.
The following are equivalent:
(i) M = {M_t, F_t, 0 ≤ t < ∞} is a continuous martingale.
(ii) M = {M_t, F_t, 0 ≤ t < ∞} is a continuous local martingale and for all t ≥ 0, the set {M_T : T a stopping time, T ≤ t} is uniformly integrable.

Proof
(i) ⇒ (ii): By the optional stopping theorem, if T ≤ t then M_T = E(M_t | F_T); hence the set is uniformly integrable.
(ii) ⇒ (i): It is required to prove that E(M_T) = E(M_0) for any bounded stopping time T. By the local martingale property, for any n,

  E(M_0) = E(M_{T∧T_n});

uniform integrability then implies that

  lim_{n→∞} E(M_{T∧T_n}) = E(M_T).

5.2. Local Martingales which are not Martingales

There do exist local martingales which are not themselves martingales. The following is an example. Let B_t be a d-dimensional Brownian Motion starting from x. It can be shown using Itô's formula that a harmonic function of a Brownian motion is a local martingale (this is on the example sheet). From standard PDE theory it is known that for d ≥ 3, the function

  f(x) = 1/|x|^{d-2}

is harmonic, hence X_t = 1/|B_t|^{d-2} is a local martingale. Now consider the L^p norm of this local martingale:

  E_x |X_t|^p = ∫ (2πt)^{-d/2} exp( -|y - x|²/2t ) |y|^{-(d-2)p} dy.

Consider when this integral converges.
There are no divergence problems for |y| large; the potential problem lies in the vicinity of the origin. Here the term

  (2πt)^{-d/2} exp( -|y - x|²/2t )

is bounded, so we only need to consider the remainder of the integrand integrated over a ball of unit radius about the origin, which is bounded by

  C ∫_{B(0,1)} |y|^{-(d-2)p} dy

for some constant C, which on transformation into polar co-ordinates yields a bound of the form

  C' ∫_0^1 r^{-(d-2)p} r^{d-1} dr,

with C' another constant. This is finite if and only if -(d-2)p + (d-1) > -1 (standard integrals of the form 1/r^k). This in turn requires that p < d/(d-2). So clearly E_x |X_t| will be finite for all d ≥ 3.

Now although E_x |X_t| < ∞ and X_t is a local martingale, we shall show that it is not a martingale. Note that (B_t - x) has the same distribution as √t (B_1 - x) under P_x (the probability measure induced by the BM starting from x). So as t → ∞, |B_t| → ∞ in probability and X_t → 0 in probability. As X_t ≥ 0, we see that E_x(X_t) = E_x |X_t| < ∞. Now note that for any R > 0, we can construct a bound

  E_x(X_t) ≤ (2πt)^{-d/2} ∫_{|y|≤R} |y|^{-(d-2)} dy + R^{-(d-2)},

which converges, and hence

  lim sup_{t→∞} E_x(X_t) ≤ R^{-(d-2)}.

As R was chosen arbitrarily we see that E_x(X_t) → 0. But E_x(X_0) = |x|^{-(d-2)} > 0, which implies that E_x(X_t) is not constant, and hence X_t is not a martingale.

6. Total Variation and the Stieltjes Integral

Let A : [0, ∞) → ℝ be a CADLAG (continuous to the right, with left limits) process. Let a partition Π = {t_0, t_1, ..., t_m} have 0 = t_0 ≤ t_1 ≤ ... ≤ t_m = t; the mesh of the partition is defined by

  δ(Π) = max_{1≤k≤m} |t_k - t_{k-1}|.

The variation of A is then defined as the increasing process V given by

  V_t := sup_Π { Σ_{k=1}^{n(Π)} |A_{t_k} - A_{t_{k-1}}| : 0 = t_0 ≤ t_1 ≤ ... ≤ t_{n(Π)} = t }.

An alternative definition is given by

  V'_t := lim_{n→∞} Σ_{k=1} |A_{k 2^{-n} ∧ t} - A_{(k-1) 2^{-n} ∧ t}|.

These can be shown to be equivalent (for CADLAG processes), since trivially (using the dyadic partition) V_t ≥ V'_t; it is also possible to show that V'_t ≥ V_t for the total variation of a CADLAG process.

Definition 6.1.
A process A is said to have finite variation if the associated variation process V is finite (i.e. if for every t and every ω, |V_t(ω)| < ∞).

6.1. Why we need a Stochastic Integral

Before delving into the depths of the integral, it's worth stepping back for a moment to see why the 'ordinary' integral cannot be used on a path at a time basis (i.e. separately for each ω ∈ Ω). Suppose we were to do this, i.e. set

  I_t(X)(ω) = ∫_0^t X_s(ω) dM_s(ω),

for M ∈ M_2^c; but for an interesting martingale (i.e. one which isn't zero a.s.), the total variation is not finite, even on a bounded interval like [0, T]. Thus the Lebesgue-Stieltjes integral definition isn't valid in this case.

To generalise, we shall see that the quadratic variation is actually the 'right' variation to use (higher variations turn out to be zero and lower ones infinite, which is easy to prove by considering the variation expressed as the limit of a sum and factoring it as a maximum multiplied by the quadratic variation, the first term of which tends to zero by continuity). But to start, we shall consider integrating a previsible process H_t with an integrator which is an increasing finite variation process.

First we shall prove that a continuous local martingale of finite variation is zero.

Proposition 6.2.
If M is a continuous local martingale of finite variation, starting from zero, then M is identically zero.

Proof
Let V be the variation process of M. This V is a continuous, adapted process. Now define a sequence of stopping times S_n as the first time V exceeds n, i.e. S_n := inf{t ≥ 0 : V_t ≥ n}. Then the martingale M^{S_n} is of bounded variation. It therefore suffices to prove the result for a bounded, continuous martingale M of bounded variation. Fix t ≥ 0 and let {0 = t_0, t_1, ..., t_N = t} be a partition of [0, t]. Then since M_0 = 0 it is clear that

  M_t² = Σ_{k=1}^N ( M_{t_k}² - M_{t_{k-1}}² ).
Then via orthogonality of martingale increments,

  E(M_t²) = E( Σ_{k=1}^N ( M_{t_k} - M_{t_{k-1}} )² ) ≤ E( V_t sup_k |M_{t_k} - M_{t_{k-1}}| ).

The integrand is bounded by n² (from the definition of the stopping time S_n), hence the expectation converges to zero as the modulus of the partition tends to zero, by the bounded convergence theorem. Hence M ≡ 0. □

6.2. Previsibility

The term previsible has crept into the discussion earlier. Now is the time for a proper definition.

Definition 6.3.
The previsible (or predictable) σ-field P is the σ-field on ℝ⁺ × Ω generated by the processes (X_t)_{t≥0}, adapted to F_t, with left continuous paths on (0, ∞).

Remark
The same σ-field is generated by left continuous, right limits processes (i.e. caglad processes) which are adapted to F_{t-}, or indeed continuous processes (X_t)_{t≥0} which are adapted to F_t. It is generated by sets of the form A × (s, t] where A ∈ F_s. It should be noted that cadlag processes generate the optional σ-field, which is usually different.

Theorem 6.4.
The previsible σ-field is also generated by the collection of random sets A × {0} where A ∈ F_0, and A × (s, t] where A ∈ F_s.

Proof
Let the σ-field generated by the above collection of sets be denoted P'. We shall show P' = P. Let X be a left continuous process; define for n ∈ ℕ

  X^n(t) = X_0 1_{{0}}(t) + Σ_{k≥0} X_{k/2^n} 1_{(k/2^n, (k+1)/2^n]}(t).

It is clear that X^n ∈ P'. As X is left continuous, the above sequence of left-continuous processes converges pointwise to X, so X is P'-measurable; thus P ⊆ P'. Conversely, consider the indicator function of A × (s, t]; this can be written as 1_{[0,t] ∖ [0, s_A]}, where s_A(ω) = s for ω ∈ A and +∞ otherwise. These indicator functions are adapted and left continuous, hence P' ⊆ P. □

Definition 6.5.
A process (X_t)_{t≥0} is said to be previsible if the mapping (t, ω) ↦ X_t(ω) is measurable with respect to the previsible σ-field P.

6.3.
Lebesgue-Stieltjes Integral

In the lecture notes for this course, the Lebesgue-Stieltjes integral is considered first for functions A and H; here I consider processes on a pathwise basis. Let A be an increasing cadlag process. This induces a Borel measure dA on (0, ∞) such that

  dA((s, t])(ω) = A_t(ω) - A_s(ω).

Let H be a previsible process (as defined above). The Lebesgue-Stieltjes integral of H with respect to an increasing process A is defined by

  (H · A)_t(ω) = ∫_0^t H_s(ω) dA_s(ω),

whenever H ≥ 0 or (|H| · A)_t < ∞.

As a notational aside, we shall write

  (H · A)_t ≡ ∫_0^t H dX,

and later on we shall use d(H · X) ≡ H dX.

This definition may be extended to integrators of finite variation which are not increasing, by decomposing the process A of finite variation into a difference of two increasing processes, so A = A⁺ - A⁻, where A^± = (V ± A)/2 (here V is the total variation process for A). The integral of H with respect to the finite variation process A is then defined by

  (H · A)_t(ω) := (H · A⁺)_t(ω) - (H · A⁻)_t(ω),

whenever (|H| · V)_t < ∞.

There are no really new concepts of the integral in the foregoing; it is basically the Lebesgue-Stieltjes integral extended from functions H(t) to processes in a pathwise fashion (that's why ω has been included in those definitions as a reminder).

Theorem 6.6.
If X is a non-negative continuous local martingale and E(X_0) < ∞, then X is a supermartingale. If additionally X has constant mean, i.e. E(X_t) = E(X_0) for all t, then X is a martingale.

Proof
As X is a continuous local martingale, there is a sequence of stopping times T_n ↑ ∞ such that X^{T_n} is a genuine martingale. From this martingale property,

  E( X^{T_n}_t | F_s ) = X^{T_n}_s.

As X_t ≥ 0 we can apply the conditional form of Fatou's lemma, so

  E( X_t | F_s ) = E( lim inf_{n→∞} X^{T_n}_t | F_s ) ≤ lim inf_{n→∞} E( X^{T_n}_t | F_s ) = lim inf_{n→∞} X^{T_n}_s = X_s.

Hence E(X_t | F_s) ≤ X_s, so X is a supermartingale.

Given the constant mean property, E(X_t) = E(X_s).
Let

  A_n := { ω : X_s - E(X_t | F_s) > 1/n },

so

  A := ⋃_{n=1}^∞ A_n = { ω : X_s - E(X_t | F_s) > 0 }.

Consider P(A) = P( ⋃_{n=1}^∞ A_n ) ≤ Σ_{n=1}^∞ P(A_n). Suppose for some n, P(A_n) = ε > 0; then note that

  ω ∈ A_n :  X_s - E(X_t | F_s) > 1/n,
  ω ∉ A_n :  X_s - E(X_t | F_s) ≥ 0.

Hence

  X_s - E(X_t | F_s) ≥ (1/n) 1_{A_n};

taking expectations yields

  E(X_s) - E(X_t) ≥ ε/n,

but by the constant mean property the left hand side is zero; hence a contradiction. Thus all the P(A_n) are zero, so

  X_s = E(X_t | F_s)  a.s. □

7. The Integral

We would like eventually to extend the definition of the integral to integrands which are previsible processes and integrators which are semimartingales (to be defined later in these notes). In fact in these notes we'll only get as far as continuous semimartingales; but it is possible to go the whole way and define the integral of a previsible process with respect to a general semimartingale, although some extra problems are thrown up on the way, in particular as regards the construction of the quadratic variation process of a discontinuous process.

Various special classes of process will be needed in the sequel and these are all defined here for convenience. Naturally, with terms like 'elementary' and 'simple' occurring, many books have different names for the same concepts, so beware!

7.1. Elementary Processes

An elementary process H_t(ω) is one of the form

  H_t(ω) = Z(ω) 1_{(S(ω), T(ω)]}(t),

where S, T are stopping times, S ≤ T ≤ ∞, and Z is a bounded F_S-measurable random variable. Such a process is the simplest non-trivial example of a previsible process. Let's prove that it is previsible: H is clearly a left continuous process, so we need only show that it is adapted. It can be considered as the pointwise limit of a sequence of right continuous processes,

  H_t = lim_{n→∞} Z 1_{[S_n, T_n)}(t),  S_n = S + 1/n,  T_n = T + 1/n.

So it is sufficient to show that Z 1_{[U,V)} is adapted when U and V are stopping times which satisfy U ≤ V, and Z is a bounded F_U-measurable function.
Let B be a Borel set of ℝ; then the event

  { Z 1_{[U,V)}(t) ∈ B } = [ {Z ∈ B} ∩ {U ≤ t} ] ∩ {V > t}.

By the definition of U as a stopping time, and hence the definition of F_U, the event enclosed by square brackets is in F_t, and since V is a stopping time, {V > t} = Ω ∖ {V ≤ t} is also in F_t; hence Z 1_{[U,V)} is adapted.

7.2. Strictly Simple and Simple Processes

A process H is strictly simple (H ∈ L*) if there exist 0 ≤ t_0 ≤ ... ≤ t_n < ∞ and uniformly bounded F_{t_k}-measurable random variables Z_k such that

  H_t(ω) = H_0(ω) 1_{{0}}(t) + Σ_{k=0}^{n-1} Z_k(ω) 1_{(t_k, t_{k+1}]}(t).

This can be extended to simple processes (H ∈ L): H is simple if there exists a sequence of stopping times 0 ≤ T_0 ≤ ... ≤ T_k → ∞, and uniformly bounded F_{T_k}-measurable random variables Z_k, such that

  H_t(ω) = H_0(ω) 1_{{0}}(t) + Σ_{k=0}^∞ Z_k 1_{(T_k, T_{k+1}]}(t).

Similarly, a simple process is also a previsible process. The fundamental result will follow from the fact that the σ-algebra generated by the simple processes is exactly the previsible σ-algebra. We shall see the application of this after the next section.
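To make the role of simple processes concrete, here is a small numerical sketch (my own illustration, not from the notes; the symmetric random walk and the choice Z_k = sign(M_{t_k}) are assumptions made purely for the demonstration). For a simple integrand H = Σ_k Z_k 1_{(t_k, t_{k+1}]} with deterministic times and a martingale integrator M, the natural integral is the martingale transform Σ_k Z_k (M_{t_{k+1}} - M_{t_k}); previsibility of the Z_k (each depends only on information up to t_k) is what makes the result again a zero-mean martingale.

```python
import numpy as np

# Sketch: stochastic integral of a simple previsible process against a
# martingale, realised as a discrete martingale transform.  M is a
# symmetric random walk; Z_k = sign(M_{t_k}) is bounded and uses only
# information available at time t_k (previsibility).
rng = np.random.default_rng(1)
n_paths, n_steps = 20000, 50
steps = rng.choice([-1.0, 1.0], size=(n_paths, n_steps))
M = np.cumsum(steps, axis=1)
M = np.concatenate([np.zeros((n_paths, 1)), M], axis=1)  # M_0 = 0

Z = np.sign(M[:, :-1])                             # previsible integrand
integral = np.sum(Z * np.diff(M, axis=1), axis=1)  # (H . M)_T per path

assert abs(integral.mean()) < 0.2  # zero-mean property of the transform
```

The same pattern, with stopping times in place of the deterministic t_k and a continuous martingale in place of the walk, is what the stochastic integral of the following sections extends by completion. Sampling Z at the right endpoint instead would break previsibility and, in general, the zero-mean property.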
