Transcript and Presenter's Notes

Title: Stochastic Processes


1
Stochastic Processes
  • Shane Whelan
  • L551

2
Review of Last 3 Lectures (Chapter 0)
  • Real Modelling, Not Mathematics
  • Classifying Models
  • Components of a Model
  • Building a Model: 10 Helpful Steps
  • Advantages of Modelling
  • Drawbacks of Modelling (that must be guarded
    against)
  • Key points to assess the suitability of a model.
  • Some further considerations in modelling:
    example/exercise on correlations.
  • Case Study: Lessons from econometric modelling in
    the UK over the last 4 decades.

3
Next 2½ Lectures: Part I of Chapter 1
  • Basic terminology
  • Stochastic process; sample path; m-increment;
    stationary increment.
  • Foundational concepts
  • Stationary process; weak stationarity; Markov
    property; martingale; discrete-time stopping
    time.
  • Some elementary examples
  • White noise; random walk; moving average (MA).
  • Some important practical examples
  • Poisson process; compound Poisson process;
    Brownian motion (or Wiener process).

4
Chapter 1
  • Basic terminology and foundational concepts of
    stochastic processes

5
Definition of Stochastic Process
  • Definition: A stochastic process is a sequence or
    continuum of random variables indexed by an
    ordered set T.
  • Generally, of course, T records time.
  • A stochastic process is often denoted {Xt}, t∈T.
    I prefer <Xt>, t∈T, so as to avoid confusion with
    the state space.
  • Examples
  • Quick Question with a Surprising Answer: Let <Xt>,
    t∈ℤ, be iid with E[Xt] = 0 and E[Xt²] < ∞. Prove
    that the correlation between Xt and the increment
    Xt+1 − Xt is −(2)^−½ (checked numerically below).
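A minimal numerical sketch, not part of the original slides, of the claim above under the reading that the second variable is the next increment Xt+1 − Xt; the normal distribution for Xt and the use of numpy are assumptions made purely for illustration.

    # Sketch: empirically check Corr(X_t, X_{t+1} - X_t) for an iid, zero-mean sequence.
    # The theoretical value is -1/sqrt(2) = -(2)^(-1/2).
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=1_000_000)                   # iid, mean 0, finite variance
    corr = np.corrcoef(x[:-1], x[1:] - x[:-1])[0, 1]
    print(corr, -1 / np.sqrt(2))                     # both approximately -0.7071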

6
Defining a Given Stochastic Process
  • Defining (or wholly understanding) <Xt>, t∈T,
    amounts to defining the joint distribution of
    (Xt1, Xt2, …, Xtn) for all ti and all n.
  • Not easy to do, and very cumbersome.
  • So we generally use indirect means, e.g., by
    defining the transition process.
  • A sample path of the process is a joint
    realisation of the random variables Xt, for all
    t∈T.
  • A sample path is a function from T to the state
    space.
  • Each sample path has an associated probability.

7
2-Dimensional Distribution
8
Stationarity
  • Definition: A stochastic process is said to be
    stationary if the joint distributions of (Xt1,
    Xt2, …, Xtn) and (Xt1+k, Xt2+k, …, Xtn+k) are the
    same for all t1, …, tn, all k, and all n.
  • Hence statistical properties are unaffected by a
    time shift.
  • In particular, Xt and Xt+k have the same
    distribution.
  • A stringent requirement, difficult to test.
  • The assumption of stationarity "sweats" the data:
    it allows maximum use of the available data.
  • Is the stochastic process of life stationary?
  • Try to think of a stationary process which is not
    iid.

9
Weak Stationarity
  • Definition: A stochastic process is said to be
    weakly stationary if
  • E[Xt] = E[Xk] for all t and k, and
  • Cov(Xt, Xt+m) is a function of m only, for all
    t and m.
  • Remarks
  • Strong stationarity implies weak stationarity.
  • The concept is used extensively in time series
    analysis.
  • Remark: Weak stationarity is not a foundational
    concept; it says relatively little about the
    underlying distribution and dependence structure.
    It is more practical, though. (A numerical check
    is sketched below.)
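As a rough illustration (not from the slides), the two conditions can be checked empirically for a candidate series; the moving-average-type construction below anticipates a process defined later in this chapter, and the coefficient 0.5 is an arbitrary choice.

    # Sketch: check the two weak-stationarity conditions on X_t = Z_t + 0.5 Z_{t-1}.
    import numpy as np

    rng = np.random.default_rng(1)
    z = rng.normal(size=100_000)                 # white noise Z_t
    x = z[1:] + 0.5 * z[:-1]                     # X_t = Z_t + 0.5 Z_{t-1}

    # 1) E[X_t] should not depend on t: compare means over early and late windows.
    print(x[:50_000].mean(), x[50_000:].mean())

    # 2) Cov(X_t, X_{t+m}) should depend only on m: compare the lag-1 covariance
    #    estimated from two different stretches of the series (both are near 0.5).
    def lag_cov(a, m):
        return np.cov(a[:-m], a[m:])[0, 1]

    print(lag_cov(x[:50_000], 1), lag_cov(x[50_000:], 1))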

10
Increments
  • Consider Xt+m − Xt. This is known as an
    m-increment of the process.
  • Defining how the process evolves through time is
    often easier to get a handle on, and a more
    natural description of the process (e.g.,
    evolution, many games, etc.).
  • A process is said to have independent increments
    if Xt+m − Xt is independent of the past of the
    process, for all t and m.
  • A process is said to have stationary increments
    if the distribution of Xt+m − Xt depends only on
    m, not on t.

11
The Markov Property
  • When the future evolution of the system depends
    only on its current state (it is not affected by
    the past), the system has the Markov property.
  • Definition: Let <Xt>, t∈ℕ (the natural numbers),
    be a (discrete-time) stochastic process. Then
    <Xt> is said to have the Markov property if, ∀t,
  • P[Xt+1 | Xt, Xt−1, Xt−2, …, X0] = P[Xt+1 | Xt].
  • Definition: Let <Xt>, t∈ℝ (the real numbers), be
    a (continuous-time) stochastic process. Then <Xt>
    is said to have the Markov property if, ∀t and
    all sets A,
  • P[Xt∈A | Xs1=x1, Xs2=x2, …, Xs=x] = P[Xt∈A | Xs=x],
  • where s1 < s2 < … < s < t.

12
Markov Processes
  • Definition: A stochastic process that has the
    Markov property is known as a Markov process.
  • If the state space and time are both discrete,
    the process is known as a Markov chain (see
    Chapter 2).
  • When the state space is discrete but time is
    continuous, it is known as a Markov jump process
    (see Chapter 3).

13
To Prove
  • Lemma 1.1: A process with independent increments
    has the Markov property.
  • Proof: On board.
  • Lemma 1.2: Our definition of the Markov property
    (discrete time) is equivalent to
  • P[Xt+1 | Xs, Xs−1, Xs−2, …, X0] = P[Xt+1 | Xs],
    where s ≤ t.
  • Proof: On board.

14
Examples of Stochastic Processes
  • Discrete White Noise
  • A sequence of independent, identically distributed
    random variables, Z0, Z1, Z2, …
  • Important sub-classifications include zero-mean
    white noise, i.e., E[Zi] = 0; symmetric white
    noise; etc., with the obvious meanings.
  • General random walk
  • Let Z0, Z1, Z2, … be white noise and define
  • Xn = x0 + Z1 + Z2 + … + Zn, with X0 = x0. Then
    <Xn> is a random walk.
  • It is a discrete-time Markov process that is not
    weakly stationary (see the sketch below).
  • When Zt can only take the values ±1, the process
    is known as a simple random walk. Generally we
    set X0 = 0.
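A minimal simulation sketch (illustration only, with arbitrarily chosen parameters): for a simple random walk started at 0 with p = ½, the variance of Xn grows roughly like n, so the process cannot be weakly stationary.

    # Sketch: simulate many simple random walk paths and watch Var(X_n) grow with n.
    import numpy as np

    rng = np.random.default_rng(2)
    p, n_steps, n_paths = 0.5, 200, 10_000
    steps = rng.choice([1, -1], size=(n_paths, n_steps), p=[p, 1 - p])
    paths = np.cumsum(steps, axis=1)             # X_1, ..., X_n for each path, X_0 = 0

    for n in (10, 50, 200):
        print(n, paths[:, n - 1].var())          # roughly n when p = 1/2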

15
More Special Processes: MA(p)
  • Let Z1, Z2, Z3, … be white noise and let θ1, …, θp
    be real numbers. Then <Xn> is a moving average
    process of order p iff
  • Xn = Zn + θ1·Zn−1 + θ2·Zn−2 + … + θp·Zn−p.
  • Note the process is stationary but not iid.
  • Moving average processes are stationary but not,
    in general, Markovian.
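A minimal sketch of the form written above, with p = 2 and arbitrary coefficients (an assumed illustration, not from the slides): the sample autocovariance depends only on the lag and is approximately zero beyond lag p, consistent with stationarity.

    # Sketch: build an MA(2) process X_n = Z_n + 0.6 Z_{n-1} + 0.3 Z_{n-2} from
    # white noise and estimate its autocovariance at lags 0..4.
    import numpy as np

    rng = np.random.default_rng(3)
    theta = [1.0, 0.6, 0.3]                      # theta_0 = 1 by convention here
    z = rng.normal(size=200_000)
    x = np.convolve(z, theta, mode="valid")      # X_n = sum_i theta_i * Z_{n-i}

    def lag_cov(a, m):
        return a.var() if m == 0 else np.cov(a[:-m], a[m:])[0, 1]

    for m in range(5):
        print(m, round(lag_cov(x, m), 3))        # lags 3 and 4 are approximately 0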

16
Poisson Process
  • Definition: A Poisson process with rate λ is a
    continuous-time process <Nt>, t ≥ 0, such that
  • N0 = 0;
  • <Nt> has independent increments;
  • <Nt> has Poisson-distributed increments, i.e.,
  • P[Nt+s − Ns = n] = e^(−λt) (λt)^n / n!,
  • where n ∈ ℕ.
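A minimal simulation sketch (illustrative only) using the standard construction of a Poisson process from exponential inter-arrival times, which is not stated on this slide; it checks that Nt has mean and variance close to λt.

    # Sketch: simulate N_t for a Poisson process of rate lam; E[N_t] = Var[N_t] = lam*t.
    import numpy as np

    rng = np.random.default_rng(4)
    lam, t, n_paths = 2.0, 5.0, 20_000

    counts = np.empty(n_paths, dtype=int)
    for i in range(n_paths):
        arrivals = np.cumsum(rng.exponential(1 / lam, size=int(3 * lam * t) + 30))
        counts[i] = np.searchsorted(arrivals, t)     # N_t = number of arrivals <= t

    print(counts.mean(), counts.var(), lam * t)      # all approximately 10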

17
Remarks on Poisson Process
  • The Poisson process is a Markov jump process,
    i.e., Markovian with a discrete state space in
    continuous time.
  • It is not even weakly stationary.
  • Think of it as the stochastic generalisation of
    the deterministic natural numbers: stochastic
    counting.
  • A central process in insurance and finance, due to
    its role as the natural stochastic counting
    process, e.g., for the number of claims.

18
Compound Poisson Process
  • Definition: Let <Nt> be a Poisson process and
    let Z1, Z2, Z3, … be white noise. Then <Xt> is
    said to be a compound Poisson process where
  • Xt = Z1 + Z2 + … + ZNt,
  • with the convention that when Nt = 0, Xt = 0.
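A minimal sketch (illustration only): simulate Xt at a fixed time t and check the standard identity E[Xt] = λt·E[Z]; the exponential claim sizes and all parameter values are arbitrary assumptions.

    # Sketch: compound Poisson value at time t, X_t = Z_1 + ... + Z_{N_t}.
    import numpy as np

    rng = np.random.default_rng(5)
    lam, t, n_paths, mean_z = 1.5, 4.0, 50_000, 10.0

    n_t = rng.poisson(lam * t, size=n_paths)         # claim counts N_t ~ Poisson(lam*t)
    x_t = np.array([rng.exponential(mean_z, size=n).sum() for n in n_t])
    # convention: if N_t = 0 the sum is empty and X_t = 0

    print(x_t.mean(), lam * t * mean_z)              # both approximately 60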

19
Remarks on Compound Poisson Process
  • We are stochastically counting incidences of an
    event with a stochastic payoff.
  • The Markov property holds.
  • Important as a model for cumulative claims on an
    insurance company: the Cramér-Lundberg model,
    after Lundberg's Uppsala thesis of 1903, and the
    basis of classical risk theory.
  • The key problem in classical risk theory is
    estimating the probability of ruin,
  • i.e., ψ(u) = P(u + ct − Xt < 0, for some t > 0).
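A crude Monte Carlo sketch (illustration only) of a finite-horizon version of this probability, P(u + ct − Xt < 0 for some t ≤ T); the premium rate c, initial surplus u, claim distribution and horizon T are all assumed parameters, not taken from the slides.

    # Sketch: estimate a finite-horizon ruin probability for the compound Poisson model.
    # Ruin can first occur only at claim instants, so only those times are checked.
    import numpy as np

    rng = np.random.default_rng(6)
    u, c, lam, mean_claim, T, n_paths = 10.0, 12.0, 1.0, 10.0, 100.0, 5_000

    ruined = 0
    for _ in range(n_paths):
        times = np.cumsum(rng.exponential(1 / lam, size=int(2 * lam * T) + 50))
        times = times[times <= T]                    # claim instants in [0, T]
        claims = rng.exponential(mean_claim, size=times.size)
        surplus = u + c * times - np.cumsum(claims)  # surplus just after each claim
        if surplus.size and surplus.min() < 0:
            ruined += 1

    print(ruined / n_paths)                          # crude estimate of psi(u) over [0, T]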

20
Brownian Motion (or Wiener Process)
  • Definition: Brownian motion, <Bt>, t ≥ 0, is a
    stochastic process with state space ℝ (the real
    line) such that
  • B0 = 0;
  • <Bt> has independent increments;
  • Bt − Bs is distributed N(μ(t − s), σ²(t − s));
  • <Bt> has continuous sample paths.
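A minimal simulation sketch (illustration only): approximate a Brownian path on a time grid by summing independent Gaussian increments and check the distribution of Bt − Bs; the drift, variance and grid size are arbitrary assumptions.

    # Sketch: Brownian motion with drift mu and variance parameter sigma^2 on a grid.
    import numpy as np

    rng = np.random.default_rng(7)
    mu, sigma, dt, n_steps, n_paths = 0.5, 2.0, 0.01, 1_000, 20_000

    increments = rng.normal(mu * dt, sigma * np.sqrt(dt), size=(n_paths, n_steps))
    b = np.cumsum(increments, axis=1)                # B on the grid, with B_0 = 0

    s_idx, t_idx = 300, 800                          # times s and t with t - s = 5.0
    diff = b[:, t_idx] - b[:, s_idx]
    print(diff.mean(), mu * 5.0)                     # approximately mu * (t - s)
    print(diff.var(), sigma**2 * 5.0)                # approximately sigma^2 * (t - s)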

21
Remarks on Brownian Motion
  • Gaussian = Normal.
  • μ is known as the drift.
  • Standard Brownian motion is the case B0 = 0,
    μ = 0, and σ² = 1.
  • Sample paths have no jumps.
  • This is the continuous-time analogue of a random
    walk (as we'll see in Semester 2).
  • By the CLT, Bt is the limiting continuous
    stochastic process for a wide class of
    discrete-time processes.
  • Simpler definition: Brownian motion is a
    continuous process with independent Gaussian
    increments.

22
Question 1-A
  • Let <Xt> be a simple random walk with the
    probability of an upward move given by p.
    Calculate
  • P(X2 = 2, X5 = 3 | X0 = 0)
  • P(X2 = 0, X4 = 2 | X0 = 0)
  • Is the random walk stationary?
  • What is the joint distribution of (X2, X4), given
    X0 = 0?
  • Prove that <Xt> has the Markov property.
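A brute-force sketch (illustration only, with an arbitrary value of p) that computes such finite-dimensional probabilities by enumerating every ±1 step sequence; it can be used to check answers to the first two parts.

    # Sketch: P(X_t = x_t for every listed (t, x_t) | X_0 = 0) for a simple random walk.
    from itertools import product

    def path_prob(events, n_steps, p):
        total = 0.0
        for steps in product([1, -1], repeat=n_steps):
            path, x = [], 0
            for s in steps:
                x += s
                path.append(x)                       # path[t-1] is X_t
            if all(path[t - 1] == v for t, v in events):
                ups = steps.count(1)
                total += p**ups * (1 - p)**(n_steps - ups)
        return total

    p = 0.6                                          # arbitrary upward-move probability
    print(path_prob([(2, 2), (5, 3)], 5, p))         # P(X_2 = 2, X_5 = 3 | X_0 = 0)
    print(path_prob([(2, 0), (4, 2)], 4, p))         # P(X_2 = 0, X_4 = 2 | X_0 = 0)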

23
Martingales and Stopping Times
  • (Part II of Chapter 1)

24
Martingales in Discrete Time
  • A discrete-time stochastic process <Xt>, t ≥ 0, is
    said to be a martingale if
  • E[|Xt|] < ∞ for all t, and
  • E[Xn | X0, …, Xm−1, Xm] = Xm for all m < n.
  • Explanation: the current value Xm is the optimal
    estimator of all future values. All information
    known by time m about the future of the process
    is factored into Xm.
  • A generalisation of the notion of a fair game.
  • A useful concept in probability theory, as many
    important limit theorems can be proved for
    martingales.
  • The building block of much of capital market
    theory.
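A minimal numerical sketch (illustration only): for a simple symmetric random walk, the martingale property can be checked by conditioning on the value of Xm, which is sufficient here because the walk is Markov; the times m and n are arbitrary choices.

    # Sketch: check E[X_n | X_m = level] is approximately level for a symmetric walk.
    import numpy as np

    rng = np.random.default_rng(8)
    m, n, n_paths = 5, 20, 200_000
    steps = rng.choice([1, -1], size=(n_paths, n), p=[0.5, 0.5])
    x = np.cumsum(steps, axis=1)                     # X_1, ..., X_n; zero-mean increments

    for level in (-3, -1, 1, 3):
        sel = x[:, m - 1] == level                   # paths with X_m = level
        print(level, x[sel, n - 1].mean())           # approximately equal to level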

25
Conditional Expectation Recap
26
Conditional Expectation: Key Properties
  • Property of iterated expectations:
  • E[E[X|Y]] = E[X],
  • which generalises to E[E[X | Y1, Y2, …, Yn]] = E[X].
  • If X is a constant then E[X|Y] = X.
  • If C is a constant then E[CX|Y] = C·E[X|Y].
  • If X is independent of Y then E[X|Y] = E[X].
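A tiny discrete check (illustration only) of the iterated-expectations property, using an arbitrary small joint distribution for (X, Y).

    # Sketch: verify E[E[X|Y]] = E[X] on a small joint pmf.
    import numpy as np

    # joint pmf p(x, y): rows are x in {0, 1, 2}, columns are y in {0, 1}
    p = np.array([[0.10, 0.15],
                  [0.20, 0.25],
                  [0.05, 0.25]])
    xs = np.array([0.0, 1.0, 2.0])

    p_y = p.sum(axis=0)                                    # marginal of Y
    e_x_given_y = (xs[:, None] * p).sum(axis=0) / p_y      # E[X | Y = y] for each y

    print((e_x_given_y * p_y).sum())                       # E[E[X|Y]]
    print((xs[:, None] * p).sum())                         # E[X]; the two agree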

27
Simple Property of Martingales
  • Lemma 1.3: If <Xt> is a martingale then
  • E[Xt] = E[X0].
  • Proof: Immediate (take expectations through the
    martingale property E[Xt | X0] = X0).

28
Two Lemmas
  • Lemma 1.4: For every function f(·),
  • E[f(Y)·X | Y] = f(Y)·E[X | Y].
  • Lemma 1.5: E[X | Y1, …, Yn] is the optimal
    estimator of X based on Y1, …, Yn in the sense
    that, for every function f(·),
  • E[(X − E[X | Y1, …, Yn])²] ≤ E[(X − f(Y1, …, Yn))²].
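A minimal numerical sketch (illustration only) of the optimality statement in Lemma 1.5 for a single conditioning variable, under an assumed Gaussian example where E[X|Y] has the closed form Y/2.

    # Sketch: among functions of Y, E[X|Y] minimises the mean squared error E[(X - f(Y))^2].
    import numpy as np

    rng = np.random.default_rng(9)
    x = rng.normal(size=500_000)
    y = x + rng.normal(size=500_000)                 # Y = X + independent noise

    candidates = [("E[X|Y] = Y/2", 0.5 * y), ("f(Y) = Y", y), ("f(Y) = 0.8Y", 0.8 * y)]
    for name, est in candidates:
        print(name, np.mean((x - est) ** 2))         # the first line is the smallest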