## Some Basic Definitions (Stochastic Processes Introduction)

**Experiment:** Any activity or situation having an uncertain outcome.

**Sample Space:** The set of all possible outcomes is called the sample space *Ω*, and every element *ω* of *Ω* is called a sample point. In the study of stochastic processes it is called the state space.

**Event and Event Space:** An event is a subset of the sample space. The class of all events associated with a given experiment is defined to be the event space.

An event will always be a subset of the sample space, but for sufficiently large sample spaces, not all subsets will be events. Thus the class of all subsets of the sample space will not necessarily correspond to the event space.

**Random Variable:**

A random variable is a function that assigns a real number to each outcome of a random experiment. The occurrence of the outcomes follows a certain probability distribution; therefore, a random variable is completely characterized by its probability density function (PDF). Equivalently:

A random variable is a map $X: \Omega \to \mathbb{R}$ such that $\{X \le x\} = \{\omega \in \Omega : X(\omega) \le x\} \in \mathfrak{F}$ for all $x \in \mathbb{R}$.
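The definition above can be made concrete with a small sketch: a finite sample space, a random variable as an ordinary function on it, and the event $\{X \le x\}$ as a subset of the sample space. The die-roll setting and the indicator variable are assumed examples, not from the text.

```python
# Finite sample space for one roll of a fair six-sided die.
omega = [1, 2, 3, 4, 5, 6]

# A random variable is a map X: Omega -> R; here X is the (hypothetical)
# indicator that the roll is even.
def X(w):
    return 1.0 if w % 2 == 0 else 0.0

# The event {X <= x} = {w in Omega : X(w) <= x} is a subset of Omega.
def event_X_leq(x):
    return {w for w in omega if X(w) <= x}

print(event_X_leq(0.5))  # the odd rolls: {1, 3, 5}
```

Note that every set `event_X_leq(x)` is a subset of `omega`, which is exactly the measurability condition in the formal definition.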

**Probability Space:** A probability space $(\Omega, \mathfrak{F}, P)$ consists of three parts: a sample space $\Omega$, a collection $\mathfrak{F}$ of events, and a probability measure $P$.

**Cumulative Distribution Function (CDF):** The probability distribution function of the random variable *X*, defined by $F(a) = P\{X \le a\}$.
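A minimal sketch of the CDF for a discrete random variable, assuming (hypothetically) that *X* counts heads in two fair coin flips:

```python
from fractions import Fraction

# Assumed example: X = number of heads in two fair coin flips,
# with probability mass function on {0, 1, 2}.
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

def F(a):
    """CDF: F(a) = P{X <= a}, summing the mass at or below a."""
    return sum(p for x, p in pmf.items() if x <= a)

print(F(0), F(1), F(2))  # 1/4 3/4 1
```

As the output shows, *F* is non-decreasing and reaches 1 at the largest value of *X*.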

**Time:** A point in time, either discrete or continuous.

**State:** It describes the attributes of a system at some point in time, $S = (s_1, s_2, \ldots, s_t)$.

It is convenient to assign some unique non-negative integer as index to each possible value of the state vector *S*.
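The indexing convention above can be sketched as follows; the two-component system and its 'up'/'down' values are assumed for illustration only.

```python
from itertools import product

# Hypothetical system of two components, each either 'up' or 'down'.
# Enumerate every possible state vector and assign each a unique
# non-negative integer index.
states = list(product(['up', 'down'], repeat=2))
index_of = {s: i for i, s in enumerate(states)}

print(index_of[('up', 'up')])  # 0
```

With this map, algorithms can refer to states by integer index rather than carrying the full state vector around.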

**Activity:** Something that takes some amount of time (a duration) to occur; an activity culminates in an event.

**Transition:** A transition is caused by an event and results in a movement from one state to another.

**Probability Measure:** A probability measure assigns to each event a probability in $[0, 1]$. Ideally it would be defined for all subsets of *Ω*, but in general it is defined only on the event space $\mathfrak{F}$.

**What is a Stochastic Process?**

The word *stochastic* is derived from the Greek *stochastikos*, meaning “to aim at a target.” A stochastic process involves a state which changes in a random way.

Given a **probability space** $(\Omega, \mathfrak{F}, P)$, a stochastic process $\{X(t), t \in T\}$ is a family of random variables, where the index set *T* may be discrete ($T = \{0, 1, 2, \ldots\}$) or continuous ($T = [0, \infty)$). The set of possible values which the random variables $X(t)$ may assume is called the **state space** of the process, denoted by *S*. A continuous-time stochastic process $\{X(t), t \in T\}$ with $T = [0, \infty)$ is said to have independent increments if, for all choices of $t_0 < t_1 < t_2 < \cdots < t_n$, the *n* random variables

$$X(t_1) - X(t_0),\; X(t_2) - X(t_1),\; \ldots,\; X(t_n) - X(t_{n-1})$$

are independent. In discrete time, the state of the process at time $n+1$ then depends only on its state at time $n$.
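A minimal discrete-time sketch of these ideas is the simple symmetric random walk, whose ±1 steps are i.i.d., so increments over disjoint time intervals are independent. The function name and seeding are illustrative choices, not from the text.

```python
import random

def random_walk(n, seed=0):
    """Simulate n steps of a simple symmetric random walk with X_0 = 0.

    The increments X_{k+1} - X_k are i.i.d. +/-1, so increments over
    disjoint intervals are independent, and the state at time k+1
    depends only on the state at time k.
    """
    rng = random.Random(seed)  # seeded for reproducibility
    path = [0]
    for _ in range(n):
        path.append(path[-1] + rng.choice([-1, 1]))
    return path

path = random_walk(10)
print(path)  # 11 states: X_0 through X_10, each one step from the last
```

Here the state space is $S = \{\ldots, -2, -1, 0, 1, 2, \ldots\}$ and the index set is the discrete $T = \{0, 1, 2, \ldots\}$.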

A stochastic process is often used to represent the evolution of some random value or system over time. Examples of phenomena modeled as stochastic time series include stock market and exchange-rate fluctuations; signals such as speech, audio, and video; medical data such as a patient’s EKG, EEG, blood pressure, or temperature; random movement such as Brownian motion and random walks; and counting processes, renewal processes, Poisson processes, and Markov processes.
